
 


At this moment we are in the middle of the year, usually a quiet time for me and a good moment to reflect on what has happened so far and to look forward.

Three themes triggered my writing this half-year:

  • The changing roles of (PLM) consultancy
  • The disruptive effect of digital transformation on legacy PLM
  • The Model-driven approaches

A short summary per theme here with links to the original posts for those who haven’t followed the sequence.

The changing roles of (PLM) consultancy

Triggered by Oleg Shilovitsky’s post Why traditional PLM ranking is dead. PLM ranking 2.0, a discussion started about the changing role of PLM selection and the role of the consultant. Oleg and I agreed that using the word dead in a post title is a way to catch extra attention, and as many people do not read more than the introduction, it is a way to frame ideas (not invented by us – look at your newspaper and social media posts). Please take your time and read this post till the end.

Oleg and I concluded that the traditional PLM status reports provided by consultancy firms are no longer relevant. They focus on the big vendors and describe a status quo, while most of these vendors are 80% the same in their core PLM capabilities. The challenge lies in how to select a PLM approach for your company.

Here Oleg and I differ in opinion. I look at PLM more from a business transformation point of view: how to improve your business with new ways of working. The role of a consultant is crucial here, as the consultant can help formalize the company’s vision and the areas to focus on for PLM. The value of the PLM consultant is to bring experience from other companies instead of inventing new strategies per company. And yes, a consultant should get paid for this added value.

Oleg believes more in the bottom-up approach where new technology will enable users to work differently and empower themselves to improve their business (without calling it PLM). More or less concluding there is no need for a PLM consultant as the users will decide themselves about the value of the selected technology. In the context of Oleg’s position as CEO/Co-founder of OpenBOM, it is a logical statement, fighting for the same budget.

The discussion ended during the PLMx conference in Hamburg, where Oleg and I met with an audience, recorded by MarketKey. You can find the recording Panel Discussion: Digital Transformation and the Future of PLM Consulting here.
Unfortunately, like many discussions, there was no conclusion. My conclusion remains the same – companies need PLM coaching!

The related posts to this topic are:

 

The disruptive effect of digital transformation on legacy PLM

A topic I have discussed over the past two years is that current PLM is not compatible with a modern data-driven PLM. Note: data-driven PLM is still under development. Where in most companies the definition of the products is stored in documents / files, I believe that to manage the complexity of products, hardware and software in the future, there is a need to organize data around models, not files. See also: From Item-centric to model-centric?

For a company it is extremely difficult to have two approaches in parallel as the first reaction is: “let’s convert the old data to the new environment”.

Converting the old data has proven impossible in most of the engagements I am involved in, and here I introduced the bimodal approach as a way to keep the legacy going (mode 1) and scale up the new environment (mode 2).

A bimodal approach is sometimes acceptable when the PLM software comes from two different vendors. Sometimes this is also called the overlay approach – the old system remains in place, and a new overlay is created to connect the legacy PLM system and potentially other systems like ALM or MBSE environments. For example, some of the success stories of Aras complementing Siemens PLM.

Like the bimodal approach, the overlay approach creates the illusion that in the near future the old legacy PLM will disappear. I partly share that illusion, provided you consider the near future to be a period of 5–10+ years, depending on the company’s active products. Faster is not realistic.

And related to bimodal, I now prefer the terminology used by McKinsey: Toward an integrated technology operating model, in the context of PLM.

The challenge is that PLM vendors are reluctant to support a bimodal approach for their own legacy PLM, as then suddenly this vendor becomes responsible for all connectivity between mode 1 and mode 2 data – every vendor wants to sell only the latest.

I will elaborate on this topic during the PDT Europe conference in Stuttgart on Oct 25th. No posts on this topic this year (yet), as I am discussing, learning and collecting examples from the field. What kept me relatively busy was the next topic:

The Model-driven approaches

Most of my blogging time I spent on explaining the meaning behind a modern model-driven approach and its three main aspects: Model-Based Systems Engineering, Model-Based Definition and Digital Twins. As some of these aspects are still in the hype phase, it was interesting to see two different opinions popping up. On one side, people claim the world is still flat (2D) and consider model-based approaches just another hype created by the vendors; apparently there is no need for digital continuity. If you look at the reactions from certain people, you might conclude it is impossible to have a dialogue – throwing opinions around is not a discussion.

One of the reasons might be that people reacting strongly have never experienced model-based efforts in their life and just chime in, or they might have a business reason not to agree with model-based approaches as it does not align with their business. It is like the people benefiting from the climate change theory – will they vote against it when the facts are known? Just my thoughts.

There is also another group, to which I am connected, that is quite active in learning and formalizing model-based approaches, in order to move towards a digital enterprise where information is connected and flowing between various models (behavior models, simulation models, software models, 3D models, operational models, etc.). This group of people is discussing standards and how to use and enhance them. They discuss and analyze with arguments and share lessons learned. One of the best upcoming events in that context is the joint CIMdata PLM Road Map EMEA and PDT Europe 2018 – look at the agenda through the image link and get involved too – if you really care.

 

And if you are looking in your agenda for a wider, less geeky type of conference, consider the PI PLMx CHICAGO 2018 conference on Nov 5 and 6. The agenda provides a wider range of sessions; however, I am sure you can find people interested in discussing model-based learnings there too, in particular in Stream 2: Supporting the Digital Value Chain.

My related posts to model-based this year were:

Conclusion

I spent a lot of time demystifying some of the PLM-related themes. The challenge remains, like in the non-PLM world, that it is hard to get educated by blog posts, as you might get over-informed by (vendor-related) posts all surfing somewhere on the hype curve. Do not look at the catchy title – investigate and take time to understand HOW things will work for you or your company. There are enough people explaining WHAT they do, but HOW it fits in a current organization needs to be solved first. Therefore the above three themes.


This is my concluding post related to the various aspects of the model-driven enterprise. We went through Model-Based Systems Engineering (MBSE), where the focus was on using models (functional / logical / physical / simulations) to define complex products (systems). Next we discussed Model-Based Definition / Model-Based Enterprise (MBD/MBE), where the focus was on data continuity between engineering and manufacturing by using the 3D model as the master for design, manufacturing and eventually service information.

And last time we looked at the Digital Twin from its operational side, where the Digital Twin was applied to collect data from and tune physical assets in operation, which in my opinion is not a typical PLM domain.

Now we will focus on two areas where the Digital Twin touches aspects of PLM – the most challenging and, I believe, the most over-hyped areas. These two areas are:

  • The Digital Twin used to virtually define and optimize a new product/system or even a system of systems. For example, defining a new production line.
  • The Digital Twin used as the virtual replica of an asset in operation. For example, a turbine or engine.

Digital Twin to define a new Product/System

There might be some conceptual overlap if you compare the MBSE approach and the Digital Twin concept to define a new product or system to deliver. For me, the differentiation would be that MBSE is used to master and define a complex system from the R&D point of view – unknown solution concepts (hardware or software?), unknown constraints to be refined and optimized in an iterative manner.

In the Digital Twin concept, it is more about defining a system that should work in the field: how to combine various systems into a working solution, where each of the systems already has a pre-defined set of behavioral/operational parameters, which could be 3D-related but also performance-related.

You would define and analyze the new solution virtually to discover the ideal solution for performance, costs, feasibility and maintenance. Working in the context of a virtual model might take more time than traditional ways of working; however, once the models are in place, analyzing and optimizing the solution takes hours instead of weeks, assuming the virtual model is based on a digital thread, not a sequential process of creating and passing documents/files. Virtual solutions allow a company to optimize the solution upfront instead of fixing it at high cost during delivery, commissioning and maintenance.

Why aren’t we doing this already? It takes more skilled engineers instead of cheaper fixers downstream. The fact that we are used to fixing it later is also an inhibitor for change. Management needs to trust and understand the economic value instead of trying to reduce the number of engineers as they are expensive and hard to plan.

In the construction industry, companies are discovering the power of BIM (Building Information Model) , introduced to enhance the efficiency and productivity of all stakeholders involved. Massive benefits can be achieved if the construction of the building and its future behavior and maintenance can be optimized virtually compared to fixing it in an expensive way in reality when issues pop up.

The same concept applies to process plants or manufacturing plants where you could virtually run the (manufacturing) process. If the design is done with all the behavior defined (hardware-in-the-loop and software-in-the-loop simulation), the solution can be virtually tested and then rapidly delivered, with no late discoveries and costly fixes.
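To make the software-in-the-loop idea a bit more tangible, here is a minimal, purely illustrative Python sketch: a hypothetical control law is exercised against a simple simulated plant model and checked against an acceptance criterion before any physical hardware exists. All names and numbers are my own assumptions, not taken from a real project.

```python
# Minimal software-in-the-loop (SIL) sketch: exercise control logic against a
# simulated plant model before any physical hardware exists.
# All names and numbers are illustrative, not taken from a real project.

def controller(level: float, setpoint: float) -> float:
    """Hypothetical control law: proportional valve opening between 0 and 1."""
    error = setpoint - level
    return max(0.0, min(1.0, 0.5 * error))

def plant(level: float, valve: float, dt: float = 1.0) -> float:
    """Very simple tank model: inflow depends on valve opening, constant outflow."""
    inflow = 2.0 * valve
    outflow = 0.8
    return level + (inflow - outflow) * dt

def run_sil_test(setpoint: float = 10.0, steps: int = 200) -> bool:
    """Run the virtual test: does the controller keep the level near the setpoint?"""
    level = 0.0
    for _ in range(steps):
        level = plant(level, controller(level, setpoint))
    return abs(level - setpoint) < 1.0  # acceptance criterion for this virtual test

if __name__ == "__main__":
    print("virtual commissioning test passed:", run_sil_test())
```

In a real project the plant model would come from the behavioral models mentioned above, but the principle stays the same: test the logic virtually before commissioning it in the physical world.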

Of course, it requires new ways of working. Working with digitally connected models is not what engineers learn during their education – we have just started this journey. Therefore, organizations should explore on a smaller scale how to create a full Digital Twin based on connected data – this is the ultimate base for the next purpose.

Digital Twin to match a product/system in the field

When you follow the topic of the Digital Twin through the materials provided by the various software vendors, you see all kinds of previews of what is possible: Augmented Reality, Virtual Reality and more. All these presentations show that when you click somewhere in a 3D model space, relevant information pops up. Where does this relevant information come from?

Most of the time, information is re-entered in a new environment, sometimes derived from CAD, but all the metadata comes from people collecting and validating data. This is not the type of work we promote for a modern digital enterprise. These inefficiencies are fine for learning and demos, but in a final stage a company cannot afford silos where data is collected and entered again, disconnected from the source.

The main problem: Legacy PLM information is stored in documents (drawings / excels) and not intended to be shared downstream with full quality.
Read also: Why PLM is the forgotten domain in digital transformation.

If a company has already implemented an end-to-end Digital Twin to deliver the solution as described in the previous section, we can understand the data has been entered somewhere during the design and delivery process and, thanks to digital continuity, it is available.

How many companies have done this already? For sure not the companies that have been in business for a long time, as their current silos and legacy processes do not cater for digital continuity. By appointing a Chief Digital Officer, the journey might start; the biggest risk is that the Chief Digital Officer will be running yet another silo in the organization.

So where does PLM support the concept of the Digital Twin operating in the field?

For me, the IoT part of the Digital Twin is not the core of PLM. Defining the right sensors, controls and software is the first area where IoT is used to define the measurable/controllable behavior of a Digital Twin. This topic has been discussed in the previous section.

The second part where PLM gets involved is twofold:

  • Processing data from an individual twin
  • Processing data from a collection of similar twins

Processing data from an individual twin

Data collected from an individual twin or a collection of twins can be analyzed to discover failure patterns and improvement opportunities. An R&D organization is interested in learning what is happening in the field with their products. These analyses lead to better and more competitive solutions.

Predictive maintenance is not necessarily a part of that. When you know that certain parts will fail between 10,000 and 20,000 operating hours, you want to optimize the moment of providing service to reduce downtime of the process, and you do not want to replace parts way too early.


The R&D part related to predictive maintenance could be that R&D develops sensors inside this serviceable part that signal the need for maintenance within a much smaller time frame – maintenance needed within 100 hours instead of a bandwidth of 10,000 hours. Or R&D could develop new parts that need less service and guarantee a longer up-time.
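As an illustration of how such a sensor signal could narrow the maintenance window, here is a small, hypothetical Python sketch: it extrapolates a wear indicator reported by the part and raises a maintenance signal once the projected limit is less than roughly 100 hours away. The data, the wear limit and the linear trend are all assumptions for the sake of the example.

```python
# Illustrative sketch (my own example): a sensor inside a serviceable part
# reports a degradation indicator; instead of planning service somewhere in a
# 10,000-hour bandwidth, we extrapolate the trend and raise a maintenance
# signal once the projected limit is less than ~100 hours away.

from typing import List, Tuple

def hours_until_limit(samples: List[Tuple[float, float]], limit: float) -> float:
    """Linear extrapolation of (operating_hours, wear_indicator) samples."""
    (t0, w0), (t1, w1) = samples[0], samples[-1]
    rate = (w1 - w0) / (t1 - t0)          # wear per operating hour
    if rate <= 0:
        return float("inf")               # no measurable degradation yet
    return (limit - w1) / rate

# Hypothetical measurements: (operating hours, wear indicator)
history = [(12_000, 0.42), (12_500, 0.47), (13_000, 0.53)]
remaining = hours_until_limit(history, limit=0.60)
if remaining < 100:
    print(f"maintenance needed within {remaining:.0f} hours")
else:
    print(f"no action, ~{remaining:.0f} hours margin")
```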

For an R&D department, the information from an individual Digital Twin might only be relevant if the Physical Twin is complex to repair and the downtime for each individual asset is too high. Imagine a jet engine, a turbine in a power plant or similar. Here a Digital Twin allows service and R&D to prepare maintenance and to simulate and optimize the actions before executing them in the physical world.

The five potential platforms of a digital enterprise

The second part R&D will be interested in is the behavior of similar products/systems in the field, combined with their environmental conditions. In this way, R&D can discover improvement points for the whole range and drive incremental innovation. The challenge for the R&D organization is to find a logical placeholder in their PLM environment to collect commonalities related to the individual modules or components. This is not an ERP or MES domain.
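To illustrate the "collection of similar twins" idea, here is a hedged Python sketch that aggregates field events per module and environmental condition across a fleet, so commonalities become visible to R&D. The data structure and field names are invented for this example, not a proposal for a PLM data model.

```python
# Aggregate field events per module across many product instances, so R&D can
# spot which modules, under which conditions, need improvement.
# Records and field names are invented for illustration.

from collections import defaultdict

field_events = [
    {"twin": "unit-001", "module": "pump-A",  "ambient": "hot",  "failure": True},
    {"twin": "unit-002", "module": "pump-A",  "ambient": "hot",  "failure": True},
    {"twin": "unit-003", "module": "pump-A",  "ambient": "cold", "failure": False},
    {"twin": "unit-004", "module": "valve-B", "ambient": "hot",  "failure": False},
]

# Count failures per (module, environmental condition) across the whole fleet
failures = defaultdict(int)
population = defaultdict(int)
for event in field_events:
    key = (event["module"], event["ambient"])
    population[key] += 1
    failures[key] += int(event["failure"])

for key in sorted(population):
    module, ambient = key
    rate = failures[key] / population[key]
    print(f"{module} in {ambient} conditions: failure rate {rate:.0%}")
```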

Concepts of a logical product structure are already known in the oil & gas, process and nuclear industries. In 2017 I wrote about PLM for Owners/Operators, mentioning Bjorn Fidjeland, who has always been active in this domain; you can find his concepts at plmPartner here or as an eLearning course at SharePLM.

To conclude:

  • This post is way too long (sorry)
  • PLM is not dead – it evolves into one of the crucial platforms for the future – The Product Innovation Platform
  • Current BOM-centric approach within PLM is blocking progress to a full digital thread

More to come after the holidays (a European habit) with additional topics related to the digital enterprise

 

Model-based continued: Model-Based Definition

After a short celebration – 10 years of blogging and 200 posts – it is time to continue my series related to the future of model-based. So far, my introduction and focus on the bigger picture of the term Model-Based have led to various reactions, in particular related to Model-Based Definition, the topic I am going to discuss in this post. This is probably the topic where opinions vary the most, as it is closest to the classical engineering and manufacturing processes.

What is Model-Based Definition?

There are various definitions of the term Model-Based Definition. Often the term Model-Based Enterprise is used in the same context. Where some people might stop thinking because the terminology is not 100 % aligned, I propose to focus on content. Let’s investigate what it is.

In the classical product lifecycle, a product is first designed for its purpose based on specifications. The product can be simple and purely mechanical, or more complex, requiring mechanical design, electronic components and software to work together. For the first case, I will focus on Model-Based Definition; for the second case, I recommend starting to read about Model-Based Systems Engineering approaches, where the mechanical design is part of a more complex system.

Model-Based Definition for Mechanical Designs – the role of 2D

Historically, designs were done on the drawing board in 2D. After the introduction of 2D CAD and, later, affordable 3D CAD systems at the end of the previous century, companies made a shift from designing in 2D towards 3D. The advantages were clear: a much better understanding of products. Reading a 2D drawing requires special skills, and drawings were not always unambiguous. Therefore, 3D CAD models lead to increased efficiency and quality, combined with the potential to reuse and standardize parts or sub-assemblies in a design.

These benefits were not always realized because, complementary to the design (the engineering point of view), there was still the need to describe and define how a product needs to be manufactured. The manufacturing definition remained in a set of 2D drawings, and the 2D drawings were the legal authority describing the product.

An interesting side observation:
In industrial machinery companies you will still see that a pure EBOM does not exist, as designs were made to target the manufacturing drawings, not the engineering-focused intent of the 3D model. In this type of company, the EBOM/MBOM discussion is challenging to explain.

Once the 3D model becomes the authority, the split between design and manufacturing information will create extra work if you keep on creating 2D drawings for manufacturing. It requires non-value-added extra work, i.e., reinterpreting 3D data in 2D formats (extra engineering hours), and there is the risk of new errors (interpretation/versioning issues). This non-value-added engineering time can add up to over 30 percent of the time spent by engineering. You can find these numbers through the links below this post. I will not be the MBD teacher in this post; I will focus on the business impact.

Model-Based Definition based on 3D

3D PDF Model

The logical step is to use the 3D model and attach manufacturing information to it, through different views. This can be Geometric Dimensioning and Tolerancing information (GD&T), quality measurement information, assembly instructions and more, all applied to different views of the model.
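To make this more concrete, below is a simplified, illustrative Python data structure showing how manufacturing annotations could be attached to named views of a single model. It is my own simplification, not a CAD vendor's schema or a standard's data model.

```python
# Simplified illustration of Model-Based Definition: the 3D model stays the
# single source, and manufacturing information is attached to named views
# instead of separate 2D drawings. Classes and fields are invented.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Annotation:
    kind: str        # e.g. "GD&T", "quality", "assembly"
    target: str      # geometry reference, e.g. a face or hole id
    text: str        # the manufacturing requirement itself

@dataclass
class ModelView:
    name: str
    annotations: List[Annotation] = field(default_factory=list)

@dataclass
class MBDModel:
    part_number: str
    views: List[ModelView] = field(default_factory=list)

bracket = MBDModel(
    part_number="BRK-1001",
    views=[
        ModelView("machining",  [Annotation("GD&T", "hole-3", "Ø10 H7, position 0.05 A|B")]),
        ModelView("inspection", [Annotation("quality", "face-12", "flatness 0.02, CMM check")]),
        ModelView("assembly",   [Annotation("assembly", "hole-3", "press-fit bushing BSH-20")]),
    ],
)

# Downstream consumers (CAM, CMM, work instructions) read the same model and
# filter the view they need, instead of reinterpreting a 2D drawing.
for view in bracket.views:
    print(view.name, "->", [a.text for a in view.annotations])
```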

 

Of course, here you become dependent on the chosen environments that support a 3D CAD model combined with annotation views that can be selected in the context of the model. There are existing standards for how to annotate a model; find the most practical standard for your industry / ecosystem. Next, most CAD vendors and PLM vendors have their proprietary 3D formats, and when you stay within their solution range, working with a model-based definition will bring direct benefits, however…

Model-Based Definition data standards

Every company needs to be able to combine and share information internally with other teams or with partners and suppliers, so a single-vendor solution is a utopia. Even if your company has standardized on one system, the next acquisition might disturb this dream. Anticipating openness is crucial, and when you start working according to a model-based definition, make sure that at least you have import and export capabilities from within your environment towards model-based definition standards.

The two major standards for model-based definition are based on 3D PDF and AP242/JT. Don’t expect these standards to be complete. They will give you a good foundation for your model-based journey – make sure you are part of this journey. (Listen to the CIMdata webinar also listed below.)

The Model-Based journey

It took almost 20 years for 3D CAD to become mainstream for mechanical design. Engineers are now trained in 3D and think in 3D. Now it is time to start the journey of abandoning 2D and connecting engineering, manufacturing and service more efficiently. Similar gains can be expected. Follow the links below this article; here is already a quote from an earlier post by Isha Gupta Ray (Capgemini) related to MBD:

MBE Drivers: The need for consumption of 3D product data by non-engineering departments and the elimination of 2D drawing related rework and costs are driving companies to adopt 3D MBE methods rapidly. DoD predicts that the move away from 2D Drawings and into open and free-to-view 3D MBE documents will reduce the cost of its internal engineering activities by up to 30%, reduce the scrap and rework it currently deals with from its supply channel by nearly 20% and improves supplier response times by up to 50%.

Conclusion

Model-Based Definition is not as challenging as becoming a model-driven enterprise, which I described in my introduction post to this theme. It is a first step to challenge or energize your company to become a digital enterprise, as sharing between engineering and manufacturing needs to be orchestrated, even with your external parties. It is easy to do nothing and wait till your company is pushed, or pushed out, which would cause extra stress (or relief forever). For me, Model-Based Definition is a first (baby) step towards a digital enterprise, warming up your company to look at its data in a different way. Next, when you connect parameters and simulation to your models, you will make the next step towards a model-driven digital enterprise.

 

Below is a selection of links related to the theme of Model-Based Definition. If you feel I missed some crucial links, please provide them through the comments section of this post, and I will add them if relevant.

Tech-Clarity: The How-to Guide for Adopting Model Based Definition (MBD)

Action Engineering: Articles, Blog plus training

Engineering.com: How Model-Based Definition Can Fix Your CAD Models

Lifecycle Insights: Quantifying the value of Model-Based definitions

CIMdata: Webinar on Model-Based Definition and Standards

Capgemini: Model-Based Enterprise with 3D PDF

If you want to learn about the advanced usage and potential of MBD in more depth, try to understand:

CIMdata: Minimum MBD and BOM definition with STEP AP 242

I wrote in my previous post about the various aspects of a model-based enterprise. In case you missed that post, you can find it here: Model-Based – an introduction. In this post I will zoom in on the aspects related to the 3D model, probably the most anticipated approach in the context of PLM.

3D CAD vs 3D CAD Model

At the time 3D CAD was introduced for the mid-market, the main reason for introducing it was to provide a better understanding of the designed product. Visualization and creating cross-sections of the design became easy, although the “old” generation of 2D draftsmen had a challenge transforming their way of working. This often led to 3D CAD models set up with the mindset of generating 2D manufacturing drawings, not taking real benefit from the 3D CAD model. Let’s first focus on Model-Based Definition.

Model-Based Definition

We talk about Model-Based Definition when the product and manufacturing information is embedded in / connected to the 3D CAD model, allowing the same source of information to be used downstream for manufacturing, analysis and inspection. The embedded information normally contains geometric dimensions, annotations, surface finish and material specifications. Instead of generating easy-to-distribute 2D drawings, you would now use the 3D model with its embedded information.

According to an eBook sponsored by SolidWorks and published by Tech-Clarity, “The How-to Guide for Adopting Model-Based Definition (MBD)”, Tech-Clarity’s research discovered that 33 percent of design time is spent on drawing generation. Imagine you no longer need this time to specify manufacturing processes and operations. Does this mean design activities can be reduced by 30%? Probably not; the time could also be spent on design alternatives, in the end contributing to better designs.

Still, this is not the reason why companies would move to MBD. Companies that have implemented MBD report fewer manufacturing mistakes and less rework (61%) – this is where the value becomes visible. In addition, improved communication with suppliers was reported by 50% of the companies. More clarity in the communication; however, as some suppliers are not used to MBD either, this is used as an excuse not to implement MBD. Instead of creating a win-win situation, a status quo is maintained.

Read the eBook to demystify Model-Based Definition and realize that although it might look like a complex change, within 8 to 9 months the company might have gone through this change, assuming you have found the proper trainers / coaches for that.

When discussing a roadmap towards a digital enterprise, this is one of the “easier” steps to take, as it does not force the organization to change its primary processes. The processes become more efficient, lean and integrated, delivering rapid benefits within a year.

In the same context of MBD, in my post Digital PLM requires a Model-Based Enterprise I referred to two articles on engineering.com written by Dick Bourke with support from Jennifer Herron. The first article, How Model-Based Definition Can Fix Your CAD Models, digs into more detail and provides additional insights into benefits realizable by implementing MBD. As I am not the expert, I would recommend, if you agree on the benefits and the necessity for your company’s future, finding the right literature. There is a lot of information related to MBD coming from vendors but also from vendor-neutral sources. Technology is not the issue. You just have to study, digest and implement it with your suppliers.

Beyond MBD using a 3D CAD Model

Although the post gets long, it is crucial to understand that the 3D CAD model should also be built in a more sophisticated manner. Using parameters in the model instead of hard-coded values allows the model to be used and interact with other disciplines in a digital manner.

A parametric model, combined with business rules, can be accessed and controlled by other applications in a digital enterprise. In this way, a set of product variants can be managed without the intervention of individuals, and not only from the design point of view: geometry and manufacturing parameters are also connected and accessible. This is one of the concepts Industry 4.0 is focusing on: intelligent and flexible manufacturing by exchanging parameters.
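As an illustration of what "parameters plus business rules" could look like, here is a small, hedged Python sketch: a hypothetical shaft whose geometry and manufacturing parameters are derived from functional inputs, so another application (a configurator, a manufacturing system) can request a variant without a designer touching the CAD model. The rules and numbers are invented for the example.

```python
# A minimal sketch of a parametric model combined with business rules.
# Another application sets the functional parameters and gets derived geometry
# and manufacturing data back - no person edits the CAD model.
# All rules and numbers are invented for illustration.

def configure_shaft(length_mm: float, torque_nm: float) -> dict:
    """Derive geometry and manufacturing parameters from functional inputs."""
    # Business rule: minimum diameter grows with the required torque
    diameter_mm = max(10.0, 0.5 * torque_nm ** 0.5 * 10.0)
    # Business rule: long slender shafts need an extra support feature
    needs_support = length_mm / diameter_mm > 20.0
    return {
        "length_mm": length_mm,
        "diameter_mm": round(diameter_mm, 1),
        "needs_support": needs_support,
        # Manufacturing parameters derived from the same source
        "stock_diameter_mm": round(diameter_mm + 2.0, 1),
        "machining_time_min": round(0.05 * length_mm + 0.2 * diameter_mm, 1),
    }

# A product variant requested by another system, not by a designer
print(configure_shaft(length_mm=400, torque_nm=25))
```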

The 3D CAD Model and Simulation

The last (short) part related to the 3D CAD model is about its relation to simulation. If you do not use simulation together with your 3D CAD models, you are still designing in the past. No real advantage of 3D over 2D, just a better understanding?

In engineering we often talk about Form, Fit and Function – the three dimensions to decide on a change.  With 2D (and 3D without simulation) we manage Form and Fit disconnected from Function. Once we use 3D combined with Simulation we are able to manage these three parameters in relation.

For example, when designing a product, early simulations can provide direct feedback on shape and dimension constraints. Where to save material costs, or should you choose another design solution? The ultimate approach is Generative Design, where the Functional constraints and the Fit are given and the Form is optimized based on artificial intelligence rules.

In case a company has a close relation between 3D design and simulation, the concept of Design of Experiments (DOE) will help to find the optimal product constraints. The more integrated the 3D CAD model and the simulation are, the more efficiently alternatives can be evaluated and optimized.
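A minimal sketch of the DOE idea, under the assumption that a simulation can be called programmatically: evaluate a full-factorial grid of design parameters against a stand-in simulation function, filter on a functional constraint and keep the lightest feasible variant. The surrogate "simulate" function and all numbers are made up for illustration.

```python
# Small Design of Experiments (DOE) sketch: evaluate a grid of design
# parameters against a (stand-in) simulation and keep the best feasible variant.

from itertools import product

def simulate(thickness_mm: float, rib_count: int) -> dict:
    """Stand-in for a structural simulation: returns mass and deflection."""
    mass = 0.8 * thickness_mm + 0.3 * rib_count
    deflection = 5.0 / (thickness_mm * (1 + 0.4 * rib_count))
    return {"mass": mass, "deflection": deflection}

# Full-factorial experiment over the design parameters
thicknesses = [2.0, 3.0, 4.0, 5.0]
ribs = [0, 2, 4]

best = None
for t, r in product(thicknesses, ribs):
    result = simulate(t, r)
    if result["deflection"] > 0.5:          # functional constraint (Function/Fit)
        continue
    if best is None or result["mass"] < best[2]["mass"]:
        best = (t, r, result)

t, r, result = best
print(f"best variant: thickness={t} mm, ribs={r}, mass={result['mass']:.1f} kg")
```

The tighter the integration between CAD and simulation, the larger the grid that can be evaluated in the same amount of time.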

Conclusion

In this post we focused on model-based in relation to the 3D CAD model. Without going to expert level for each of the topics discussed, I hope it creates interest and enthusiasm for further investment in model-based practices. One commonality for all model-based practices: it is about parameters. Parameters provide the digital continuity that each discipline (design, simulation, manufacturing) can build upon in almost real time, without the need for people to convert or adjust information. Digital continuity – one of the characteristics of the future digital enterprise.

 

 

 

 

 

 

 

In recent years I have mentioned the term model-based several times in the context of a modern, digital enterprise. Posts like Digital PLM requires a model-based enterprise (Sept 2016) or Item-Centric or Model-Centric (Sept 2017) describe some of the aspects of a model-based approach. And if you follow the PLM vendors in their marketing messages, everyone seems to be looking for a model-based environment.

This is, however, in big contrast with reality in the field. In February this year I moderated a focus group related to PLM and the model-based approach, and the main conclusion from the audience was that everyone was looking at it, but only a few had started practicing. Therefore, I promised to provide some step-by-step education related to model-based, as, like with PLM, we need to get a grip on what it means and how it impacts your company. As I am not an academic person, it will be a little like model-based for dummies; however, as model-based in all its aspects is not yet a widespread common practice, we are all learning.

What is a Model?

The word Model has various meanings, and this is often the first source of confusion when people speak about Model-Based. The two main interpretations in the context of PLM are:

  • A Model represents a 3D CAD Model – a virtual definition of a physical product
  • A Model represents a scientific / mathematical model

And although these are the two main interpretations, there are more aspects of model-based to look at in the context of a digital enterprise. Let’s explore the 3D CAD model first.

The role of the 3D CAD Model in a digital enterprise

Just designing a product in 3D and then generating 2D drawings for manufacturing is not really game-changing and does not bring big benefits. 3D models provide a better understanding of the product; mechanical simulations allow the engineer to discover clashes and/or conflicts, and this approach contributes to a better understanding of the form and fit of a product. Older generations of designers know how to read a 2D drawing and understand the 3D model in their mind.

Modern generations of designers are no longer trained to start from 2D, so their way of thinking is related to 3D modeling. Unfortunately, businesses, in particular when acting in ecosystems with suppliers, still rely on the 2D definition as the legal document. The 3D model has brought some quality improvements, and these benefits already justify designing in 3D for most companies; still, it is not the revolution a model-based enterprise can bring.

A model-based enterprise has to rely on data, so the 3D Model should rely on parameters that allow other applications to read them. These parameters can contribute to simulation analysis and product optimization or they can contribute to manufacturing. In both cases the parameters provide data continuity between the various disciplines, eliminating the need to create new representations in different formats. I will come back in a future post to the requirements for the 3D CAD model in the context of the model-based enterprise, where I will zoom in on Model-Based Definition and the concepts of Industry 4.0.

The role of mathematical models in a digital enterprise

The mathematical model of a product allows companies to analyze and optimize its behavior. When companies design a product, they often start from a conceptual model, and by running simulations they can optimize the product and define low-level requirements within a range that optimizes product performance. The relation between design and simulation in a virtual model is crucial to be as efficient as possible. In current ways of working, design and simulation are often not integrated, and therefore the number of simulations is relatively low, as time-to-market is the key driver to introduce a new product.

In a digital enterprise, design and simulations are linked through parameters, allowing companies to iterate and select the optimal solution for the market quickly. This part is closely related to model-based systems engineering (MBSE) , where the focus is on defining complex systems. In the context of MBSE I will also zoom in on the relation between hardware and software, which at the end will deliver the desired functionality for the customer. Again in this part we will zoom in on the importance of having a parameter model, to ensure digital continuity.

Digital Twin

There is still a debate if the Digital Twin is part of PLM or should be connected to PLM. A digital twin can be based on a set of parameters that represent the product performance in the field. There is no need to have a 3D representation, despite the fact that many marketing videos always show a virtual image to visualize the twin.

Depending on the business desire, there can be various digital twins for the same product in the field, all depending on the parameters that you want to monitor. Again, it is about passing parameters, in this case from the field back to R&D, and these parameters should be passed in a digital manner. In a future post I will zoom in on the targets and benefits of the digital twin.
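To illustrate the point that a digital twin can simply be a set of monitored parameters, here is a small, hypothetical Python sketch: two twins of the same physical pump, each defined by the parameters it monitors for a different business purpose, with no 3D representation involved. All names and values are invented.

```python
# Two digital twins of the same physical pump, each monitoring a different
# parameter set depending on the business question. No 3D representation.

class ParameterTwin:
    def __init__(self, asset_id: str, monitored: list):
        self.asset_id = asset_id
        self.values = {name: None for name in monitored}

    def update(self, field_message: dict) -> None:
        """Take only the parameters this twin cares about from a field message."""
        for name in self.values:
            if name in field_message:
                self.values[name] = field_message[name]

# Same physical asset, two different twins for two different purposes
maintenance_twin = ParameterTwin("pump-17", ["vibration", "bearing_temp"])
energy_twin = ParameterTwin("pump-17", ["power_kw", "flow_m3h"])

message = {"vibration": 2.1, "bearing_temp": 71.0, "power_kw": 14.5, "flow_m3h": 120.0}
maintenance_twin.update(message)
energy_twin.update(message)

print(maintenance_twin.values)   # {'vibration': 2.1, 'bearing_temp': 71.0}
print(energy_twin.values)        # {'power_kw': 14.5, 'flow_m3h': 120.0}
```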

Conclusion

There are various aspects to consider related to “model-based”. The common thread for all of them is PARAMETERS. The more you can work with parameters to connect the various usages of a product/system, the closer you are to the digital enterprise. The real advantages of a digital enterprise are speed (information available in real time) and end-to-end visibility (as data is not locked in files / closed systems).

PARAMETERS – the objects that create digital continuity

 

 

 

 

When PLM – Product Lifecycle Management – was introduced, one of the main drivers was to provide an infrastructure for collaboration and for sharing product information across the whole lifecycle. The top picture shows my impression of what PLM could mean for an organization at that time. The PLM circle was showing a sequential process from concept, through planning, development, manufacturing towards after sales and/or services when relevant. PLM would provide centralization and continuity of data. Through this continuity we could break down the information silos in a company.

Why do we want to break down the silos?

You might ask yourself: what is wrong with silos if they perform in a consistent manner? Oleg Shilovitsky recently wrote about it: How PLM can separate data and organization silos. Read the post for the full details; I will stay with Oleg’s conclusion:

Keep process and organizational silos, but break data silos. This is should be a new mantra by new PLM organization in 21st century. How to help designers, manufacturing planners and support engineers to stay on the same BOM? By resolving this problem, organization will preserve current functional structure, but will make their decisions extremely data drive and efficient. The new role of PLM is to keep organizational and process silos, but connect data silos. This is a place where new cloud based multi-tenant technologies will play key role in the future organization transformation from the vision of no silo extended enterprise to organized functional silos connected by common understanding of data.

When I read that post, I had so much to comment that it led to this post. Let me share my thoughts related to this conclusion, and hopefully it helps in future discussions. Feel free to join the discussion:

Keep process and organizational silos, but break data silos. This is should be a new mantra by new PLM organization in 21st century

For me “Keep process and organizational silos ….. “ is exactly the current state of classical PLM, where PLM concepts are implemented to provide data continuity within a siloed organization. When you can stay close to the existing processes the implementation becomes easier. Less business change needed and mainly a focus on efficiency gains by creating access to information.

Most companies do not want to build their data continuity themselves and therefore select and implement a PLM system that provides the data continuity, currently mainly around the various BOM-views. By selecting a PLM system, you have a lot of data integration done for you by the vendor. Perhaps not as user-friendly as every user would expect, however no company has been able to build a 100% user-friendly PLM system yet, which is the big challenge for all enterprise systems. Therefore PLM vendors provide a lot of data continuity for you without the need for your company to take responsibility for this.

And if you know SAP, they go even further. Their mantra is that when using SAP PLM, you do not even need to integrate with ERP. You can still have long discussions with companies when it comes to PLM and ERP integrations. The main complexity is not the technical interface but the agreement on who is responsible for which data sets during the product lifecycle. This should be clarified even before you start talking about a technical implementation. SAP claims that this effort is not needed in their environment; however, they just shift the problem more towards the CAD side. Engineers do not feel comfortable with SAP PLM when engineering is driving the success of the company. It is like the Swiss army knife: every tool is there, but do you want to use it for your daily work?

In theory a company does not need to buy a PLM system. You could build your own PLM-system, based on existing infrastructure capabilities. CAD integrations might be trickier, however this you could solve by connecting to their native environments.  For example, Microsoft presented at several PDT conferences an end-to-end PLM story based on Microsoft technology.  Microsoft “talks PLM” during these conferences, but does not deliver a PLM-system – they deliver the technologies.

The real 21st-century paradigm

What is really needed for the 21st century is to break down the organizational silos, as current ways of working are becoming less and less applicable to a modern enterprise. The usage of software has a major impact on how we can work in the future. Software does not follow the linear product process. Software comes with incremental deliveries all the time, and yes, the software still requires hardware to perform. Modern enterprises try to become agile, being able to react quickly to trends and innovation options to bring higher and different value to their customers. Related to product innovation, this means that the linear, sequential go-to-market process is too slow and requires too much data manipulation in non-value-added activities.

All leading companies in the industry are learning to work in a more agile mode, with multidisciplinary teams that work like startups: find an incremental benefit, rapidly develop, test and interact with the market, and deliver it. These teams require real-time data coming from all stakeholders, hence the need for data continuity. But also the need for data quality, as there is no time to validate data all the time – too expensive, too slow.

Probably these teams will not collaborate along the various BOM-views, but more along digital models, both describing product specifications and system behavior. The BOM is not the best interface to share system information. The model-based enterprise with its various representations is more likely to be the backbone for the new future in the 21st century. I wrote about this several times, e.g. item-centric or model-centric.

And New cloud-based multi-tenant technologies …

As Oleg writes in his conclusion:

This is a place where new cloud-based multi-tenant technologies will play key role in the future organization transformation from the vision of no silo extended enterprise to organized functional silos connected by common understanding of data.

From the academic point of view, I see the beauty of new cloud-based multi-tenant technologies. Quickly build an environment that provides information for specific roles within the organization – however will this view be complete enough?  What about data dictionaries or is every integration a customization?

When talking with companies in the real world, they are not driven by technology – they are driven by processes. They do not like to break down the silos, as it creates discomfort and the need for business transformation. And there is no clear answer at this moment. What is clear is that leading companies invest in business change first before looking into the technology.

Conclusion

Sometimes too much academic and wishful thinking from technology providers is creating excitement.  Technology is not the biggest game changer for the 21st century. It will be the new ways of working and business models related to a digital enterprise that require breaking organizational silos. And these new processes will create the demand for new technologies, not the other way around.

Break down the walls !

Dear readers, it is time for me to relax and focus on Christmas and the upcoming New Year. I realize that not everyone who reads my posts will be in the same mood. You might have had your New Year three months ago or have New Year coming up in a few months. This is the beauty and challenge of a global, multicultural, diverse society. Imagine we were all doing the same – would you prefer such a world? Perhaps it would give peace to the mind (no surprises, everything predictable); however, for human survival we need innovation and new ways of life.

This mindset is also applicable to manufacturing companies. Where in the past companies were trying to optimize and standardize their processes driven by efficiency and predictability, now due to the dynamics of a globally connected world, businesses need to become extremely flexible however still reliable and profitable.

How will they make the change ?

Digital transformation is one of the buzz words pointing to the transition process. Companies need to go through a change to become flexible for the future and deliver products or solutions for the individual customer. Currently companies invest in digital transformation, most of the time in areas that bring direct visibility to the outside world or their own management, not necessarily delivering profitable results as a recent article from McKinsey illustrated: The case for digital reinvention.

And for PLM ?

I have investigated digital transformation in relation to PLM with particular interest this year, as I worked with several companies that preached to the outside world that they were changing or going to make a change. However, what is happening at the PLM level? Most of the time nothing. Some new tools, and perhaps some new disciplines like software engineering become more critical. However, the organization and its people do not change their ways of working, as in particular the ongoing business and related legacy are blocking the change.

Change to ?

This is another difficult question to answer.  There is no clearly defined path to share. Yes, modern PLM will be digital PLM, it will be about data-driven connected information. A final blueprint for digital PLM does not exist yet. We are all learning and guessing.  You can read my thoughts here:

Software vendors in various domains are all contributing to support a modern digital product innovation management future. But where to start?  Is it the product innovation platform? Is it about federated solutions? Model-Based? Graph-databases? There are even people who want to define the future of PLM.  We can keep throwing pieces of the puzzle on the table, but all these pieces will not lead to a single solved puzzle. There will be different approaches based on your industry and your customers. Therefore, continuous learning and investing time to understand the digital future is crucial. This year’s PDT Europe conference was an excellent event to learn and discuss the themes around a model-based lifecycle enterprise. You can read my reviews here: The weekend after PDT Europe 2017 part 1 and part 2.

The next major event where I plan to discuss and learn about modern PLM topics is the upcoming PI PLMx event in Hamburg on February 19-20 organized by MarketKey. Here I will discuss the Model-Based Enterprise and lecture about the relation between PLM and digital transformation. Hoping to see some of you there for exciting discussions and actions.

Conclusion

Merry Christmas to those who are celebrating, and a happy, healthy and prosperous 2018 to all of you. Thanks for your feedback. Keep on asking questions or proposing other thoughts, as we are all learning. The world keeps on turning; however, for me the next two weeks will be a time to relax.

Talk to you in 2018 !

 

When I started working with SmarTeam Corp. in 1999, the company had several product managers, who were responsible for the whole lifecycle of a component or technology. The Product Manager was the person who defined the features for the new release and provided the justification for these new features internally inside R&D. In addition, the Product Manager had the external role of visiting customers, understanding their needs for future releases, and building and explaining a coherent vision to the outside and internal world. The Product Manager had a central role, connecting all stakeholders.

In the ideal situation, the Product Manager was THE person who could speak in R&D language about the implementation of features, could talk with marketing and documentation teams to explain the value and expected behavior, and could talk with the customer describing the vision, meanwhile verifying the product’s vision and roadmap based on their input. All these expected skills make the role of a product manager challenging. If the person is too “techy”, he/she will enjoy working with R&D but have a hard time understanding customer demands. On the other hand, if the Product Manager is excellent at picking up customer and market feedback, he/she might not be heard and might not get the expected priorities from R&D. For me, it has always been clear that in the software world a “bi-directional” Product Manager is crucial for success.

Where are the Product Managers in the Manufacturing Industry?

Approximately four years ago, new concepts related to digitalization for PLM became more evident. How could digital continuity connect the various disciplines around the product lifecycle and therefore provide end-to-end visibility and traceability? When speaking of end-to-end visibility, most of the time companies talked about the way they designed and delivered products; visibility of what is happening mostly stopped after manufacturing. The diagram to the left, showing a typical Build To Order organization, illustrates the classical way of thinking. There is an R&D team working on innovation, typically a few engineers, while most of the engineers work in Sales Engineering and Manufacturing Preparation to define and deliver a customer-specific order. In theory, once the order is delivered none of the engineers will be further involved, and it is up to the Service Department to react to what is happening in the field.

A classical process in the PLM domain is the New Product Introduction process for companies that deliver products in large volumes to the market, most of the time configurable to be able to answer to various customer or pricing segments. This process is usually linear and is described either in one stream or in two parallel streams. In the latter case, the R&D department develops new concepts and prepares the full product for the market, while the operational department starts in parallel, initially involved in strategic sourcing, and later scaling up manufacturing, disconnected from R&D.

I described these two processes because they both illustrate how disconnected the source (R&D / Sales) is from the final result in the field, in both cases managed by the service department. A typical story I learned from many manufacturing companies is that in the end it is hard to get a full picture of what is happening across the whole lifecycle. How external feedback (market & customers) can influence any stage is undefined. I used the diagram below before companies were even talking about a customer-driven digital transformation. Just understanding end-to-end what is happening with a product along the lifecycle is already a challenge for a company.

Putting the customer at the center

Modern business is about having customer or market involvement in the whole lifecycle of the product. And as products become more and more a combination of hardware and software, it is the software that allows the manufacturer to provide incremental innovation to their products. However, to innovate in a manner that matches or even exceeds customer demands, information from the outside world needs to travel as fast as possible through an organization. If this is done in isolated systems and documents, the journey will be cumbersome and too slow to allow a company to act fast enough. Here digitization comes in, making information directly available as data elements instead of documents with their own file formats and systems to author them. The ultimate dream is a digital enterprise where data “flows”, advocated already by some manufacturing companies for several years.

In the previous paragraph I talked about the need to have an infrastructure in place for people in an organization to follow the product along the complete lifecycle, to be able to analyze and improve the customer experience. However, you also need to create a role in the organization for a person to be responsible for combining insights from the market and to lead various disciplines in the organization, R&D, Sales, Services. And this is precisely the role of a Product Manager.

Very common in the world of software development, not yet recognized in manufacturing companies. In case a product manager role exists already in your organization, he/she can tell you how complicated it currently is to get an overall view of the product and which benefits a digital infrastructure would bring for their job. Once the product manager is well-supported and recognized in the organization, the right skill set to prioritize or discover actions/features will make the products more attractive for consumers. Here the company will benefit.

Conclusion

If your company does not have the role of a product manager in place, your business is probably not yet well enough engaged in the customer journey. There will be broken links and costly processes when trying to get a fast response to the market. Consider the role of a Product Manager, as it has emerged in the software business.

NOTE 1: Just before publishing this post I read an interesting post from Jan Bosch: Structure Eats Strategy. It fits well in this context.

NOTE 2: The existence of a Product Manager might be a digital maturity indicator for a company, just as for classical PLM maturity the handling of the MBOM (PDM/PLM/ERP) gives insight into a company’s PLM maturity.

Related to the MBOM, please read: The Importance of a PLM data model – EBOM and MBOM

 

 

 

 

 

This post is a rewrite of an article I wrote on LinkedIn two years ago, modified to reflect my current understanding. If you follow my blog, in particular the posts related to the business change needed to transform a company into a data-driven digital enterprise, you know that one of the characteristics of digital is the real-time availability of information. This has an impact on everyone working in such an organization. My conversations are in the context of PLM (Product Lifecycle Management); however, I assume my observations are valid for other domains too.

Real-time visibility is going to be the big differentiator for future businesses, and in particular, in the PLM domain, this requires a change from document-centric processes towards data-driven processes.

Documents have a lot of disadvantages. Documents lock information in a particular format, and document handling results in sequential processes, where one person / one discipline at a time is modifying or adding content. I described the potential change in my blog post: From a linear world to fast and circular?

From a linear world to fast and circular

In that post, I described that a more agile and iterative approach to bringing products and new enhancements to the market should have an impact on current organizations. A linear organization, where products are pushed to the market from concept to delivery, is based on working in silos and will be too slow to compete against future, modern digital enterprises. This is because departmental structures with their own hierarchy block the fast movement of information, and often these silos filter or deform the information. It becomes hard to have a single version of the truth, as every department and its management will push for their measured truth.

A matching business model for the digital enterprise is a matrix business model, where multidisciplinary teams work together to achieve their mission – an approach known from the software industry, where parallel and iterative work is crucial to continuously deliver incremental benefits.

Image:  21stcenturypublicservant.wordpress.com/

In a few of my projects, I discovered this correlation with software methodology that I wanted to share. One of my clients was in the middle of moving from a document-centric approach toward a digital information backbone, connecting the RFQ phase and conceptual BOM through design, manufacturing definition, and production. The target was to have end-to-end data continuity as much as possible, meanwhile connecting the quality and project tasks combined with issues to this backbone.

The result was that each individual had a direct view of their current activities, which could be a significant quantity for some people engaged in multiple projects. Just being able to measure these numbers already led to more insight into an individual’s workload. When we discussed the conceptual dashboard for an individual with the implementation team, it led to questions like: “Can the PLM system escalate tasks and issues to the relevant manager when needed?” and “Can this escalation be done automatically?”

And here we started the discussion: “Why do you want to escalate to a manager?” Escalation will only create more disruption and stress for the persons involved. Isn’t the person qualified enough to decide what is important?

One of the conclusions of the discussion was that currently, due to lack of visibility of what needs to be done, when and with which urgency, people accept that things get overlooked. So the burning issues get most of the attention, and the manager’s role is to set things on fire to get them done.

When discussing further, it was clear that thanks to the visibility of data, the real critical issues will appear at the top of an individual’s dashboard. The relevant person can immediately see what can be achieved and, if not, take action. Of course, there is the temptation to work only on the easy tasks and ignore the tough ones (human behavior); however, the dashboard reveals everything that needs to be done – visibility. Therefore, if a person learns to manage their priorities, there is no need for a manager to push anymore, saving time and stress.
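As a purely illustrative sketch of such a dashboard (my own example, not the client's implementation), the few lines of Python below rank a person's open tasks and issues by an urgency score, so the critical items surface at the top without any manager escalating them. The scoring rule and the data are invented.

```python
# Rank a person's open tasks and issues by urgency so critical items surface
# at the top of the individual dashboard. Scoring rule and data are invented.

from datetime import date

my_items = [
    {"title": "Approve ECO-122",           "due": date(2017, 11, 20), "severity": 3},
    {"title": "Fix issue on valve design",  "due": date(2017, 11, 10), "severity": 5},
    {"title": "Review supplier drawing",    "due": date(2017, 12, 1),  "severity": 2},
]

def urgency(item: dict, today: date = date(2017, 11, 8)) -> float:
    """Higher score = more urgent: severity weighted by how close the deadline is."""
    days_left = max((item["due"] - today).days, 0)
    return item["severity"] / (1 + days_left)

for item in sorted(my_items, key=urgency, reverse=True):
    print(f'{urgency(item):5.2f}  {item["title"]}')
```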

The ultimate conclusion of our discussion was: implementing a modern PLM environment brings, first of all, almost 100 % visibility, the single version of the truth. This new capability breaks down silos; a department cannot hide activities behind its departmental wall anymore. Digital PLM allows horizontal, multidisciplinary collaboration without the need to go through the management hierarchy.

It would mean Power to the People, provided they are stimulated to do so. And this was the message to the management: "You have to change too, empower your people."

What do you think – will this happen? This was my question in 2015. Now, two years later, I can say that some companies have seen the potential of the future and are changing their culture to empower their employees working in multidisciplinary teams. Other companies, most of the time with a long history in business, are keeping their organizational structure with layers of middle management and maintaining a culture that consolidates the past.

Conclusion

A digital enterprise empowers individuals, allowing companies to become more proactive and agile instead of working within optimized silos. In silos, it appears that middle management does not trust individuals to prioritize their work. The culture of a company and its ability to change are crucial for the empowerment of individuals. Over the last two years there has been progress in understanding the value of empowered multidisciplinary teams.

Is your company already empowering people? Let us know!

Note: After speaking with Simon, one of my readers who always gives feedback from reality, we agreed that multidisciplinary teams are very helpful for organizations. However, you will still need a layer of strategic people securing both the standard ways of working and the future ways of working, as the project teams might be too busy doing their job. We agreed this is the role for modern middle management.
DO YOU AGREE?

Last week I posted my first review of the PDT Europe conference. You can read the details here: The weekend after PDT Europe (part 1). There were some questions related to the abbreviation PDT. Looking into the history of PDT, you will discover it stands for Product Data Technology. Yes, there are many TLAs in this world.

Microsoft’s view on the digital twin

Now back to the conference. Day 2 started with a remote session from Simon Floyd. Simon is Microsoft's Managing Director for Manufacturing Industry Architecture Enterprise Services and a frequent speaker at PDT. Simon shared with us Microsoft's viewpoint on the Digital Twin, the strategy to implement a Digital Twin, the maturity status of several of their reference customers and the areas these companies are focusing on. From these customers it was clear that most companies focus on retrieving data related to maintenance, providing analytics and historical data. Futuristic scenarios, like using the digital twin for augmented reality or design validation, are still further away. As I discussed in the earlier post, this matches my observations: creating a digital thread between products in operation is considered a quick win, while establishing an end-to-end relationship between products in operation and their design requires many more steps to fix. Read my post: Why PLM is the forgotten domain in digital transformation.

When discussing the digital twin architecture, Simon made a particular point about the standards required to connect the results of products in the field. Connecting a digital twin in a vendor-specific framework will create legacy, vendor lock-in, and a less open environment for using digital twins. A point that I also raised in my presentation later that day.

Simon concluded with a great example of potential future Artificial Intelligence, where an asset, based on its measurements, predicts that it will fail before the scheduled maintenance stop and therefore requests to run at lower performance so it can reach the maintenance stop without disruption.
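
A minimal sketch of that decision logic, with purely illustrative names and numbers, could look like this:

```python
# Illustrative only: the decision logic behind Simon's example. The names,
# numbers and the assumption that reducing load stretches the remaining
# useful life proportionally are all simplifications for this sketch.
def recommend_operation(predicted_rul_h: float,
                        hours_to_maintenance: float,
                        derating_factor: float = 0.7) -> dict:
    """Return an operating recommendation for the asset."""
    if predicted_rul_h >= hours_to_maintenance:
        return {"mode": "nominal", "load": 1.0}
    # Assume running at reduced load stretches the remaining life roughly
    # in proportion to the derating (a deliberate simplification).
    stretched_rul_h = predicted_rul_h / derating_factor
    if stretched_rul_h >= hours_to_maintenance:
        return {"mode": "derated", "load": derating_factor}
    return {"mode": "request_unplanned_stop", "load": 0.0}


print(recommend_operation(predicted_rul_h=120, hours_to_maintenance=160))
# {'mode': 'derated', 'load': 0.7} - run slower and still reach the planned stop
```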

Closing the lifecycle loop

Sustainability and the circular economy have been themes at PDT for some years now. In his keynote speech, Torbjörn Holm from Eurostep took us through the global megatrends (Hay Group 2030) and the technology trends (Gartner 2018) and mapped out how technology can be a good enabler for addressing several of the global trends.

Next Torbjörn took us through the reasons and possibilities (methodologies and tools) for product lifecycle circularity developed through the ResCoM project in which Eurostep participated.

The ResCoM project (Resource Conservative Manufacturing) was co-funded by the European Commission and recently concluded. More info at www.rescom.eu.

Torbjörn concluded by discussing the necessary framework for the Digital Twin and Digital Thread(s), which should be based on a Model-Based Definition, where ISO 10303 is the best candidate.

Later in the afternoon, there were three sessions in a separate track related to design optimization for value, circularity and re-use, followed by a panel discussion. Unfortunately, I participated in another track, so I still have to digest the provided materials. Speakers in that track were Ola Isaksson (Chalmers University), Ingrid de Pauw & Bram van der Grinten (IDEAL&CO) and Michael Lieder (KTH Sweden).

Connecting many stakeholders

Rebecca Ihrfors, CIO of the Swedish Defense Material Administration (FMV), shared her plans for transforming the IT landscape to harmonize the currently existing environments and to become a broker between industry and the armed forces (FM). As many of the assets now come with their own data sets and PDM/PLM environments, the overhead of keeping up all these proprietary environments is too expensive and fragmented. FMV wants to harmonize the data it retrieves from industry and the way it offers that data to the armed forces in a secure way. There is a need for standards and interoperability.

The positive point from this presentation was that several companies in the audience that deliver products to the Swedish Defense could start to share and adapt their viewpoints on how they could contribute.

Later in the afternoon, there were three sessions in a separate track related to standards for MBE interoperability and openness, which fit very well in this context. Brian King (Koneksys), Adrian Murton (Airbus UK) and Magnus Färneland (Eurostep) provided various inputs, and as I did not attend these parallel sessions, I will dive deeper into their presentations at a later time.

PLM something has to change – bimodal and more

In my presentation, which you can download from SlideShare here: PLM – something has to change, my main points related to the fact that companies apparently understand that something needs to happen to really benefit from a digital enterprise. The rigidity of large enterprises and their inhibitors to transform are more related to human factors and incompatibilities with the future.

How to deal with this incompatibility was also the theme for Martin Eigner’s presentation (System Lifecycle Management as a bimodal IT approach) and Marc Halpern’s closing presentation (Navigating the Journey to Next Generation PLM).

Martin Eigner’s consistent story was about creating an extra layer on top of the existing (Mode 1) systems and infrastructure, which he illustrated with a concept developed on the Aras platform.

By providing a new digital layer on top of the existing enterprise, companies can start evolving to a modern environment, where, in the long-term, old Mode 1 systems will be replaced by new digital platforms (Mode 2). Oleg Shilovitsky wrote an excellent summary of this approach. Read it here: Aras PLM  platform “overlay” strategy explained.
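
To illustrate the overlay idea, here is my own simplified sketch (not how Aras, or any vendor, implements it): a thin layer reads from the existing Mode 1 systems and exposes one harmonized item, so new Mode 2 applications never touch the legacy schemas directly.

```python
# Illustrative only: my own simplified sketch of the "overlay" idea, not how
# Aras (or any vendor) implements it. A thin layer reads from the existing
# Mode 1 systems and exposes one harmonized item, so new Mode 2 applications
# never touch the legacy schemas directly.
class LegacyPDM:
    def get_part(self, number: str) -> dict:
        return {"PartNo": number, "Rev": "B", "Descr": "Bracket"}


class LegacyERP:
    def fetch_material(self, number: str) -> dict:
        return {"MATNR": number, "PLANT": "1000", "COST": 12.5}


class OverlayItem:
    """Harmonized item exposed by the digital overlay layer."""

    def __init__(self, pdm: LegacyPDM, erp: LegacyERP):
        self.pdm, self.erp = pdm, erp

    def get(self, number: str) -> dict:
        design = self.pdm.get_part(number)
        supply = self.erp.fetch_material(number)
        # Map both legacy vocabularies onto one neutral model.
        return {
            "id": number,
            "revision": design["Rev"],
            "description": design["Descr"],
            "plant": supply["PLANT"],
            "cost": supply["COST"],
        }


overlay = OverlayItem(LegacyPDM(), LegacyERP())
print(overlay.get("10-4711"))
```

The design choice here is that the overlay owns the neutral vocabulary, while the legacy systems keep running untouched until they are eventually replaced.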

Marc Halpern closed the conference by describing his view on how companies could navigate to Next Generation PLM, explaining in more detail what the Gartner bimodal approach implies. Marc’s story was woven around four principles.

Principle 1 was the bimodal strategy, as shown in the image.

Principle 2 was about Mode 1 thinking in an evolutionary model. Every company has to go through maturity stages in its organization, starting from ad hoc, via departmental and enterprise-based, to harmonized and fully digitally integrated. These maturity stages also have to be taken into account when planning future steps.

Principle 3 was about organizational change management, a topic often neglected or underestimated by product vendors and service providers, as it relates to company culture, which is not easy to change or steer in a particular direction.

Finally, Principle 4 was about Mode 2 activities. Here an organization should pilot (in a separate environment), certify (make sure it is a realistic future), adopt (integrate it into the business) and scale (enable this new approach to exist and grow for the future).

Conclusions

This post concludes my overview of PDT Europe 2017. Looking back, there was a quite aligned view of where we are all heading with PLM and related topics. There is hype and there is reality, and I believe this conference was about reality, giving all attendees good feedback on what is really happening and understood in the field. And of course, there is the human factor, which is hard to influence.

Share your experiences and best practices related to moving to the next generation of PLM (digital PLM?)!
