
A month ago I announced that I would write a series of posts related to the various facets of Model-Based. As I do not want to write a book for a limited audience, I still believe blog posts are an excellent way to share knowledge and experience with a wider audience. Remember: PLM is about sharing!

There are three downsides to this approach:

  • You have to chunk the information into pieces; my aim is not to exceed 1000 words per post
  • Isolated posts can be taken out of context (in a positive or negative way)
  • You do not become rich and famous for selling your book

Model-Based ways of working are a hot topic and crucial for a modern digital enterprise. The modern digital enterprise does not exist yet to my knowledge, but the vision is there. Strategic consultancy firms are all actively exploring and explaining the potential benefits – I have mentioned McKinsey / Accenture / Capgemini before.

In the domain of PLM, there is a bigger challenge, as here we suffer from the fact that the word “Model” immediately gets associated with a 3D Model. In addition to the 3D CAD Model, there is still a lot of useful legacy data that does not match the concepts of a digital enterprise. I wrote and spoke about this topic a year ago, among others at PI 2017 Berlin; you can check the presentation on SlideShare: How digital transformation affects PLM

Back to the various aspects of Model-Based

My first post, Model-Based – an introduction, described my intentions and what I wanted to explain. I got some interesting feedback and insights from my readers. Some of the people who responded understood that the crucial characteristic of the model-based enterprise is to use models to master a complex environment. Business Models, Mathematical Models and System Models are all part of a model-based enterprise, and none of them necessarily relates to the 3D CAD model.

Why Model-Based?

Because this is an approach to master complex environments! If you are studying the concepts for a digital enterprise model, it is complex. Artificial intelligence and predictive actions all need a model to deliver. The interaction and responses related to my first blog post did not show any problems – only a positive mindset to explore further. For example, if you read this blog post from Contact, you will see the message came across very well: Model-Based in Model-Based Systems Engineering – what’s up?

Where the confusion started

My second post, Why Model-Based? The 3D CAD Model, was related to model-based working, focusing on the various aspects of the 3D CAD model, without going into all the details. In particular, in the PLM world, there is a lot of discussion around Model-Based Design or Model-Based Definition, where new concepts are discussed to connect engineering and manufacturing in an efficient and modern data-driven way. Lifecycle Insights, Action Engineering, Engineering.com, PTC, Tech-Clarity and many more companies are publishing information related to the model-based engineering phase.

Here I was surprised by Oleg’s blog with his post Model-Based Confusion in 3D CAD and PLM.

If you read his post, you get the impression that the model-based approach is just a marketing issue instead of a significant change towards a digital enterprise. I quote:

Here is the thing… I don’t see much difference between saying PLM-CAD integration sharing data and information for downstream processes and “model-driven” data sharing. It might be a terminology thing, but data is managed by CAD-PLM tools today and accessed by people and other services. This is how things are working today. If model-driven is an approach to replace 2D drawings, I can see it. However, 2D replacement is something that I’ve heard 20 years ago. However, 2D drawings are still massively used by manufacturing companies despite some promises made by CAD vendors long time ago.

I was surprised by the simplicity of this quote, as if CAD vendors were responsible for new ways of working. In particular, automotive and aerospace companies are pushing for a model-based connection between engineering and manufacturing to increase quality, shorten time to market and reduce handling costs. The model-based definition is not just a marketing issue, as you can read from the benefits reported by Jennifer Herron (Re-use your CAD – the model-based CAD handbook – describing practices and benefits already in 2013) or Tech-Clarity (The How-To Guide for adopting model-based definition – describing practices and benefits – sponsored by SolidWorks).

Oleg’s post unleashed several reactions from people who shared his opinion (read the comments here). They are all confused: it is all about marketing / let’s not change / too complex. Responses you usually hear from a generation that does not feel and understand the new approaches of a digital enterprise. If you were in the field working with multiple customers trying to understand the benefits of model-based definition, you would not worry about terminology – you would try to understand it and make it work.

Model-Based – just marketing?

In his post, Oleg refers to CIMdata’s explanation of the various aspects of model-based in the context of PLM. Instead of just referring to the meaning of the various acronyms, Peter Bilello (CIMdata) presented at the latest PDT conference (Oct 2017, Gothenburg) an excellent story covering the various model-based aspects; actually, the whole conference was dedicated to the Model-Based Enterprise, which illustrates that it is not a vendor marketing issue. You can read my comments from the vendor-neutral conference here: The weekend after PDT Europe 2017 Part 1 and Part 2.

There were some dialogues on LinkedIn this weekend, and I promised to publish this post first before continuing on the other aspects of a model-based enterprise. Just today Oleg published a follow-up post related to this topic: Model-Based marketing in CAD and PLM, where again the tone and blame are directed at the PLM/CAD vendors, as you can see from his conclusion:

I can see “mode-based” as a new and very interesting wave of marketing in 3D CAD and PLM.  However, it is not pure marketing and it has some rational. The rational part of model-based approach is to have information model combined from 3D design and all connected data element. Such model can be used as a foundation for design, engineering, manufacturing, support, maintenance. Pretty much everything we do. It is hard to create such model and it is hard to combine a functional solution from existing packages and products. You should think how to combine multiple CAD systems, PLM platforms and many other things together. It requires standards. It requires from people to change. And it requires changing of status quo. New approaches in data management can change siloed world of 3D CAD and PLM. It is hard, but nothing to do with slides that will bring shiny words “model-base”. Without changing of technology and people, it will remain as a history of marketing

Again it shows a narrow mindset on the future of a model-based enterprise. When it comes to standards, I recommend you register for and watch CIMdata’s educational webinar: Model-Based Enterprise and Standards. John MacKrell, CIMdata’s chairman, gives an excellent overview and status of the model-based enterprise initiative. After having studied and digested all the links in this post, I challenge you to make up your own mind. The picture below comes from John’s presentation, an illustration of where we currently are with model-based definition.

 

Conclusion

The challenge of modern businesses is that too often we conclude too quickly on complex issues, or we frame new developments because they do not fit our purpose. You know it from politics. Be aware it is also valid in the world of PLM. Innovation and a path to a modern digital enterprise do not come easy – you need to invest and learn all the aspects. To be continued (and I do not have all the answers either).


When PLM – Product Lifecycle Management – was introduced, one of the main drivers was to provide an infrastructure for collaboration and for sharing product information across the whole lifecycle. The top picture shows my impression of what PLM could mean for an organization at that time. The PLM circle showed a sequential process from concept, through planning, development and manufacturing, towards after sales and/or services when relevant. PLM would provide centralization and continuity of data. Through this continuity we could break down the information silos in a company.

Why do we want to break down the silos?

You might ask yourself: what is wrong with silos if they perform in a consistent manner? Oleg Shilovitsky recently wrote about it: How PLM can separate data and organization silos. Read the post for the full details; I will stay with Oleg’s conclusion:

Keep process and organizational silos, but break data silos. This is should be a new mantra by new PLM organization in 21st century. How to help designers, manufacturing planners and support engineers to stay on the same BOM? By resolving this problem, organization will preserve current functional structure, but will make their decisions extremely data drive and efficient. The new role of PLM is to keep organizational and process silos, but connect data silos. This is a place where new cloud based multi-tenant technologies will play key role in the future organization transformation from the vision of no silo extended enterprise to organized functional silos connected by common understanding of data.

When I read this post I had so much to comment on, which led to this post. Let me share my thoughts related to this conclusion and hopefully it helps in future discussions. Feel free to join the discussion:

Keep process and organizational silos, but break data silos. This is should be a new mantra by new PLM organization in 21st century

For me, “Keep process and organizational silos ….. “ is exactly the current state of classical PLM, where PLM concepts are implemented to provide data continuity within a siloed organization. When you can stay close to the existing processes, the implementation becomes easier. Less business change is needed, and the focus is mainly on efficiency gains by creating access to information.

Most companies do not want to build their data continuity themselves and therefore select and implement a PLM system that provides the data continuity, currently mainly around the various BOM-views. By selecting a PLM system, you get a lot of data integration done for you by the vendor. Perhaps not as user-friendly as every user would expect; however, no company has been able to build a 100% user-friendly PLM system yet, which is the big challenge for all enterprise systems. Therefore PLM vendors provide a lot of data continuity for you, without the need for your company to take responsibility for this.

And if you know SAP, they go even further. Their mantra is that when using SAP PLM, you do not even need to integrate with ERP. You can still have long discussions with companies when it comes to PLM and ERP integrations. The main complexity is not the technical interface but the agreement on who is responsible for which data sets during the product lifecycle. This should be clarified even before you start talking about a technical implementation. SAP claims that this effort is not needed in their environment; however, they just shift the problem more towards the CAD side. Engineers do not feel comfortable with SAP PLM when engineering is driving the success of the company. It is like the Swiss army knife; every tool is there, but do you want to use it for your daily work?

In theory a company does not need to buy a PLM system. You could build your own PLM system, based on existing infrastructure capabilities. CAD integrations might be trickier; however, this you could solve by connecting to their native environments. For example, Microsoft presented at several PDT conferences an end-to-end PLM story based on Microsoft technology. Microsoft “talks PLM” during these conferences, but does not deliver a PLM system – they deliver the technologies.

The real 21st-century paradigm

What is really needed for the 21st century is to break down the organizational silos, as current ways of working are becoming less and less applicable to a modern enterprise. The usage of software has a major impact on how we can work in the future. Software does not follow the linear product process. Software comes with incremental deliveries all the time, and yes, the software still requires hardware to perform. Modern enterprises try to become agile, being able to react quickly to trends and innovation options to bring higher and different value to their customers. Related to product innovation, this means that the linear, sequential go-to-market process is too slow and requires too much data manipulation through non-value-added activities.

All leading companies in the industry are learning to work in a more agile mode with multidisciplinary teams that work like startups. Find an incremental benefit, rapidly develop and test it, interact with the market and deliver it. These teams require real-time data coming from all stakeholders, therefore the need for data continuity. But also the need for data quality, as there is no time to validate data all the time – too expensive, too slow.

Probably these teams will not collaborate along the various BOM-views, but more along digital models, both describing product specifications and system behavior. The BOM is not the best interface to share system information. The model-based enterprise with its various representations is more likely to be the backbone for the new future in the 21st century. I wrote about this several times, e.g. item-centric or model-centric.

And New cloud-based multi-tenant technologies …

As Oleg writes in his conclusion:

This is a place where new cloud-based multi-tenant technologies will play key role in the future organization transformation from the vision of no silo extended enterprise to organized functional silos connected by common understanding of data.

From the academic point of view, I see the beauty of new cloud-based multi-tenant technologies. Quickly build an environment that provides information for specific roles within the organization – however, will this view be complete enough? What about data dictionaries, or is every integration a customization?

When talking with companies in the real world, they are not driven by technology – they are driven by processes. They do not like to break down the silos as it creates discomfort and the need for business transformation. And there is no clear answer at this moment. What is clear is that leading companies invest in business change first before looking into the technology.

Conclusion

Sometimes too much academic and wishful thinking from technology providers creates excitement. Technology is not the biggest game changer for the 21st century. It will be the new ways of working and business models related to a digital enterprise that require breaking organizational silos. And these new processes will create the demand for new technologies, not the other way around.

Break down the walls!

In my earlier post, PLM 2018 my focus, your input, I invited you to send PLM-related questions that would spark a dialogue. By coincidence, Oleg Shilovitsky wrote a post with the catchy title: Why traditional PLM ranking is dead. PLM ranking 2.0. Read this post and the comments if you want to follow this dialogue.

Oleg reacts in this post to the discussion that had started around the Forrester Wave ranking of PLM vendors, which on its own is a challenging topic. I know from my experience that these rankings depend very much on a mix of functions and features, but are also profoundly influenced by the slideware and marketing power of these PLM vendors. Oleg also quotes Joe Barkai’s post, Ranking PLM Vendors, to illustrate that this kind of ranking does not bring a lot of value as there is so much commonality between these systems.

I agree with Oleg and Joe. PLM ranking does not make sense for companies selecting a PLM solution. The rankings are more of an internal PLM show, useful for the organizing consultancy companies to conduct, but in the end it is a discussion about who has the biggest and most effective button. Companies need to sell themselves and differentiate.

Do we need consultancy?

We started a dialogue in the comments of Oleg’s blog post, where I mentioned that PLM is not about selecting a solution from a vendor; there are many other facets related to a PLM implementation. First of all, the industry your company is active in. No solution fits all industries.

But before selecting a solution, you first need to understand what the company wants to achieve in the future. What is the business strategy, and how can PLM support this business strategy?

In most cases, a strategy is future-oriented and not about consolidating the current status quo. Therefore I believe a PLM implementation is always done in the context of a business transformation, which is most of the time not only related to PLM – it is about People, Processes and then the tools.

Oleg suggests that this complexity is created by the consulting business, as he writes:

Complex business and product strategies are good for consulting business you do. High level of complexity with high risk of failure for expensive PLM projects is a perfect business environment to sell consulting. First create complexity and then hire consulting people to explain how to organize processes and build business and product strategy. Win-win

Enterprise and engineering IT are hiring consulting to cover their decision process. That was a great point made by Joe Barkai- companies are buying roadmaps and long-term commitments, but rarely technologies. Technologies can be developed, and if even something is missed, you can always acquire independent vendors or technology later – it was done many times by many large ISVs in the past.

Here I agree with a part of the comments. If you hire consultancy firms just for the decision process, it does not make sense. The decision process needs to be owned by the company. Do not let a consultancy company prescribe your (PLM) strategy as there might be mixed interests. However, when it comes to technologies, they are derived from the people and process needs.

So when I write in the comment:

We will not change the current status quo and ranking processes very soon. Technology is an enabler, but you need a top-down push to work different (at least for those organizations that read vendor rankings).

Oleg states:

However, the favorite part of your comments is this – “We will not change the current status quo and ranking processes very soon.” Who are “we”???? Management consulting people?

With “we” I do not mean the consulting people. In general, the management of companies is more conservative than consultants are. It is our human brain that is change-averse and pushes people to stay in a kind of mainstream mode. In that context, the McKinsey article How biases, politics, and egos derail business decisions is a fascinating read about company dynamics. Also, CIMdata published in the past a slide illustrating the gap between vision, real capabilities and what companies are really aiming at.

There is such a big gap between where companies are and what is possible. Software vendors describe the ideal world but do not have a migration path. One of the uncomfortable discussions when evaluating a cloud solution is not necessarily security (topic #1) but your exit strategy. Have you ever thought about what happens to your data in a cloud solution when the vendor raises prices or no longer has a viable business model? These are discussions that need to take place too.

Oleg also quotes CIMdata’s cloud PLM research on how companies are looking for solutions as they are “empowered” by the digital world. Oleg states:

In a digital world, companies are checking websites, technologies, watching YouTube and tried products available online. Recent cloud PLM research published by CIMdata tells that when companies are thinking about cloud PLM, the first check they do is independent software providers recommendations and websites (not business process consultants).

I am wondering about the value of this graph. The first choice is independent software recommendations/websites. Have you ever seen independent software recommendations?

Yes, when it comes to consumer tools. “I like software A because it gives me the freedom to do what I want” or “Software B has so many features for such a low price – great price/value ratio.”

These are the kind of reviews you find on the internet for consumers. Don’t try to find answers on a vendor website as there you will get no details, only the marketing messages.

I understand that software vendors, including Oleg’s company OpenBOM, need to differentiate by explaining that the others are too complex. It is the same message you hear from all the relative PLM newcomers: Aras, Autodesk, …….

All these newcomers provide marketing stories and claim successes because of their tools, whereas in reality the tool is secondary to the success. First, you need the company to have a vision and a culture that matches this tool. Look at the old Gartner picture (the hockey stick projection) of when all is aligned. The impact of the tool is minimal.

Conclusion

Despite the democratization of information, PLM transformations will still need consultants or a well-educated workforce inside your company. Consultants have the advantage of collected experience, which often is not the case when you work inside a company. We should all agree that in the end it is about the business first (human beings are complex) and then the tools (here you can shop on the internet for what matches the vision).

Although this post seems like a ping-pong match of arguments, I challenge you to take part in this discussion. Tell us where you agree or disagree, combined with argumentation, as we should realize the argumentation is the most valuable point.
Your thoughts?

Happy New Year to all of you. A new year traditionally comes with good intentions for the upcoming year. I would like to share my PLM intentions for this year with you and look forward to your opinion. I shared some of my 2017 thoughts in my earlier post: Time for a Break. This year I will focus on the future of PLM in a digital enterprise, current PLM practices and how to be ready for the future.

Related to these activities I will zoom in on people-related topics, like organizational change, business impact and PLM justification in an enterprise. When it happens during the year, or based on your demands, I will zoom in on architectural stuff and best practices.

The future of PLM

Accenture – Digital PLM

At this moment digital transformation is at the top of the hype curve, and the impact of course varies per industry. For sure, at the company’s C-level, managers will be convinced they have the right vision and the company is on the path to success.

Statements like: “We will be the first digital industrial enterprise” or “We are now a software company” impress the outside world and often investors in the beginning.

 

Combined with investments in customer-related software platforms, a new digital world facing the outside is created relatively fast. And small pilots are celebrated as significant successes.

What we do not see is that, to show and reap the benefits of digital transformation, companies need to do more than create a modern, outside-facing infrastructure. We need to be able to connect and improve the internal data flow in an efficient way to stay competitive. Buzzwords like digital thread and digital twin are relevant here.

To my understanding, we are still in the early phases of discovering the ideal architecture and practices for a digital enterprise. PLM vendors and technology companies show us the impressive potential as if the future already exists now. For a reality check, read Marc Halpern (Gartner) in this article on engineering.com – Digital Twins: Beware of Naive Faith in Simplicity.

I will focus this year on future PLM combined with reality, hopefully with your support for real cases.

Current PLM practices

Although my curiosity is focused on future PLM, there is still a journey to go for companies that have just started with PLM.  Before even thinking of a digital enterprise, there is first a need to understand and implement PLM as an infrastructure outside the engineering department.

Many existing PLM implementations are actually more (complex) document management systems supporting engineering data, instead of using all the available capabilities of a modern PLM system. Topics like Systems Engineering, multidisciplinary collaboration, Model-Based Enterprise, EBOM-MBOM handling and non-intelligent numbering are all relevant for current and future PLM.

Not exploring and understanding them in your current business will make the gap towards the future even bigger. Therefore, keep on sending your questions and when time allows I will elaborate. For example, see last year’s PLM dialogue – you find these posts here: PLM dialogue and PLM dialogue (continued). Of course I will share my observations in this domain too when I bump into them.

 

To be ready for the future

The most prominent challenge for most companies, however, is how to transform their existing business towards a modern digital business, where new processes and business opportunities need to be implemented inside an existing enterprise. These new processes and business opportunities are not just simple extensions of the current activities; they need new ways of working, like delivering incremental results through agile and multidisciplinary teams. And these ways of working come combined with never-before-existing interactivity with the market and the customer.

How to convince management that these changes are needed and do not happen without their firm support? It is easier to do nothing and push for small incremental changes. But will this be fast enough? Probably not as you can read from research done by strategic consultancy firms. There is a lot of valuable information available if you invest time in research. But spending time is a challenge for management.

I hope to focus on these challenges too, as all my clients are facing them. Will I be able to help them? I will share successes and pitfalls with you, combined with supporting information that might be relevant for others.

Your input?

A blog is a modern way of communicating with anyone connected in the world. What I would like to achieve this year is to be more interactive. Share your questions – there are no stupid questions as we are all learning. By sharing and learning we should be able to make achievable steps and become PLM winners.

Best wishes to us all and be a winner not a tweeter …..

 

 

This post is a rewrite of an article I wrote on LinkedIn two years ago, modified to my current understanding. If you follow my blog, in particular the posts related to the business change needed to transform a company into a data-driven digital enterprise, you know that one of the characteristics of digital is the real-time availability of information. This has an impact on everyone working in such an organization. My conversations are in the context of PLM (Product Lifecycle Management); however, I assume my observations are valid for other domains too.

Real-time visibility is going to be the big differentiator for future businesses, and in particular, in the PLM domain, this requires a change from document-centric processes towards data-driven processes.

Documents have a lot of disadvantages. Documents lock information in a particular format, and document handling results in sequential processes, where one person/one discipline at a time is modifying or adding content. I described the potential change in my blog post: From a linear world to fast and circular?

From a linear world to fast and circular

In that post, I described that a more agile and iterative approach to bring products and new enhancements to the market should have an impact on current organizations. A linear organization, where products are pushed to the market from concept to delivery, is based on working in silos and will be too slow to compete against future, modern digital enterprises. This is because departmental structures with their own hierarchy block the fast flow of information, and often these silos filter or deform the information. It becomes hard to have a single version of the truth, as every department and its management will push for their measured truth.

A matching business model related to the digital enterprise is a matrix business model, where multidisciplinary teams work together to achieve their mission. An approach that is known in the software industry, where parallel and iterative work is crucial to continuously deliver incremental benefits.

Image:  21stcenturypublicservant.wordpress.com/

In a few of my projects, I discovered this correlation with software methodology that I wanted to share. One of my clients was in the middle of moving from a document-centric approach toward a digital information backbone, connecting the RFQ phase and conceptual BOM through design, manufacturing definition, and production. The target was to have end-to-end data continuity as much as possible, meanwhile connecting the quality and project tasks combined with issues to this backbone.

The result was that each individual had a direct view of their current activities, which could be a significant quantity for some people engaged in multiple projects. Just being able to measure these numbers already led to more insight into an individual’s workload. When we discussed the conceptual dashboard for an individual with the implementation team, it led to questions like: “Can the PLM system escalate tasks and issues to the relevant manager when needed?” and “Can this escalation be done automatically?”

And here we started the discussion. “Why do you want to escalate to a manager?” Escalation will only give more disruption and stress for the persons involved. Isn’t the person qualified enough to decide what is important?

One of the conclusions of the discussion was that currently, due to a lack of visibility of what needs to be done, when and with which urgency, people accept that things get overlooked. So the burning issues get most of the attention, and the manager’s role is to make things burn to get them done.

When discussing further, it was clear that, thanks to the visibility of data, real critical issues will appear at the top of an individual’s dashboard. The relevant person can immediately see what can be achieved and, if not, take action. Of course, there is the opportunity to work on the easy tasks only and to ignore the tough ones (human behavior); however, the dashboard reveals everything that needs to be done – visibility. Therefore, if a person learns to manage their priorities, there is no need for a manager to push anymore, saving time and stress.
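To make this idea tangible, here is a minimal sketch (in Python, with hypothetical task attributes and an illustrative scoring rule – not taken from any specific PLM system) of how a personal dashboard could rank open items so the critical ones surface at the top:

```python
# Minimal sketch: ranking an individual's open items so critical ones surface first.
# The attribute names and the scoring rule are hypothetical illustrations.
from datetime import date

open_items = [
    {"title": "Review ECO-123",     "due": date(2018, 2, 20), "criticality": 3},
    {"title": "Update test plan",   "due": date(2018, 3, 5),  "criticality": 1},
    {"title": "Resolve issue Q-77", "due": date(2018, 2, 15), "criticality": 2},
]

def priority(item, today=date(2018, 2, 14)):
    """Higher criticality and fewer days left push an item to the top."""
    days_left = (item["due"] - today).days
    return item["criticality"] * 10 - days_left

# The dashboard simply shows the open items sorted by this score.
for item in sorted(open_items, key=priority, reverse=True):
    print(item["title"])
```

The exact weighting does not matter here; the point is that once tasks and issues exist as connected data instead of documents, such a ranking can be computed for every individual without a manager having to push.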

The ultimate conclusion of our discussion was: implementing a modern PLM environment brings, first of all, almost 100% visibility, the single version of the truth. This new capability breaks down silos; a department cannot hide activities behind its departmental wall anymore. Digital PLM allows horizontal multidisciplinary collaboration without the need to go through the management hierarchy.

It would mean Power to the People, in case they are stimulated to do so. And this was the message to the management: “You have to change too, empower your people.”

What do you think – will this happen? This was my question in 2015.  Now two years later I can say some companies have seen the potential of the future and are changing their culture to empower their employees working in multidisciplinary teams. Other companies, most of the time with a long history in business, are keeping their organizational structure with levels of middle management and maintain a culture that consolidates the past.

Conclusion

A digital enterprise empowers individuals, allowing companies to become more proactive and agile instead of working within optimized silos. In silos, it appears that middle management does not trust individuals to prioritize their work. The culture of a company and its ability to change are crucial for the empowerment of individuals. Over the last two years there has been progress in understanding the value of empowered multidisciplinary teams.

Is your company already empowering people? Let us know!

Note: After speaking with Simon, one of my readers who always gives feedback from reality, we agreed that multidisciplinary teams are very helpful for organizations. However, you will still need a layer of strategic people securing standard ways of working and future ways of working, as the project teams might be too busy doing their job. We agreed this is the role for modern middle management.
DO YOU AGREE?

Last week I posted my first review of the PDT Europe conference. You can read the details here: The weekend after PDT Europe (part 1). There were some questions related to the abbreviation PDT. Understanding the history of PDT, you will discover it stands for Product Data Technology. Yes, there are many TLAs in this world.

Microsoft’s view on the digital twin

Now back to the conference. Day 2 started with a remote session from Simon Floyd. Simon is Microsoft’s Managing Director for Manufacturing Industry Architecture Enterprise Services and a frequent speaker at PDT. Simon shared with us Microsoft’s viewpoint on the Digital Twin, the strategy to implement a Digital Twin, the maturity status of several of their reference customers and the areas these companies are focusing on. From these customers it was clear most companies focused on retrieving data related to maintenance, providing analytics and historical data. Futuristic scenarios, like using the digital twin for augmented reality or design validation, were not yet the focus. As I discussed in the earlier post, this relates to my observations, where creating a digital thread between products in operation is considered a quick win. Establishing an end-to-end relationship between products in operation and their design requires many more steps to fix. Read my post: Why PLM is the forgotten domain in digital transformation.

When discussing the digital twin architecture, Simon made a particular point about the standards required to connect the results of products in the field. Connecting a digital twin in a vendor-specific framework will create a legacy, vendor lock-in, and a less open environment to use digital twins. A point that I also raised in my presentation later that day.

Simon concluded with a great example of potential future Artificial Intelligence, where an asset, based on its measurements, predicts it will fail before the scheduled maintenance stop and therefore requests to run at lower performance so it can reach the maintenance stop without disruption.

Closing the lifecycle loop

Sustainability and the circular economy have been a theme at PDT for some years now too. In his keynote speech, Torbjörn Holm from Eurostep took us through the global megatrends (Hay Group 2030) and the technology trends (Gartner 2018) and mapped out that technology would be a good enabler to address several of the global trends.

Next Torbjörn took us through the reasons and possibilities (methodologies and tools) for product lifecycle circularity developed through the ResCoM project in which Eurostep participated.

The ResCoM project (Resource Conservative Manufacturing) was a project co-funded by the European Commission and recently concluded. More info at www.rescom.eu

Torbjörn concluded by discussing the necessary framework for the Digital Twin and Digital Thread(s), which should be based on a Model-Based Definition, where ISO 10303 is the best candidate.

Later in the afternoon, there were three sessions in a separate track, related to design optimization for value, circularity and re-use, followed by a panel discussion. Unfortunately, I participated in another track, so I still have to digest the provided materials. Speakers in that track were Ola Isaksson (Chalmers University), Ingrid de Pauw & Bram van der Grinten (IDEAL&CO) and Michael Lieder (KTH Sweden).

Connecting many stakeholders

Rebecca Ihrfors, CIO of the Swedish Defense Material Administration (FMV), shared her plans for transforming the IT landscape to harmonize the currently existing environments and to become a broker between industry and the armed forces (FM). As many of the assets now come with their own data sets and PDM/PLM environments, the overhead to keep up all these proprietary environments is too expensive and fragmented. FMV wants to harmonize the data they retrieve from industry and the way they offer it to the armed forces in a secure way. There is a need for standards and interoperability.

The positive point from this presentation was that several companies in the audience delivering products to the Swedish defense could start to share and align their viewpoints on how they could contribute.

Later in the afternoon, there were three sessions in a separate track related to standards for MBE interoperability and openness that would fit very well in this context. Brian King (Koneksys), Adrian Murton (Airbus UK) and Magnus Färneland (Eurostep) provided various inputs, and as I did not attend these parallel sessions, I will dive deeper into their presentations at a later time.

PLM something has to change – bimodal and more

In my presentation, which you can download from SlideShare here: PLM – something has to change, my main points were related to the fact that companies seem to understand that something needs to happen to really benefit from a digital enterprise. The rigidness of large enterprises and their inhibitors to transform are more related to human issues and incompatibility with the future.

How to deal with this incompatibility was also the theme for Martin Eigner’s presentation (System Lifecycle Management as a bimodal IT approach) and Marc Halpern’s closing presentation (Navigating the Journey to Next Generation PLM).

Martin Eigner’s consistent story was about creating an extra layer on top of the existing (Mode 1) systems and infrastructure, which he illustrated by a concept developed based on Aras.

By providing a new digital layer on top of the existing enterprise, companies can start evolving to a modern environment, where, in the long term, old Mode 1 systems will be replaced by new digital platforms (Mode 2). Oleg Shilovitsky wrote an excellent summary of this approach. Read it here: Aras PLM platform “overlay” strategy explained.

Marc Halpern closed the conference describing his view on how companies could navigate to the Next Generation PLM by explaining in more detail what the Gartner bimodal approach implies. Marc’s story was woven around four principles.

Principle 1 The bimodal strategy as the image shows.

Principle 2 was about Mode 1 thinking in an evolutionary model. Every company has to go through maturity states in their organization, starting from ad hoc, departmental and enterprise-based to harmonized and fully digitally integrated. These maturity steps also have to be taken into account when planning future steps.

Principle 3 was about organizational change management, a topic often neglected or underestimated by product vendors or service providers as it relates to a company culture, not easy to change and navigate in a particular direction.

Finally, Principle 4 was about Mode 2 activities. Here an organization should pilot (in a separate environment), certify (make sure it is a realistic future), adopt (integrate it in your business) and scale (enable this new approach to exist and grow for the future).

Conclusions

This post concludes my overview of PDT Europe 2017. Looking back, there was a quite aligned view of where we are all heading with PLM and related topics. There is the hype and there is reality, and I believe this conference was about reality, giving good feedback to all the attendees on what is really happening and understood in the field. And of course, there is the human factor, which is hard to influence.

Share your experiences and best practices related to moving to the next generation of PLM (digital PLM?)!

 

 

 

PDT Europe is over, and this year it was a surprisingly aligned conference, showing that ideas and concepts align more and more for modern PLM. Håkan Kårdén opened the conference mentioning the event was fully booked, with about 160 attendees from over 19 countries. With a typical attendance of approx. 120 participants, this showed that the theme of the conference, Continuous Transformation of PLM to support the Lifecycle Model-Based Enterprise, was very attractive and real. You can find a history of tweets following the hashtag #pdte17

Setting the scene

Peter Bilello from CIMdata kicked off by bringing some structure to the various Model-Based areas and the Digital Thread. Peter started by mentioning that technology is the least important issue, as organizational culture, changing processes and adapting people’s skills are more critical factors for a successful adoption of modern PLM. Something that would repeatedly be confirmed by other speakers during the conference.

Peter presented a nice slide bringing the Model-Based terminology together on one page. Next, Peter took us through the various digital threads in the different stages of the product lifecycle. Peter concluded with the message that we are still in a learning process, redefining optimal processes for PLM using Model-Based approaches and Digital Threads, and thanks (or due) to digitalization these changes will be rapid. He ended with an overall conclusion that we should keep in mind:


It isn’t about what we call digitalization; It is about delivering value to customers and all other stakeholders of the enterprise

Next, Marc Halpern busted the myth of Digital Twins (according to his session title) and looked into realistically planning for them. I am not sure if Marc smashed any of the myths, although it is sure the Digital Twin is at the top of the hype cycle and we are all starting to look for practical implementations. A digital twin can have many appearances and depends on its usage. For sure it is not just a 3D virtual model.

There are still many areas to consider when implementing a digital twin for your products. Depending on what and how you apply the connection between the virtual and the physical model, you have to consider where your vendor really is in maturity and avoid lock-in to their approach. In particular, in these early stages, you are not sure which technology will last longer, and data ownership and confidentiality will play an important role. And, as opposed to chasing quick wins, make sure your digital twin is open and uses open standards as much as possible to stay open for the future, which also means keep aiming to work with multiple vendors.

Industry sessions

Next, we had industry-focused sessions related to a lifecycle Model-Based enterprise and later in the afternoon a session from Outotec with the title: Managing Installed Base to Unlock Service opportunities.

The first presentation, from Väino Tarandi, professor in IT in Construction at KTH Sweden, presented his findings related to BIM and GIS in the context of the lifecycle, a test bed where PLCS meets IFC. Interesting, as I have been involved in BIM Level 3 discussions in the UK, which was already an operational challenge for stakeholders in the construction industry, now extended with the concept of the lifecycle. So far these projects are at the academic level, and I am still waiting for companies to push and discover the full benefits of an integrated approach.

Concepts for the industrial approach could be learned from Outotec, as you might understand later in this post. Of course the difference is that Outotec is aiming for data ownership along the lifecycle, whereas in the construction industry each silo is often handled by a different contractor.

Fredrik Ekström from the Swedish Transport Administration shared his challenges of managing assets for both road and railway transport – see image on the left. I have worked around this domain in the Netherlands, where asset management for infrastructure and asset management for the rail infrastructure are handled by two different organizations. I believe Fredrik (and similar organizations) could learn from the concepts in other industries. Again, Outotec’s example is also about having relevant information to increase service capabilities, where the Swedish Transport Administration is aiming to have the right data for their services. When you look at the challenges reported by Fredrik, I assume he can find the answers in other industry concepts.

Outotec’s presentation about managing the installed base to unlock service opportunities, delivered by Sami Grönstrand and Helena Guiterrez, was, besides entertaining, easy-to-digest and well-paced. Without being academic, they explained the challenges of a company with existing systems in place moving towards the concepts of a digital twin and the related data management and quality issues. Their practical example illustrated that a clear target – better understanding a customer-specific environment to sell better services – can be achieved by rational thinking and doing, a typical Finnish approach. All this included the “bimodal approach” and people change management.

Future Automotive

Ivar Hammarstadt, Senior Analyst Technology Intelligence for Volvo Cars Corporation, entertained us with a projection toward the future based on 160 years of automotive industry. Interesting, as electric did not seem to be the only way to go for a sustainable future, depending on operational performance demands.

 

Next, Jeanette Nilsson and Daniel Adin from Volvo Group Truck shared their findings from an evaluation project running for more than a year, in which they evaluated the major PLM vendors (Dassault Systemes / PTC / Siemens) on their out-of-the-box capabilities related to 3D product documentation and manufacturing.

They concluded that none of the vendors was able to support the full Volvo Truck complexity in an OOTB manner. Also, it was a good awareness project for the Volvo Trucks organization to understand that a common system for 3D geometry reduces the need for data transfers and manual data validation. Cross-functional iterations can start earlier, and more iterations can be performed. This will support a shortening of lead time and improve product quality. Personally, I believe this was a rather expensive approach to create awareness for such a conclusion, pushing PLM vendors into a competitive pre-sales position for so much detail.

Future Aerospace

Kenny Swope from Boeing talked us through the potential Boeing journey towards a Model-Based Enterprise. Boeing has always been challenging themselves and their partners to deliver environments close to what is possible. Look at the Boeing journey and you can see that already in 2005 they were aiming for an approach that most current manufacturing enterprises cannot meet. And now they are planning their future state.

To approach the future state, Boeing aims to align their business with a single architecture for all aspects of the company. Starting with collecting capabilities (over 400 in 6 levels) and defining value streams (strategic/operational), the next step is mapping the capabilities to the value streams. Part of the process would be to look at whether the components of a value stream could be fulfilled by a service. In this way you design your business for a service-oriented architecture, still independent from any system constraints. As Kenny states, the aerospace and defense industry has a long history and is therefore slow to change, as its culture is rooted in the organization. It will be interesting to hear from Kenny next year how much (mandatory) progress towards a model-based enterprise has been achieved and which values have been confirmed.

Gearing up for day 2

Martin Eigner took us in high-speed mode through his vision and experience working in a bimodal approach with Aras to support legacy environments and a modern federated layer to support the complexity of a digital enterprise where the system architecture is leading. I will share more details on these concepts in my next post, as during day 2 of PDT Europe both Marc Halpern and I talked about this topic, and I will combine it into a more extended story.

The last formal presentation of day one was from Nigel Shaw from Eurostep Ltd, where he took us through the journey of challenges for a model-based enterprise. As there will not be a single model that defines all, it is clear that various models and derived models will exist for a product/system. Interesting was Nigel’s slide showing the multiple models different disciplines can have of an airplane (1948), similar to the famous “swing” cartoon, used to illustrate that every single view can be entirely different from the purpose of the product.

Next, are these models consistent and still describing the same initially specified system? On top of that, even the usage of various modeling techniques and tools will lead to differences in the system. And the last challenge on top is managing the change over the system’s lifecycle. From here Nigel stepped into the need for digital threads to govern the relations between the various views per discipline and lifecycle stage, not only for the physical and the virtual twin. When comparing the needs of a model-based enterprise through its lifecycle, Nigel concluded that using PLCS as a framework provides an excellent fit to manage such complexity.

Finally, after a panel discussion, which was more a collection of opinions as the target was not necessarily to align in such a short time, it was time for the PDT dinner – always an excellent way to share thoughts and verify them with your peers.

Conclusion

Day 1 was over before you knew it, without any moment of boredom, and so, I hope, is this post. Next week I will close my review of the PDT conference with some more details about my favorite topics.

 

As I am preparing my presentation for the upcoming PDT Europe 2017 conference in Gothenburg, I was reading up on experiences relevant to a data-driven approach. During the PDT Europe conference we will share and discuss the continuous transformation of PLM to support the Lifecycle Model-Based Enterprise.

One of the direct benefits is that a model-based enterprise allows information to be shared without the need to convert documents to a particular format, therefore saving costs for resources and bringing unprecedented speed for information availability, like what we are used to having in a modern digital society.

For me, a modern digital enterprise relies on data coming from different platforms/systems and the data needs to be managed in such a manner that it can serve as a foundation for any type of app based on federated data.

This statement implies some constraints. It means that data coming from various platforms or systems must be accessible through APIs / Microservices or interfaces in an almost real-time manner. See my post Microservices, APIs, Platforms and PLM Services. Also, the data needs to be reliable and understandable for machine interpretation. Understandable data can lead to insights and predictive analysis. Reliable and understandable data allows algorithms to execute on the data.

Classical ECO/ECR processes can become highly automated when the data is reliable and the company’s strategy is captured in rules. In a data-driven environment, there will be much more granular data that requires some kind of approval status. We cannot do this manually anymore, as it would kill the company – too expensive and too slow. Therefore the need for algorithms.
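To give a feeling for what “strategy captured in rules” might look like in practice, here is a minimal sketch (in Python, with hypothetical field names, thresholds and routing outcomes – not tied to any specific PLM system or standard) of an algorithm deciding how a granular change request proceeds:

```python
# Minimal sketch: rule-based routing of a granular change request.
# All field names, thresholds and outcomes are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class ChangeRequest:
    item_id: str
    cost_impact: float         # estimated cost delta of the change
    safety_related: bool       # does the change touch a safety-critical function?
    affected_disciplines: int  # how many disciplines consume this data

def route_change(cr: ChangeRequest) -> str:
    """Apply captured company rules to decide how a granular change proceeds."""
    if cr.safety_related:
        return "review-board"        # never auto-approve safety-related changes
    if cr.cost_impact < 500 and cr.affected_disciplines <= 1:
        return "auto-approve"        # low impact: no manual approval needed
    if cr.cost_impact < 5000:
        return "single-approver"     # medium impact: one responsible engineer
    return "review-board"            # high impact: classical ECR/ECO flow

print(route_change(ChangeRequest("P-1042", cost_impact=120.0,
                                 safety_related=False, affected_disciplines=1)))
# -> auto-approve
```

In a real environment such rules would of course be richer and governed themselves, but the principle stays the same: reliable, granular data in, an automated decision out, with people only involved where the rules say they should be.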

What is understandable data?

I have tried to avoid academic language as long as possible, but now we have to be more precise, as we enter the domain of master data management. I was triggered by this recent post from Gartner: Gartner Reveals the 2017 Hype Cycle for Data Management. There are many topics in the hype cycle, and it was interesting to see that Master Data Management is starting to be taken seriously after going through inflated expectations and disillusionment.

This was interesting as two years ago we had a one-day workshop preceding PDT Europe 2015, focusing on Master Data Management in the context of PLM. The attendees at that workshop coming from various companies agreed that there was no real MDM for the engineering/manufacturing side of the business. MDM was more or less hijacked by SAP and other ERP-driven organizations.

Looking back, it is clear to me why in the PLM space MDM was not a real topic at that time. We were, and still are, focusing too much on information stored in files and documents. The only areas touched by MDM were the BOM and part definitions, as these objects also touch the ERP and after-sales domains.

Actually, there are various MDM concepts, and I found an excellent presentation from Christopher Bradley explaining the different architectures on SlideShare: How to identify the correct Master Data subject areas & tooling for your MDM initiative. In particular, I liked the slide below, as it comes close to my experience in the process industry.

Here we see two MDM architectures, the one on the left driven from ERP. The one on the right could be based on the ISO 15926 standard, as the process industry has worked for over 25 years to define a global exchange standard and data dictionary. The process industry was able to reach such a maturity level due to the need to support assets for many years across the lifecycle and the relatively stable environment. Other sectors are less standardized or depend so much on new concepts that it would be hard to have an industry-specific master.

PLM as an Application Specific Master?

If you were to start an MDM initiative in your company now and look for providers of MDM solutions, you would discover that their value propositions are based on technology capabilities, bringing data together from different enterprise systems in the way the customer thinks it should be organized. More of a toolkit approach than an industry approach. And in the cases where there is an industry approach, it is only sporadically related to manufacturing companies. Remember my observation from 2015: manufacturing companies do not have MDM activities related to engineering/manufacturing because it is too complicated, too diverse, too many documents instead of data.

Now, with modern digital PLM, there is a need for MDM to support the full digital enterprise. Therefore, when you combine the previous observations with a recent post on Engineering.com from Tom Gill, PLM Initiatives Take On Master Data Transformation, I came to a new hypothesis:

For companies with a model-based approach that have no MDM in place, the implementation of their Product Innovation Platform (modern PLM) should be based on the industry-specific data definition for their industry.

Tom Gill explains in his post the business benefits and values of using PLM as the source for an MDM approach. In particular, in modern PLM environments, the PLM data model is not only based on the BOM. PLM now encompasses the full lifecycle of a product instead of, as initially, mainly an engineering view. Modern PLM systems, or as CIMdata calls them, Product Innovation Platforms, manage a complex data model based on a model-driven approach. These entities are used across the whole lifecycle and therefore could be the best start for an industry-specific MDM approach. Now only the industries have to follow….

Once data is able to flow, there will be another discussion: who is responsible for which attributes? Bjørn Fidjeland from plmPartner recently wrote: Who owns what data when …? The content of his post is relevant; I would only change the title to Who is responsible for what data when, as I believe in a modern digital enterprise there is no ownership anymore – it is about sharing and responsibilities.

 

Conclusion

Where MDM in the past did not really focus on engineering data due to the classical document-driven approach, now, in modern PLM implementations, the Master Data Model might be based on the industry-specific data elements, managed and controlled through the PLM data model.

 

Do you follow my thoughts / agree?

 

 

Last week I published a dialogue I had with Flip van der Linden, a fellow Dutchman and millennial, eager to get a grip on current PLM. You can read the initial post here: A PLM dialogue.  In the comments, Flip continued the discussion (look here).  I will elaborate on some parts of his comments and hope some others will chime in. It made me realize that in the early days of blogging and LinkedIn, there were a lot of discussions in the comments. Now it seems we are becoming more and more consumers or senders of information, instead of having a dialogue. Do you agree? Let me know.

Point 1

(Flip) PLM is changing – where lies the new effort for (a new generation of) PLM experts.  I believe a huge effort for PLM is successful change management towards ‘business Agility.’ Since a proper response to an ECR/ECO would evidently require design changes impacting manufacturing and even after-sales and/or legal.  And that’s just the tip of the iceberg.

 

You are right, the main challenge for future PLM experts is to explain and support more agile processes, mainly because software has become a major part of the solution. The classical, linear product delivery approach does not match the agile, iterative approach for software deliveries. The ECR/ECO process was established to control hardware changes, in particular because of their big impact on costs. Software changes are extremely cheap and fast to implement, leading to different change procedures. The future of PLM is about managing these two layers (hardware/software) together in an agile way. The solution is that people have to work in multi-disciplinary teams with direct (social) collaboration, and to be efficient this collaboration should be done in a digital way.

A good article to read in this context is Peter Bilello’s article: Digitalisation enabled by product lifecycle management.

 

(Flip) What seems to be missing is an ‘Archetype’ of the ideal transformed organization. Where do PLM experts want to go with these businesses in practice? Personally, I imagine a business where DevOps is the standard, unique products have generic meta-data, personal growth is an embedded business process and supply chain related risks are anticipated on and mitigated through automated analytics. Do you know of such an evolved archetypal enterprise model?

I believe the ideal archetype does not exist yet. We are all learning, and we see examples from existing companies and startups pitching their story for a future enterprise. Some vendors sell a solution based on their own product innovation platform, others on existing platforms, and many new vendors are addressing a piece of the puzzle, to be connected through APIs or microservices. I wrote about these challenges in Microservices, APIs, Platforms and PLM Services.  Remember, it took us "old PLM experts" more than 10-15 years to evolve from PDM towards PLM, riding on an old linear trajectory, caught up by a new wave of iterative and agile processes. Now we need a new generation of PLM experts (or evolving experts) who can combine the new concepts and filter out the nonsense.

Point 2

(Flip) But then given point 2: ‘Model-based enterprise transformations,’ in my view, a key effort for a successful PLM expert would also be to embed this change mgt. as a business process in the actual Enterprise Architecture. So he/she would need to understand and work out a ‘business-ontology’ (Dietz, 2006) or similar construct which facilitates at least a. business processes, b. Change (mgt.) processes, c. emerging (Mfg.) technologies, d. Data structures- and flows, e. implementation trajectory and sourcing.

And then do this from the PLM domain throughout the organization per optimization.  After all a product-oriented enterprise revolves around the success of its products, so eventually, all subsystems are affected by the makeup of the product lifecycle. Good PLM is a journey, not a trip. Or, does a PLM expert merely facilitates/controls this enterprise re-design process? And, what other enterprise ontologism tools and methods do you know of?

This question alone could be a future blog post. Yes, it is crucial to define a business ontology to support the modern flow of information through an enterprise. Products become systems, depending on direct feedback from the market. This last sentence alone already requires a redefinition of change processes and responsibilities. Next, the change towards data granularity introduces new ways of automation, which we will address in the upcoming years. Initiatives like Industry 4.0 / Smart Manufacturing / IIoT all contribute to that. And then there is the need to communicate around a model instead of following the old document path. Read more about it in Digital PLM requires a Model-Based Enterprise. To close this point: I am not aware of anyone who has already worked on and published experiences with this topic, in particular in the context of PLM.

 

Point 3

(Flip) Where to draw the PLM line in a digital enterprise? I personally think this barrier will vanish as Product Lifecycle Management (as a paradigm, not necessarily as a software) will provide companies with continuity, profitability and competitive advantage in the early 21st century. The PLM monolith might remain, but supported by an array of micro services inside and outside the company (next to IoT, hopefully also external data sets).

I believe there is no need to draw a PLM line. As Peter's article, Digitalisation enabled by product lifecycle management, already illustrated, there is a need for a product information backbone along the whole (circular) lifecycle, where product information can interact with other enterprise platforms, like CRM, ERP, MES and BI services. Sometimes we will see overlapping functionality; sometimes we will see the need to bridge the information through microservices. As long as these bridges are data-driven and do not need manual handling or transformation of data, they fit in the future lean digital enterprise.
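As a purely illustrative sketch (the function, field names and payload shape are my own assumptions, not an existing product's API), such a data-driven bridge boils down to mapping released PLM data directly into the structure another platform expects, with no manual re-entry or file conversion in between:

```python
# Hypothetical data-driven bridge between a PLM backbone and ERP.
# All names and the payload shape are invented for the example.
import json


def plm_bom_to_erp_payload(part: dict, bom_lines: list) -> dict:
    """Map released PLM data to the structure an ERP service expects,
    without manual re-entry or file conversion in between."""
    if part["lifecycle_state"] != "Released":
        raise ValueError("Only released data is shared downstream")
    return {
        "materialNumber": part["part_id"],
        "description": part["description"],
        "bom": [
            {"component": line["child"], "quantity": line["quantity"]}
            for line in bom_lines
        ],
    }


# In practice this mapping would sit behind a microservice or API endpoint
# called on demand by ERP, MES or BI services.
part = {"part_id": "P-1001", "description": "Centrifugal pump", "lifecycle_state": "Released"}
bom_lines = [{"parent": "P-1001", "child": "P-2001", "quantity": 2.0}]
print(json.dumps(plm_bom_to_erp_payload(part, bom_lines), indent=2))
```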

Conclusion:

This can be an ongoing dialogue, diving into detailed topics of a modern PLM approach. I am curious to learn from my readers how engaged they are in this topic. Do you still take part in PLM dialogues, or do you only consume? Do you have "tips and tricks" for those who want to shape the future of PLM?


Let your voice be heard! (and give Flip a break)

 

My recent posts were around the words Simple (PLM is not simple) and Simplicity (Human Beings, PLM and Simplicity).  Combined with a blog dialogue with Oleg Shilovitsky (Small manufacturers and search of simple solutions) and comments to these posts, the theme Simple has been discussed in various ways. Simple should not be confused with Simplicity. The conclusion: a PLM implementation should reduce complexity for an organization, aiming for increasing simplicity. The challenge: achieving more simplicity is not simple.

What does simplicity mean in the context of PLM?

My definition would be that compared to the current state, the future state should bring measurable benefits by reducing or eliminating non-value added activities. Typical non-value added PLM activities are collecting data from various disciplines to get a management understanding, conversion of file formats to support other disciplines or collecting and distributing data for change and approval processes.

If you can reduce or eliminate these steps, significant benefits can be achieved: reducing iterations, increasing quality and (re)acting faster to changes. These benefits are the whole idea behind Digital PLM. See Accenture's explanation or read my post: Best Practices or Next Practices.

Simplicity comes from the fact that the user does not need to depend on intermediate people or data formats to have an understanding of "the best so far truth." Empowered users are a characteristic of modern digital processes. Empowered users need different skills from people working in a traditional environment, where the exchange and availability of information are controlled through communication between silos. Some people can make this change; some never will.

What can you do?

On LinkedIn, I found some good suggestions from Peter Weis in his CIO article: The most painful, gut-wrenching part of leading transformation. Peter's post is about the challenges within a company going through a transformation and keeping the pace. My favorite part:

For me, the most difficult and gut-wrenching part of leading our transformation was not the technology involved. It was making and acting on those tough decisions about who was not going to succeed. In some cases, people had been with the company for decades and had been rewarded and encouraged for the very work they were no longer required to do. These were good people, skilled talent, who provided a great service to the company – but the technology and the cultural gap were just too wide for them to bridge.

Peter describes a dilemma that many of us consultants face when implementing a business change. Keeping all employees on board is mission impossible. But what if you want to keep them all on board?

Reducing complexity by making the system rigid?

One of the companies I am currently working with decided to keep all employees on board by demanding a PLM system so rigid and automated that a user cannot make mistakes or wrong decisions. For example: instead of allowing the user to decide which approval path should be chosen, a predefined workflow is started in which all participants are selected by automation. The idea: reduce the complexity for the (older) user. The user does not have to learn how to navigate a new environment and decide what the best option is. There is always one option. Simple, isn't it?

I believe it reduces every user to someone who clicks buttons and writes some comments. That is not real empowerment.

There are two downsides to this approach:

  • To make the PLM system so incredibly rigid, additional customizations are needed (which come at a cost). Even more costly will be future upgrades and the maintenance of every business process change that is currently hard-coded (see the sketch after this list).
  • The system will be so rigid that even future, more digitally native users will dislike it, as it does not challenge them to think. Implementing the past or pushing for the future?
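To illustrate the maintenance argument, here is a simplified, hypothetical sketch contrasting a hard-coded approval path with a configurable one. The roles, change types and rules are invented for the example.

```python
# Invented roles, change types and rules - a sketch, not a vendor's workflow engine.
from typing import List, Optional


# Approach 1: the approval path is hard-coded in a customization.
# Every business process change means writing, testing and re-validating code,
# and carrying that code along with every future upgrade.
def approvers_hard_coded(change_type: str) -> List[str]:
    if change_type == "design":
        return ["engineering_manager", "quality"]
    if change_type == "manufacturing":
        return ["process_engineer", "plant_manager"]
    raise ValueError("A new change type requires a new customization")


# Approach 2: the routing rules live in configuration that the business can
# maintain, and the user can still add reviewers where that makes sense.
ROUTING_RULES = {
    "design": ["engineering_manager", "quality"],
    "manufacturing": ["process_engineer", "plant_manager"],
}


def approvers_configurable(change_type: str,
                           extra_reviewers: Optional[List[str]] = None) -> List[str]:
    return ROUTING_RULES.get(change_type, []) + (extra_reviewers or [])


print(approvers_hard_coded("design"))
print(approvers_configurable("design", extra_reviewers=["after_sales"]))
```

In the first approach, every new change type or reorganization means new custom code; in the second, it is a configuration change, and the user keeps a degree of judgment.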

My challenge:

  • A rigid system creates the illusion that it is secure and simple for the existing employees (whom you do not want to challenge to change).
  • A rigid system leads by default to complexity in the future with high costs of change.

I am curious to learn how you would approach my challenge (a PLM consultant's challenge): making the customer happy, or being the "bad news" guy who creates fear for the future?
I assume this is a topic many PLM consultants face nowadays – your opinion?
