You are currently browsing the tag archive for the ‘Digital Enterprise’ tag.

If you have followed my blog over the past 10 years, I hope you realize that I am always trying to bring sense to the nonsense, while still looking into the future where new opportunities are imagined. Perhaps this is due to my Dutch background (our motto: try to be normal – do not stand out) and the influence of working with Israelis (a country where almost everyone is a startup).

Given this background, I enjoy the current discussion with Oleg Shilovitsky related to potential PLM disruptions. We worked together for many years at SmarTeam, a PDM/PLM disruptor of the previous century. Oleg has continued his passion for introducing potentially disruptive solutions (Inforbix / OpenBOM), whereas I became more and more intrigued by human behavior related to PLM. For that reason, I have the human brain in my logo.

Recently we started our "The death of …" dialogue, with the following episodes:

Jan 14th – How to democratize PLM knowledge and disrupt traditional consulting experience

Jan 21st – The death of PLM Consultancy

Jan 22nd – Why PLM consultants are questioning new tools and asking about cloud exit strategy?

Here is episode 4 – PLM Consultants are still alive and have an exit strategy

Where we agree

We agreed that traditional consultancy practices related to PLM ranking and selection processes are outdated. The Forrester Wave publication was the cause of our discussion, for two reasons:

  1. All major PLM systems cover roughly the same 80 percent of functionality. Therefore, there is no need to build, send and evaluate lengthy requirements lists for all potential candidates and then recommend the preferred vendor. A waste of time, as besides the requirements there is much more to evaluate than just the tool selection.
  2. Many major consultancy firms have PLM practices, most of the time related to the major PLM providers. Selecting one of the major vendors is usually not a problem for your reputation, hence the importance of these rankings. Consultancy firms will almost never recommend disruptive toolsets.

PLM business transformation

At this point, we are communicating at a different wavelength. Oleg talks about PLM business transformation as follows:

Cloud is transforming PLM business. Large on-premise PLM projects require large capital budget. It is a very good foundation for existing PLM consulting business. SaaS subscription is a new business model and it can be disruptive for lucrative consulting deals. Usually, you can see a lot of resistance when somebody is disrupting your business models. We’ve seen it in many places and industries. It happened with advertising, telecom and transportation. The time is coming to change PLM, engineering and manufacturing software and business.

I consider new business models less relevant compared to the need for a PLM practice transformation. Tools like Dropbox, perhaps disruptive for PDM systems, implement previous-century methodology (document-driven / file-based models). We are moving from item-centric towards a model-driven future.

The current level of PLM practices is related to an item-centric approach, the domain where OpenBOM is also bringing disruption.
The future, however, is about managing complex products, where products are actually systems, a combination of hardware and software. Hardware and software have completely different lifecycles, and all major PLM vendors are working on an overall solution concept to incorporate both. If you cannot manage software in the context of hardware in the future, you are at risk. Each PLM vendor has a different focus area due to its technology history. I will address this topic during the upcoming PLMx conference in Hamburg. For a model-driven enterprise, I do not yet see a working combination of disruptors.

Cloud security and Cloud exit strategy

Oleg does not really see the impact of the cloud as related to the potential death of PLM consulting as you can read here:

I agree, cloud might be still not for everyone. But the adoption of cloud is growing and it is becoming a viable business model and technology for many companies. I wonder how “cloud” problem is related to the discussion about the death of PLM consulting. And…  here is my take on this. It is all about business model transformation.

I am not convinced that in PLM the cloud is the only viable business model. Imagine a rigid on-premise PLM system: part of the benefits of a cloud-based implementation come from low upfront costs and scalable IT. However, the cloud also pushes companies towards a no-customization strategy – configuration of the user interface only. This is a "secret" benefit for cloud PLM vendors, as they can say "NO" to the end users, of course within given usability constraints. Saying "NO" to the customer is lesson one for every current PLM implementation, as everyone knows the problem of costly upgrades later.

Also, make a 5-10 year cost evaluation of your solution and take the risk of rising subscription fees into account. No vendor will drop the price unless forced by the outside world. The initial benefits may be paid back later because of the different business model.

Cloud exit strategy and standards

When you make a PLM assessment, as experienced PLM consultants usually do, there is a need to consider an exit strategy. What happens if your current PLM cloud vendor(s) cease to exist or migrate to a new generation of technology and data modeling? Every time new technology was introduced, we thought it was going to be THE future. The future is unpredictable. However, I can predict that 10 years from now we will live with different PLM concepts.

There will be changes and migrations, and cloud PLM vendors will never promote standardized export methods (unless forced) to liberate the data in their systems. Export tools could be a niche market for PLM partners who understand data standards. Håkan Kårdén, no finder's fee required; Eurostep has the experience in-house.

 

Free downloads – low barriers to start

A significant difference in opinion between Oleg and me is Oleg's belief in bottom-up, DIY PLM as part of PLM democratization, versus my belief in top-down business transformation supported by PLM. When talking about Aras, Autodesk, and OpenBOM, Oleg states:

All these tools have one thing in common. You can get the tool or cloud services for free and try it by yourself before buying. You can do it with Aras Innovator, which can be downloaded for free using enterprise open source. You can subscribe for Autodesk Fusion Lifecycle and OpenBOM for trial and free subscriptions. It is different from traditional on-premise PLM tools provided by big PLM players. These tools require months and sometimes even years of planning and implementation including business consulting and services.

My experience with SmarTeam might influence this discussion. SmarTeam was also a disruptive PDM solution, thanks to its easy data modeling and Microsoft-based customization capabilities, like Aras. Customers and implementers could build what they wanted; you only needed to know Visual Basic. As I supported the field, mitigating installed SmarTeam implementations, the problem often was that SmarTeam had been implemented as a system replicating/automating current practices.

Here Henry Ford's well-known statement applies.

Implementations became troublesome when SmarTeam provided new, similar business logic out of the box. Customers needed to decide whether to de-customize and use the OOTB features, or to miss out on the benefits of new standard capabilities. SmarTeam had an excellent business model for service providers and for IT hobbyists/professionals in companies. Upgradeable SmarTeam implementations were those that remained close to the core, but meanwhile we were 5 – 8 years further down the line.

I believe we still need consultants to tell and coach companies towards new ways of working related to the current digitization. Twenty-year-old concepts won't work anymore. Consultants need a digital mindset and must think holistically. Fitting technology and tools will be there in the future.

Conclusion

The discussion is not over, and as I have already reached more than 1000 words, I will stop. Too many words for a modern pitch, not enough for a balanced debate. Oleg and I will continue in Hamburg, and we both hope others will chime in, providing balanced insights into this discussion.

To be continued …..?

 


Happy New Year to all of you. A new year traditionally comes with good intentions for the upcoming year. I would like to share my PLM intentions for this year with you and look forward to your opinion. I shared some of my 2017 thoughts in my earlier post: Time for a Break. This year I will focus on the future of PLM in a digital enterprise, current PLM practices, and how to be ready for the future.

Related to these activities, I will zoom in on people-related topics, like organizational change, business impact and PLM justification in an enterprise. When relevant during the year, or based on your demands, I will zoom in on architectural stuff and best practices.

The future of PLM

Accenture – Digital PLM

At this moment digital transformation is at the top of the hype curve, and the impact of course varies per industry. For sure, C-level managers will be convinced they have the right vision and that their company is on the path to success.

Statements like "We will be the first digital industrial enterprise" or "We are now a software company" impress the outside world, and often investors, in the beginning.

 

Combined with investments in customer-related software platforms, a new digital world facing the outside is created relatively fast. And small pilots are celebrated as significant successes.

What we do not see is that, to show and reap the benefits of digital transformation, companies need to do more than create a modern, outside-facing infrastructure. We need to be able to connect and improve the internal data flow in an efficient way to stay competitive. Buzzwords like digital thread and digital twin are relevant here.

To my understanding, we are still in the early phases of discovering the ideal architecture and practices for a digital enterprise. PLM vendors and technology companies show us the impressive potential as if the future already exists now. Get a reality check from Marc Halpern (Gartner) in this article on engineering.com – Digital Twins: Beware of Naive Faith in Simplicity.

I will focus this year on future PLM combined with reality, hopefully with your support for real cases.

Current PLM practices

Although my curiosity is focused on future PLM, there is still a journey to go for companies that have just started with PLM.  Before even thinking of a digital enterprise, there is first a need to understand and implement PLM as an infrastructure outside the engineering department.

Many existing PLM implementations are actually more (complex) document management systems supporting engineering data, instead of using all the capabilities of a modern PLM system. Topics like Systems Engineering, multidisciplinary collaboration, Model-Based Enterprise, EBOM-MBOM handling, and non-intelligent numbering are all relevant for current and future PLM.

Not exploring and understanding them in your current business will make the gap towards the future even bigger. Therefore, keep on sending your questions, and when time allows I will elaborate. For example, see last year's PLM dialogue – you can find these posts here: PLM dialogue and PLM dialogue (continued). Of course, I will share my observations in this domain too when I bump into them.

 

To be ready for the future

The most prominent challenge for most companies, however, is how to transform their existing business towards a modern digital business, where new processes and business opportunities need to be implemented inside an existing enterprise. These new processes and business opportunities are not just simple extensions of the current activities; they need new ways of working, like delivering incremental results through agile and multidisciplinary teams, combined with never-before-existing interactivity with the market and the customer.

How to convince management that these changes are needed and that they will not happen without firm management support? It is easier to do nothing and push for small incremental changes. But will this be fast enough? Probably not, as you can read from research done by strategic consultancy firms. There is a lot of valuable information available if you invest time in research. But spending time is a challenge for management.

I hope to focus on these challenges too, as all my clients are facing them. Will I be able to help them? I will share successes and pitfalls with you, combined with supporting information that might be relevant for others.

Your input?

A blog is a modern way of communicating with anyone connected in the world. What I would like to achieve this year is to be more interactive. Share your questions – there are no stupid questions as we are all learning. By sharing and learning we should be able to make achievable steps and become PLM winners.

Best wishes to us all and be a winner not a tweeter …..

 

 

Dear readers, it is time for me to relax and focus on Christmas and the New Year coming up. I realize that not everyone who reads my posts will be in the same mood. You might have had your New Year three months ago or have New Year coming up in a few months. This is the beauty and challenge of a global, multicultural, diverse society. Imagine we were all doing the same – would you prefer such a world? Perhaps it would give peace of mind (no surprises, everything predictable); however, for human survival we need innovation and new ways of life.

This mindset is also applicable to manufacturing companies. Where in the past companies were trying to optimize and standardize their processes, driven by efficiency and predictability, now, due to the dynamics of a globally connected world, businesses need to become extremely flexible, yet still reliable and profitable.

How will they make the change?

Digital transformation is one of the buzzwords pointing to this transition process. Companies need to go through a change to become flexible for the future and deliver products or solutions for the individual customer. Currently, companies invest in digital transformation, most of the time in areas that bring direct visibility to the outside world or their own management, not necessarily delivering profitable results, as a recent article from McKinsey illustrated: The case for digital reinvention.

And for PLM?

I have investigated digital transformation in relation to PLM with particular interest this year, as I worked with several companies that preached to the outside world that they are changing or were going to make a change. However, what is happening at the PLM level? Most of the time nothing. Some new tools, and perhaps some new disciplines like software engineering, become more critical. However, the organization and people do not change their ways of working, as in particular the ongoing business and related legacy are blocking the change.

Change to what?

This is another difficult question to answer. There is no clearly defined path to share. Yes, modern PLM will be digital PLM; it will be about data-driven, connected information. A final blueprint for digital PLM does not exist yet. We are all learning and guessing. You can read my thoughts here:

Software vendors in various domains are all contributing to support a modern digital product innovation management future. But where to start?  Is it the product innovation platform? Is it about federated solutions? Model-Based? Graph-databases? There are even people who want to define the future of PLM.  We can keep throwing pieces of the puzzle on the table, but all these pieces will not lead to a single solved puzzle. There will be different approaches based on your industry and your customers. Therefore, continuous learning and investing time to understand the digital future is crucial. This year’s PDT Europe conference was an excellent event to learn and discuss the themes around a model-based lifecycle enterprise. You can read my reviews here: The weekend after PDT Europe 2017 part 1 and part 2.

The next major event where I plan to discuss and learn about modern PLM topics is the upcoming PI PLMx event in Hamburg on February 19-20 organized by MarketKey. Here I will discuss the Model-Based Enterprise and lecture about the relation between PLM and digital transformation. Hoping to see some of you there for exciting discussions and actions.

Conclusion

Merry Christmas to those who are celebrating, and a happy, healthy and prosperous 2018 to all of you. Thanks for your feedback. Keep on asking questions or proposing other thoughts, as we are all learning. The world keeps on turning; however, for me the next two weeks will be a time to relax.

Talk to you in 2018!

 

 

For those who have followed my blog over the years, it must be clear that I am advocating for a digital enterprise, explaining the benefits of a data-driven approach where possible. In the past month, an old topic with new insights came to my attention: intelligent Part Numbers – yes or no? Or do we mean Product Numbers?

 

 

What’s the difference between a Part and a Product?

In a PLM data model, you need support for both Parts and Products, and there is a significant difference between these two types of business objects. A Product is an object facing the outside world, which can be company- (B2B) or consumer-related (B2C). Examples of B2C products are the Apple iPhone 8, the famous IKEA Billy, my Garmin 810, and my Dell OptiPlex 3050 MFXX8. Examples of B2B products are the ABB synchronous motor AMZ 2500 and the FESTO standard cylinder DSBG. Products have a name, and if there are variants of the product, they also have an additional identifier.

A Part represents a physical object that can be purchased or manufactured. A combination of Parts appears in a BOM. In case these Parts are not yet resolved for manufacturing, this BOM might be the Engineering BOM or a generic Manufacturing BOM. In case the Parts are resolved for a specific manufacturing plant, we talk about the MBOM.
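To make the distinction concrete, here is a minimal sketch of Products and Parts as separate business objects. All class and field names (and the sample data) are my own illustration, not the data model of any particular PLM system:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Part:
    """Company-internal business object: can be purchased or manufactured."""
    part_id: str        # unique identifier, internal to the company
    description: str

@dataclass
class BOMLine:
    part: Part
    quantity: int

@dataclass
class Product:
    """Outside-facing business object that customers refer to by name."""
    name: str           # e.g. "Billy"
    variant_id: str     # additional identifier for a specific variant
    ebom: List[BOMLine] = field(default_factory=list)          # not yet resolved for manufacturing
    mboms: Dict[str, List[BOMLine]] = field(default_factory=dict)  # plant -> resolved MBOM

# One product variant; two plant-specific MBOMs resolved from the same EBOM
screw = Part("P-000123", "Wood screw 4x30")
billy = Product(name="Billy", variant_id="002.638.50")
billy.ebom.append(BOMLine(screw, 24))
billy.mboms["plant-A"] = [BOMLine(screw, 24)]
billy.mboms["plant-B"] = [BOMLine(screw, 26)]
print(len(billy.mboms))   # 2
```

The point of the sketch is only that the Product and the Part are different objects with different identifiers and lifecycles, and that several MBOMs can exist behind one Product variant.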

I have discussed the relation between Parts and Products in an earlier post, Products, BOMs and Parts, which was a follow-up on my LinkedIn post, The importance of a PLM data model. Although both posts were written more than two years ago, the content is still valid. In the upcoming year, I will address this topic of products further, including software and services moving to solutions / experiences.

Intelligent number for Parts?

As Parts are company-internal business objects, I would like to state that if a company is serious about becoming a digital enterprise, Parts should have meaningless unique identifiers. Unique identifiers are the link between discipline- or application-specific data sets. For example, see the image below, where I imagined attribute sets for a Part, based on engineering and manufacturing data sets.

Apart from the unique ID, there might be a common set of attributes that will be exposed in every connected system. For example, a description, a classification and one or more status attributes might be needed.

Note 1: A revision number is not needed when you create a new unique ID for every new version of the part. This practice is already common in the electronics industry. In the old mechanical domain, we are used to having revisions, in particular for make-parts, based on Form-Fit-Function rules.

Note 2: The description might be generated automatically based on a concatenation of some key attributes.
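As a sketch of these two notes, a new part version can simply receive a fresh, meaningless unique ID (here a UUID, one possible choice among others), while the description is generated from a concatenation of key attributes. The attribute names are invented for illustration:

```python
import uuid

def new_part(material: str, size: str, finish: str) -> dict:
    """Create a part with a meaningless unique ID and a generated description."""
    return {
        # No revision number needed: a new version of the part gets a new ID (Note 1)
        "id": uuid.uuid4().hex,
        "material": material,
        "size": size,
        "finish": finish,
        # Description concatenated from key attributes (Note 2)
        "description": f"{material} {size} {finish}",
    }

v1 = new_part("Steel", "M8x40", "zinc-plated")
v2 = new_part("Steel", "M8x45", "zinc-plated")   # new version = new unique ID
print(v1["id"] != v2["id"])    # True
print(v1["description"])       # Steel M8x40 zinc-plated
```

Note that the ID itself carries no meaning at all; everything a human needs to read lives in the common attributes exposed by the connected systems.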

Of course, if you are aiming for a fully digital enterprise, and I think you should, do not waste time fixing the past. In some situations, I learned that an external consultant had recommended the company rename their old meaningful part numbers to the new non-intelligent part numbering scheme. There are two mistakes here. First, renumbering is too costly, as all referenced information would have to be updated. Secondly, as long as the old part numbers are unique IDs within the enterprise, there is no need to change: the connectivity of information should not depend on how the unique ID is formatted.

Read more if you want here: The impact of Non-Intelligent Part Numbers

Intelligent numbers for Products?

If the world were 100% digital and connected, we could work with non-intelligent product numbers. However, this is a stage beyond my current imagination. For products, we will still need a number that customers can refer to when they communicate with their supplier, vendor or service provider. For many high-tech products the product name and type might be enough. When I talk about the Samsung S5 G900F 16G, the vendor knows which configuration I am referring to. Still, it is important to realize that behind these specifications, different MBOMs might exist due to different manufacturing locations or times.

However, when I refer to the IKEA Billy, there are too many options to describe the right one consistently in words; therefore you will find a number on the website, e.g. 002.638.50. This unique ID connects directly to a single sell-able configuration. Behind this unique ID, different MBOMs might also exist, for the same reason as for the Samsung telephone. The number is a connection to the sales configuration and should not be too complicated, as people need to be able to read and recognize it in the warehouse.
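A hypothetical sketch of this idea: the short, readable product number resolves to exactly one sell-able configuration, while several plant-specific MBOMs may exist behind it. The configuration text and MBOM identifiers are invented for illustration:

```python
# The readable product number points to one sales configuration;
# several MBOMs can exist behind it, one per manufacturing site.
catalog = {
    "002.638.50": {   # short number a customer can read and recognize
        "configuration": "Billy bookcase, white, 80x28x202 cm",
        "mboms": {
            "plant-PL": "MBOM-PL-7741",   # identifiers invented for illustration
            "plant-CN": "MBOM-CN-1102",
        },
    },
}

def resolve(product_number: str):
    """Return the single sell-able configuration and all MBOMs behind it."""
    entry = catalog[product_number]
    return entry["configuration"], list(entry["mboms"].values())

config, mboms = resolve("002.638.50")
print(config)       # Billy bookcase, white, 80x28x202 cm
print(len(mboms))   # 2
```

The customer only ever sees the short number; which MBOM was used to build their particular unit stays an internal matter.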

Conclusion

There is a big difference between Product and Part numbers because of the intended scope of these business objects. Parts will soon exist in connected, digital enterprises and therefore no longer need a meaningful number. Products need to be identified by consumers anywhere around the world, who are not yet able or willing to have a digital connection with their vendors. Therefore, short and understandable numbers will remain needed to support exact communication between consumer and vendor.

When I started working with SmarTeam Corp. in 1999, the company had several product managers, who were responsible for the whole lifecycle of a component or technology. The Product Manager was the person who defined the features for a new release and provided the justification for these new features internally inside R&D. In addition, the Product Manager had the external role of visiting customers to understand their needs for future releases, and of building and explaining a coherent vision to the outside and internal world. The Product Manager had a central role, connecting all stakeholders.

In the ideal situation, the Product Manager was THE person who could speak in R&D language about the implementation of features, could talk with marketing and documentation teams to explain the value and expected behavior, and could talk with the customer describing the vision, meanwhile verifying the product's vision and roadmap based on their inputs.

All these expected skills make the role of a Product Manager challenging. If the person is too "techy", he/she will enjoy working with R&D but have a hard time understanding customer demands. From the other side, if the Product Manager is excellent at picking up customer and market feedback, he/she might not be heard by R&D and not get the expected priorities. For me, it has always been clear that in the software world a "bi-directional" Product Manager is crucial to success.

Where are the Product Managers in the Manufacturing Industry?

Approximately four years ago, new concepts related to digitalization for PLM became more evident. How could digital continuity connect the various disciplines around the product lifecycle and therefore provide end-to-end visibility and traceability? When speaking of end-to-end visibility, most of the time companies talked about the way they designed and delivered products; visibility of what is happening mostly stopped after manufacturing. The diagram to the left, showing a typical Build To Order organization, illustrates the classical way of thinking. There is an R&D team working on innovation, typically a few engineers, while most of the engineers work in Sales Engineering and Manufacturing Preparation to define and deliver a customer-specific order. In theory, once the product is delivered, none of the engineers will be further involved, and it is up to the Service Department to react to what is happening in the field.

A classical process in the PLM domain is the New Product Introduction process for companies that deliver products in large volumes to the market, most of the time configurable to address various customer or pricing segments. This process is usually linear and is described either as one stream or as two parallel streams. In the latter case, the R&D department develops new concepts and prepares the full product for the market, while the operational department starts in parallel, initially involved in strategic sourcing, and later scaling up manufacturing, disconnected from R&D.

I described these two processes because they both illustrate how disconnected the source (R&D / Sales) is from the final result in the field, which in both cases is managed by the service department. A typical story that I learned from many manufacturing companies is that in the end it is hard to get a full picture of what is happening across the whole lifecycle; how external feedback (market & customers) can influence any stage is undefined. I used the diagram below before companies were even talking about a customer-driven digital transformation. Just understanding end-to-end what is happening with a product along the lifecycle is already a challenge for a company.

Putting the customer at the center

Modern business is about having customer or market involvement in the whole lifecycle of the product. And as products become more and more a combination of hardware and software, it is the software that allows the manufacturer to provide incremental innovation to their products. However, to innovate in a manner that matches or even exceeds customer demands, information from the outside world needs to travel as fast as possible through an organization. If this is done in isolated systems and documents, the journey will be cumbersome and too slow to allow a company to act fast enough. Here digitization comes in, making information directly available as data elements instead of documents with their own file formats and systems to author them. The ultimate dream is a digital enterprise where data "flows", advocated by some manufacturing companies for several years already.

In the previous paragraph, I talked about the need to have an infrastructure in place for people in an organization to follow the product along the complete lifecycle, to be able to analyze and improve the customer experience. However, you also need to create a role in the organization for a person responsible for combining insights from the market and leading the various disciplines in the organization: R&D, Sales, and Services. And this is precisely the role of a Product Manager.

This role is very common in the world of software development, but not yet recognized in manufacturing companies. If a Product Manager role already exists in your organization, he/she can tell you how complicated it currently is to get an overall view of the product, and which benefits a digital infrastructure would bring to the job. Once the Product Manager is well supported and recognized in the organization, the right skill set to prioritize or discover actions/features will make the products more attractive for consumers. Here the company will benefit.

Conclusion

If your company does not have the role of a Product Manager in place, your business is probably not yet engaged well enough in the customer journey. There will be broken links and costly processes to get a fast response to the market. Consider the role of a Product Manager, as it has emerged in the software business.

NOTE 1: Just before publishing this post, I read an interesting post from Jan Bosch: Structure Eats Strategy. It fits well in this context.

NOTE 2: The existence of a Product Manager might be a digital maturity indicator for a company, just as, for classical PLM maturity, the handling of the MBOM (PDM/PLM/ERP) gives insight into a company's PLM maturity.

Related to the MBOM, please read: The Importance of a PLM data model – EBOM and MBOM

 

 

 

 

 

This post is a rewrite of an article I wrote on LinkedIn two years ago, modified to my current understanding. If you are following my blog, in particular the posts related to the business change needed to transform a company towards a data-driven digital enterprise, you know that one of the characteristics of digital is the real-time availability of information. This has an impact on everyone working in such an organization. My conversations are in the context of PLM (Product Lifecycle Management); however, I assume my observations are valid for other domains too.

Real-time visibility is going to be the big differentiator for future businesses, and in particular, in the PLM domain, this requires a change from document-centric processes towards data-driven processes.

Documents have a lot of disadvantages. They lock information in a particular format, and document handling results in sequential processes, where one person / one discipline at a time is modifying or adding content. I described the potential change in my blog post: From a linear world to fast and circular?

From a linear world to fast and circular

In that post, I described that a more agile and iterative approach to bringing products and new enhancements to the market should have an impact on current organizations. A linear organization, where products are pushed to the market from concept to delivery, is based on working in silos and will be too slow to compete against future, modern digital enterprises. This is because departmental structures with their own hierarchy block the fast flow of information, and often these silos filter or deform the information. It becomes hard to have a single version of the truth, as every department and its management will push for their own measured truth.

A business model matching the digital enterprise is a matrix model, where multidisciplinary teams work together to achieve their mission. This approach is well known in the software industry, where parallel and iterative work is crucial to continuously deliver incremental benefits.

Image:  21stcenturypublicservant.wordpress.com/

In a few of my projects, I discovered a correlation with software methodology that I want to share. One of my clients was in the middle of moving from a document-centric approach toward a digital information backbone, connecting the RFQ phase and the conceptual BOM through design, manufacturing definition, and production. The target was to have end-to-end data continuity as much as possible, meanwhile connecting quality and project tasks, combined with issues, to this backbone.

The result was that each individual had a direct view of their current activities, which could be a significant quantity for some people engaged in multiple projects. Just being able to measure these numbers already led to more insight into an individual's workload. When we discussed the conceptual dashboard for an individual with the implementation team, it led to questions like: "Can the PLM system escalate tasks and issues to the relevant manager when needed?" and "Can this escalation be done automatically?"

And here the discussion started. "Why do you want to escalate to a manager?" Escalation will only bring more disruption and stress for the persons involved. Isn't the person qualified enough to decide what is important?

One of the conclusions of the discussion was that currently, due to a lack of visibility of what needs to be done, when, and with which urgency, people accept that things get overlooked. So the burning issues get most of the attention, and the manager's role becomes making things burn to get them done.

When discussing further, it was clear that thanks to the visibility of data, the really critical issues will appear at the top of an individual's dashboard. The relevant person can immediately see what can be achieved and, if not, take action. Of course, there is the temptation to work on the easy tasks only and ignore the tough ones (human behavior); however, the dashboard reveals everything that needs to be done – visibility. Therefore, if a person learns to manage their priorities, there is no need for a manager to push anymore, saving time and stress.
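To make this tangible, here is a minimal sketch of how such a dashboard could rank an individual's open tasks so that the critical ones surface on top without a manager having to escalate. The task attributes and the urgency weighting are invented for illustration and not taken from any specific PLM system:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Task:
    title: str
    due: date       # deadline of the task
    severity: int   # 1 = low ... 5 = critical (e.g. a blocking quality issue)

def dashboard(tasks: list[Task], today: date) -> list[Task]:
    """Rank open tasks so the most urgent appear on top.

    Urgency combines severity with the days left until the deadline,
    so a tough-but-critical item cannot hide behind easy ones.
    """
    def urgency(t: Task) -> float:
        days_left = (t.due - today).days
        # Overdue or due today: infinitely urgent.
        return float("inf") if days_left <= 0 else t.severity / days_left
    return sorted(tasks, key=urgency, reverse=True)

today = date(2017, 1, 10)
tasks = [
    Task("Review supplier drawing", date(2017, 1, 20), severity=2),
    Task("Resolve blocking quality issue", date(2017, 1, 12), severity=5),
    Task("Update conceptual BOM", date(2017, 1, 15), severity=3),
]
for t in dashboard(tasks, today):
    print(t.title)
```

Note that nothing here notifies a manager: the ranking itself replaces the escalation, which is exactly the behavioral point of the discussion above.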

The ultimate conclusion of our discussion was: implementing a modern PLM environment brings, first of all, almost 100 % visibility, the single version of the truth. This new capability breaks down silos; a department can no longer hide activities behind its departmental wall. Digital PLM allows horizontal multidisciplinary collaboration without the need to go through the management hierarchy.

It would mean Power to the People, provided they are encouraged to use it. And this was the message to the management: "You have to change too: empower your people."

What do you think – will this happen? This was my question in 2015. Now, two years later, I can say some companies have seen the potential of the future and are changing their culture to empower their employees working in multidisciplinary teams. Other companies, often with a long history in business, keep their organizational structure with layers of middle management and maintain a culture that consolidates the past.

Conclusion

A digital enterprise empowers individuals, allowing companies to become more proactive and agile instead of working in optimized silos. In silos, it appears that middle management does not trust individuals to prioritize their own work. The culture of a company and its ability to change are crucial for the empowerment of individuals. Over the last two years, there has been progress in understanding the value of empowered multidisciplinary teams.

Is your company already empowering people? Let us know!

Note: After speaking with Simon, one of my readers who always gives feedback from reality, we agreed that multidisciplinary teams are very helpful for organizations. However, you will still need a layer of strategic people securing standard and future ways of working, as the project teams might be too busy doing their job. We agreed this is the role for modern middle management.
DO YOU AGREE?

Last week I posted my first review of the PDT Europe conference. You can read the details here: The weekend after PDT Europe (part 1). There were some questions related to the abbreviation PDT. Looking at the history of PDT, you will discover it stands for Product Data Technology. Yes, there are many TLAs in this world.

Microsoft’s view on the digital twin

Now back to the conference. Day 2 started with a remote session from Simon Floyd. Simon is Microsoft's Managing Director for Manufacturing Industry Architecture Enterprise Services and a frequent speaker at PDT. Simon shared Microsoft's viewpoint on the Digital Twin, the strategy to implement a Digital Twin, the maturity status of several of their reference customers, and the areas these companies are focusing on. From these customers, it was clear most companies focus on retrieving data related to maintenance, providing analytics and historical data. Futuristic scenarios, like using the digital twin for augmented reality or design validation, were still the exception. As I discussed in the earlier post, this matches my observations: creating a digital thread between products in operation is considered a quick win, while establishing an end-to-end relationship between products in operation and their design still requires many steps to be solved. Read my post: Why PLM is the forgotten domain in digital transformation.

When discussing the digital twin architecture, Simon made a particular point about the standards required to connect the results from products in the field. Connecting a digital twin in a vendor-specific framework will create legacy, vendor lock-in, and a less open environment for using digital twins. A point that I also raised in my presentation later that day.

Simon concluded with a great example of potential future Artificial Intelligence, where an asset, based on its measurements, predicts a failure before the scheduled maintenance stop and therefore requests to run at lower performance so it can reach the maintenance stop without disruption.

Closing the lifecycle loop

Sustainability and the circular economy have been themes at PDT for some years now. In his keynote speech, Torbjörn Holm from Eurostep took us through the global megatrends (Hay Group 2030) and the technology trends (Gartner 2018) and mapped out how technology could be a good enabler for addressing several of these global trends.

Next, Torbjörn took us through the reasons for, and the possibilities (methodologies and tools) of, product lifecycle circularity, developed through the ResCoM project in which Eurostep participated.

The ResCoM project (Resource Conservative Manufacturing) was a project co-funded by the European Commission and recently concluded. More info at www.rescom.eu

Torbjörn concluded by discussing the necessary framework for the Digital Twin and Digital Thread(s), which should be based on a Model-Based Definition, where ISO 10303 is the best candidate.

Later in the afternoon, there were three sessions in a separate track related to design optimization for value, circularity and re-use, followed by a panel discussion. Unfortunately, I participated in another track, so I still have to digest the provided materials. Speakers in that track were Ola Isaksson (Chalmers University), Ingrid de Pauw & Bram van der Grinten (IDEAL&CO) and Michael Lieder (KTH Sweden).

Connecting many stakeholders

Rebecca Ihrfors, CIO of the Swedish Defence Materiel Administration (FMV), shared her plans for transforming the IT landscape to harmonize the existing environments and to become a broker between industry and the armed forces (FM). As many of the assets now come with their own data sets and PDM/PLM environments, the overhead of keeping up all these proprietary environments is too expensive and fragmented. FMV wants to harmonize the data they retrieve from industry and the way they offer it to the armed forces in a secure way. There is a need for standards and interoperability.

The positive point of this presentation was that several companies in the audience delivering products to the Swedish Defence could start to share and adapt their viewpoints on how they could contribute.

Later in the afternoon, there were three sessions in a separate track related to standards for MBE interoperability and openness that would fit very well in this context. Brian King (Koneksys), Adrian Murton (Airbus UK) and Magnus Färneland (Eurostep) provided various inputs, and as I did not attend these parallel sessions, I will dive deeper into their presentations at a later time.

PLM something has to change – bimodal and more

In my presentation, which you can download from SlideShare here: PLM – something has to change, my main points were related to the fact that companies apparently understand that something needs to happen to really benefit from a digital enterprise. The rigidness of large enterprises and their inhibitors to transform are more related to human factors and incompatibility with the future.

How to deal with this incompatibility was also the theme for Martin Eigner’s presentation (System Lifecycle Management as a bimodal IT approach) and Marc Halpern’s closing presentation (Navigating the Journey to Next Generation PLM).

Martin Eigner’s consistent story was about creating an extra layer on top of the existing (Mode 1) systems and infrastructure, which he illustrated by a concept developed based on Aras.

By providing a new digital layer on top of the existing enterprise, companies can start evolving to a modern environment, where, in the long-term, old Mode 1 systems will be replaced by new digital platforms (Mode 2). Oleg Shilovitsky wrote an excellent summary of this approach. Read it here: Aras PLM  platform “overlay” strategy explained.

Marc Halpern closed the conference describing his view on how companies could navigate to the Next Generation PLM by explaining in more detail what the Gartner bimodal approach implies. Marc’s story was woven around four principles.

Principle 1 was the bimodal strategy, as the image shows.

Principle 2 was about Mode 1 thinking in an evolutionary model. Every company has to go through maturity stages in its organization, from ad-hoc, via departmental and enterprise-based, to harmonized and fully digitally integrated. These maturity stages also have to be taken into account when planning future steps.

Principle 3 was about organizational change management, a topic often neglected or underestimated by product vendors and service providers, as it relates to company culture, which is not easy to change or navigate in a particular direction.

Finally, Principle 4 was about Mode 2 activities. Here an organization should pilot (in a separate environment), certify (make sure it is a realistic future), adopt (integrate it in your business) and scale (enable this new approach to exist and grow for the future).

Conclusions

This post concludes my overview of PDT Europe 2017. Looking back, there was quite an aligned view of where we are all heading with PLM and related topics. There is hype and there is reality, and I believe this conference was about reality, giving all the attendees good feedback on what is really happening and understood in the field. And of course, there is the human factor, which is hard to influence.

Share your experiences and best practices related to moving to the next generation of PLM (digital PLM?)!

 

 

 

As I am preparing my presentation for the upcoming PDT Europe 2017 conference in Gothenburg, I have been reading about relevant experiences with a data-driven approach. During the PDT Europe conference, we will share and discuss the continuous transformation of PLM to support the Lifecycle Model-Based Enterprise.

One of the direct benefits is that a model-based enterprise allows information to be shared without documents having to be converted to a particular format, thereby saving resources and bringing unprecedented speed of information availability, like what we are used to in a modern digital society.

For me, a modern digital enterprise relies on data coming from different platforms/systems and the data needs to be managed in such a manner that it can serve as a foundation for any type of app based on federated data.

This statement implies some constraints. It means that data coming from various platforms or systems must be accessible through APIs/microservices or interfaces in an almost real-time manner. See my post: Microservices, APIs, Platforms and PLM Services. Also, the data needs to be reliable and understandable for machine interpretation. Understandable data can lead to insights and predictive analysis. Reliable and understandable data allows algorithms to execute on it.
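As an illustration of this federation idea, the sketch below uses invented stand-in functions for the platform APIs (a real implementation would make HTTP calls to the PLM, ERP and IoT services). The point is that an app merges granular data on a shared key instead of converting and copying documents between systems:

```python
# Hypothetical stand-ins for microservice calls to three platforms.
# In reality each would be an almost real-time REST request returning JSON.
def plm_service(part_id: str) -> dict:
    return {"part_id": part_id, "revision": "B", "maturity": "Released"}

def erp_service(part_id: str) -> dict:
    return {"part_id": part_id, "stock": 42, "unit_cost": 17.50}

def iot_service(part_id: str) -> dict:
    return {"part_id": part_id, "failures_last_30d": 1}

def federated_part_view(part_id: str) -> dict:
    """Merge data from several platforms into one app view.

    No document conversion is involved: each platform remains the master
    of its own attributes, and the app federates them on the shared part_id.
    """
    view: dict = {}
    for service in (plm_service, erp_service, iot_service):
        view.update(service(part_id))
    return view

print(federated_part_view("PART-001"))
```

Any app built this way stays thin: it owns no product data itself and simply composes whatever the underlying platforms expose.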

Classical ECO/ECR processes can become highly automated when the data is reliable and the company's strategy is captured in rules. In a data-driven environment, there will be much more granular data that requires some kind of approval status. We cannot do this manually anymore, as it would kill the company: too expensive and too slow. Hence the need for algorithms.
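A toy sketch of what "strategy captured in rules" might look like. The specific rules and thresholds below are hypothetical, but they show how granular approvals could be automated while anything risky falls back to a human approver:

```python
def auto_approve(change: dict) -> bool:
    """Decide whether an engineering change can be approved by an algorithm.

    Each rule encodes a piece of company strategy. A change that fails any
    rule is not rejected; it is routed to a human approver instead.
    """
    rules = [
        lambda c: c["cost_impact"] < 1000,         # low financial risk only
        lambda c: not c["safety_critical"],        # never auto-approve safety items
        lambda c: c["data_quality_score"] >= 0.9,  # only act on reliable data
    ]
    return all(rule(change) for rule in rules)

change = {"cost_impact": 250, "safety_critical": False, "data_quality_score": 0.95}
print("auto-approved" if auto_approve(change) else "route to human approver")
```

Note the third rule: it makes the dependency explicit that automation is only safe when the underlying data is reliable, which is the whole argument of this section.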

What is understandable data?

I have tried to avoid academic language as long as possible, but now we have to be more precise as we enter the domain of master data management. I was triggered by this recent post from Gartner: Gartner Reveals the 2017 Hype Cycle for Data Management. There are many topics in the hype cycle, and it was interesting to see that Master Data Management is starting to be taken seriously after going through inflated expectations and disillusionment.

This was interesting as two years ago we had a one-day workshop preceding PDT Europe 2015, focusing on Master Data Management in the context of PLM. The attendees at that workshop coming from various companies agreed that there was no real MDM for the engineering/manufacturing side of the business. MDM was more or less hijacked by SAP and other ERP-driven organizations.

Looking back, it is clear to me why MDM was not a real topic in the PLM space at that time. We were, and still are, focusing too much on information stored in files and documents. The only areas touched by MDM were the BOM and part definitions, as these objects also touch the ERP and after-sales domains.

Actually, there are various MDM concepts, and I found an excellent presentation from Christopher Bradley explaining the different architectures on SlideShare: How to identify the correct Master Data subject areas & tooling for your MDM initiative. In particular, I liked the slide below as it comes close to my experience in the process industry

Here we see two MDM architectures: the one on the left driven from ERP, the one on the right possibly based on the ISO 15926 standard, as the process industry has worked for over 25 years to define a global exchange standard and data dictionary. The process industry was able to reach such a maturity level due to the need to support assets across their lifecycle for many years, and due to its relatively stable environment. Other sectors are less standardized, or depend so much on new concepts that it would be hard to have an industry-specific master.

PLM as an Application Specific Master?

If you were to start an MDM initiative in your company now and look for providers of MDM solutions, you would discover that their value propositions are based on technology capabilities: bringing data together from different enterprise systems in the way the customer thinks it should be organized. More a toolkit approach than an industry approach. And where there is an industry approach, it is rarely related to manufacturing companies. Remember my observation from 2015: manufacturing companies do not have MDM activities related to engineering/manufacturing because it is too complicated, too diverse, with too many documents instead of data.

Now, with modern digital PLM, there is a need for MDM to support the full digital enterprise. Therefore, when I combined the previous observations with a recent post on Engineering.com from Tom Gill: PLM Initiatives Take On Master Data Transformation, I came to a new hypothesis:

For companies with a model-based approach that have no MDM in place, the implementation of their Product Innovation Platform (modern PLM) should be based on the industry-specific data definition for their industry.

Tom Gill explains in his post the business benefits and value of using PLM as the source for an MDM approach. In particular, in modern PLM environments, the PLM data model is not only based on the BOM. PLM now encompasses the full lifecycle of a product instead of, as initially, mainly an engineering view. Modern PLM systems, or as CIMdata calls them, Product Innovation Platforms, manage a complex data model based on a model-driven approach. These entities are used across the whole lifecycle and could therefore be the best start for an industry-specific MDM approach. Now only the industries have to follow…

Once data is able to flow, there will be another discussion: who is responsible for which attributes? Bjørn Fidjeland from plmPartner recently wrote: Who owns what data when …? The content of his post is relevant; I would only change the title to "Who is responsible for what data when," as I believe in a modern digital enterprise there is no ownership anymore – it is about sharing and responsibilities.

 

Conclusion

Where MDM in the past did not really focus on engineering data due to the classical document-driven approach, in modern PLM implementations the master data model might be based on industry-specific data elements, managed and controlled from the PLM data model.

 

Do you follow my thoughts / agree ?

 

 

Last week I published a dialogue I had with Flip van der Linden, a fellow Dutchman and millennial, eager to get a grip on current PLM. You can read the initial post here: A PLM dialogue. In the comments, Flip continued the discussion (look here). I will elaborate on some parts of his comments and hope some others will chime in. It made me realize that in the early days of blogging and LinkedIn, there were a lot of discussions in the comments. Now it seems we increasingly become consumers or senders of information, instead of having a dialogue. Do you agree? Let me know.

Point 1

(Flip) PLM is changing – where lies the new effort for (a new generation of) PLM experts.  I believe a huge effort for PLM is successful change management towards ‘business Agility.’ Since a proper response to an ECR/ECO would evidently require design changes impacting manufacturing and even after-sales and/or legal.  And that’s just the tip of the iceberg.

 

You are right; the main challenge for future PLM experts is to explain and support more agile processes, mainly because software has become a major part of the solution. The classical, linear product delivery approach does not match the agile, iterative approach of software deliveries. The ECR/ECO process was established to control hardware changes, in particular because of their big impact on costs. Software changes are extremely cheap and fast to implement, leading to different change procedures. The future of PLM is about managing these two layers (hardware/software) together in an agile way. The consequence of this approach is that people have to work in multi-disciplinary teams with direct (social) collaboration, and to be efficient, this collaboration should be done digitally.

A good article to read in this context is Peter Bilello’s article: Digitalisation enabled by product lifecycle management.

 

(Flip) What seems to be missing is an ‘Archetype’ of the ideal transformed organization. Where do PLM experts want to go with these businesses in practice? Personally, I imagine a business where DevOps is the standard, unique products have generic meta-data, personal growth is an embedded business process and supply chain related risks are anticipated on and mitigated through automated analytics. Do you know of such an evolved archetypal enterprise model?

I believe the ideal archetype does not exist yet. We are all learning, and we see examples from existing companies and startups pitching their story for a future enterprise. Some vendors sell a solution based on their own product innovation platform, others build on existing platforms, and many new vendors address a piece of the puzzle, to be connected through APIs or microservices. I wrote about these challenges in Microservices, APIs, Platforms and PLM Services. Remember, it took us "old PLM experts" more than 10-15 years to evolve from PDM towards PLM, riding on an old linear trajectory, now caught up by a new wave of iterative and agile processes. We need a new generation of PLM experts (or evolving experts) who can combine the new concepts and filter out the nonsense.

Point 2

(Flip) But then given point 2: ‘Model-based enterprise transformations,’ in my view, a key effort for a successful PLM expert would also be to embed this change mgt. as a business process in the actual Enterprise Architecture. So he/she would need to understand and work out a ‘business-ontology’ (Dietz, 2006) or similar construct which facilitates at least a. business processes, b. Change (mgt.) processes, c. emerging (Mfg.) technologies, d. Data structures- and flows, e. implementation trajectory and sourcing.

And then do this from the PLM domain throughout the organization per optimization.  After all a product-oriented enterprise revolves around the success of its products, so eventually, all subsystems are affected by the makeup of the product lifecycle. Good PLM is a journey, not a trip. Or, does a PLM expert merely facilitates/controls this enterprise re-design process? And, what other enterprise ontologism tools and methods do you know of?

This question alone could be a future blog post. Yes, it is crucial to define a business ontology to support the modern flow of information through an enterprise. Products become systems, depending on direct feedback from the market. This last sentence alone already requires a redefinition of change processes and responsibilities. Next, the change towards data granularity introduces new ways of automation, which we will address in the upcoming years. Initiatives like Industry 4.0 / Smart Manufacturing / IIoT all contribute to that. And then there is the need to communicate around a model instead of following the old document path. Read more about it in Digital PLM requires a Model-Based Enterprise. To close this point: I am not aware of anyone who has already worked on and published experiences with this topic, in particular in the context of PLM.

 

Point 3

(Flip) Where to draw the PLM line in a digital enterprise? I personally think this barrier will vanish as Product Lifecycle Management (as a paradigm, not necessarily as a software) will provide companies with continuity, profitability and competitive advantage in the early 21st century. The PLM monolith might remain, but supported by an array of micro services inside and outside the company (next to IoT, hopefully also external data sets).

I believe there is no need to draw a PLM line. As Peter's article Digitalisation enabled by product lifecycle management already illustrated, there is a need for a product information backbone along the whole (circular) lifecycle, where product information can interact with other enterprise platforms, like CRM, ERP, MES and BI services. Sometimes we will see overlapping functionality; sometimes we will see the need to bridge the information through microservices. As long as these bridges are data-driven and do not need manual handling or transformation of data, they fit in the future, lean digital enterprise.

Conclusion:

This can be an ongoing dialogue, diving into detailed topics of a modern PLM approach. I am curious to learn from my readers how engaged they are in this topic. Do you still take part in PLM dialogues, or do you just consume? Do you have tips and tricks for those who want to shape the future of PLM?


Let your voice be heard! (and give Flip a break)

 

It is already the 6th consecutive year that MarketKey has organized the Product Innovation conference, with its primary roots in PLM. For me, the PI conferences have always been a checkpoint for changes and progress in the field.

This year, about 100 companies participated in the event with the theme: Digital Transformation. From Hype to Value? Sessions were split into three major streams: digital transformation, extended PLM, and business-enabled innovation, interspersed with general keynote speeches. I wanted to attend all sessions (and I will do so virtually later through PI.TV), but in this post my observations come from the extended PLM sessions.

From iCub to R1

Giorgio Metta gave an overview of the RobotCub project, where teams are working on developing a robot that can support human beings in our day-to-day life. Some of us are used to industrial robots and understand their constraints. A robot that interacts with human beings is far more complex, and its development is still in the early stages. This type of robot needs to learn and interpret its environment while remaining accurate and safe for the persons interacting with it.

One of the interesting intermediate outcomes of the project is that a human-like robot with legs and arms is far too expensive and complicated to handle. Excellent for science fiction movies, but in reality too difficult to control in its balance and movements.

This was an issue with the iCub robot. Now Giorgio and the teams are working on the new R1 robot, maybe not "as human" as the iCub, but more affordable. It is not only the mechanics that challenge the researchers; the software supporting the artificial intelligence required for a self-learning, safely performing robot is also still in its early days.


An inspiring keynote speech to start the conference.

Standardizing PLM Components

The first extended PLM session was from Guido Klette (Rheinmetall), describing the challenges the Rheinmetall group faces in developing and supporting PLM needs. The group has several PDM/PLM-like systems in place. Guido does not believe one size fits all for every business in the group; they already have several PLM "monsters" in their organization. For more adequate support, Rheinmetall has defined a framework of PLM components and dependencies, enabling a more granular choice of functionality to meet individual businesses' needs.

Rheinmetal components

A challenge for this approach, identified by a question from the audience, is that it is a very scientific approach that does not address the differences in culture between countries. Guido agreed and mentioned that, despite cultural differences, companies joining the Rheinmetall group were most of the time happy to adhere to such a structured approach.

My takeaway: the component approach fits very well with the modern thinking that PLM should not be supported by a single "monster" system but can be addressed by components that together provide the right business process support.

PLM as a business asset

Björn Axling gave an excellent presentation describing the PLM perspective of the Husqvarna group. He addressed the external and internal challenges and opportunities for the group in a structured and logical approach that probably applies to most manufacturing companies in a global market. Björn explained that in the Husqvarna group, PLM is considered a business approach; more than ever, Product Lifecycle Management needs to be viewed as the DNA of a company, which was the title of one of his slides.

Husqvarna

I like his eleven key imperatives (see the picture above), in particular key imperative #9, which is often forgotten:

Take definitions, nomenclature and data management very seriously – the devil is in the details.

This point will always come back to bite you if you do not give it the needed attention from the start. Of course, the other ten points are also relevant. The challenge in every PLM project is to get these points addressed and understood in your company.

How to use PLM to enable Industry 4.0?

Martin Eigner's presentation built upon his consistent message that PDM and PLM should evolve into SysLM (System Lifecycle Management), with a growing need for Model-Based Systems Engineering (MBSE) support.

The title of the presentation related to Industry 4.0, focusing on innovation for Germany's manufacturing industry. Germany has always been strong in manufacturing, less so in product innovation. Martin mentioned that later this year the German government will start another initiative, Engineering 4.0, which should be exciting for our PLM community.

Martin elaborated on the fact that end-to-end support for SysLM can be achieved through a backbone based on linked data. The lesson learned and preached: do not try to solve all product information views in a single system.

Eigner-Bimodal

For me, it was interesting to see that Martin also picked up on the bimodal approach for PLM, required to support a transition to a modern digital enterprise (see picture). We cannot continue to build upon our old PLM environments to support future digital businesses.

PLM and Digital Transformation

In my afternoon session (Jos Voskuil), I shared the observation that companies invest a lot in digital transformation downstream by introducing digital platforms for ERP, CRM, MES and operations. PLM is often the forgotten platform that needs to change to support a digital enterprise with all its benefits. You can see my presentation here on SlideShare. I addressed the bimodal approach as discussed in a previous blog post, introduced in Best Practices or Next Practices.

Conclusions

In case your company is not ready yet for a digital transformation or a bimodal approach, I addressed the need to become model-driven instead of document-driven. And of course, for a digital enterprise, the quality of the data counts. I wrote about these topics recently: Digital PLM requires a Model-Based Enterprise and The importance of accurate data: ACT NOW!

Closed-Loop PLM

The last extended PLM presentation of day 1 was given by Felix Nyffenegger, professor for PLM/CAx at HSR (University of Applied Sciences in Rapperswil, Switzerland). Felix shared his discovery journey into Industry 4.0 and IoT, combined with experiences from the digitalLab@HSR, leading to the concept of closed-loop PLM.

ClosedLoop

I liked in particular how Felix brought the various views on the product together into one diagram, telling the full story of closed-loop PLM – necessary for a modern implementation framework.

A new age for airships

The last presentation of the day was from Chris Daniels, describing the journey of Hybrid Air Vehicles with their Airlander 10 project. Where classical airships, the most infamous perhaps being the Hindenburg, have disappeared due to their flaws, the team at Hybrid Air Vehicles built upon the concept of airships in a defense project with the target of delivering a long-endurance multi-intelligence vehicle. The advantage of airships is that they can stay in the air for several days, serving as a communication hotspot or rescue ship for places hard to reach with traditional aircraft or helicopters. The Airlander can operate for 5 days without going back to a base, which is extremely long compared to other aircraft.

airlander

The Airlander project is a typical example of incremental innovation used to optimize and extend the purpose of an airship. Combined with the fact that Chris was an excellent speaker, it made a great closure of the day.

Conclusion

This post is just an extract of one day and one stream of the conference, already too large for a traditional blog post. Next week I will follow up with day two and respond beyond 140 characters to the tweet below:

WhyNotInPLM
