You are currently browsing the category archive for the ‘PLM’ category.

Perhaps an ambiguous title this time, as it can be interpreted in various ways. I think this ambiguity is one of the most significant problems with PLM: ambiguity everywhere. Its definition, its value, and, as you might have noticed from the past two blog posts, the required skill-set for PLM consultants.

As I am fine-tuning my presentation for the upcoming PLMx 2018 Event in Hamburg, some things become clearer for me. This is one of the advantages of blogging, speaking at PLM conferences and discussing PLM with companies that are eager to choose the right track for PLM. You are forced to look in more depth to be consistent, and you need arguments to support your opinion about what is happening in the scope of PLM. From these learnings I often realize that the WHY of PLM remains a big challenge, for various reasons.

Current PLM

In the past twenty years, companies have implemented PLM systems where the primary focus was only on the P (Product) from Product Lifecycle Management. PLM systems have been implemented as engineering tools, as an evolution of PDM (Product Data Management).

PLM systems have never been designed from the start as enterprise systems. Their core capabilities are related to engineering processes, which is why most implementations start with engineering. Later, more data-driven PLM systems like Aras and Autodesk started from another angle, with data connectivity between different disciplines as a foundation, avoiding the difficulty of engineering first.

This week I saw the publication of the PLMPulse survey results by i42R / MarketKey where they claim:

The results from the first industry-led survey on the status of Product Lifecycle Management and future priorities

The PLMPulse report is based on five different surveys, as shown in the image above, covering the various aspects of PLM: usage, business value, organizational constraints, information value and future potential. More than 350 people from all around the world answered the various questions related to these surveys. Unfortunately, inputs from some Asian countries are missing. We are all curious about what happens in China, as companies there do not struggle with the same PLM legacy as in other countries. Are they embracing PLM in a different way?

The results, as the editors also confirm, are not shocking and confirm that PLM still has the challenge of getting out of the engineering domain. Still, I recommend downloading the survey as it has interesting details. After registration, you can download the report from here.

What’s next

During the upcoming PLMx 2018 Hamburg conference there will be a panel discussion of the survey results. I am afraid this debate will again turn into a discussion about the beauty and necessity of PLM, wondering why PLM is not considered crucial for the enterprise.

There are a few challenges I see for PLM, and hopefully they will be addressed. Most discussions are about WHAT PLM should/could do, not WHY. If you want to get to the WHY of PLM, you need to be able to connect the value of PLM to business outcomes that resonate at C-level. Too often, PLM implementations are considered costly, while their ROI and business value remain vague.

As the PLMPulse report also states, the ROI for PLM is most of the time based on efficiency and cost benefits related to the current way of working. These benefits usually do not offer significant ROI numbers. The major benefits come from working in a different way and working closer to your customer. That business value is hard to measure.

How do you measure the value of multidisciplinary collaboration or being more customer-centric? What is the value of being better connected to your customer and being able to react faster? These situations are hard to prove at the board level, as here people like to see numbers, not business transformations.

Focus on the WHY and HOW

A lot of the PLM messages that you can read through various marketing or social channels relate to futuristic concepts and high-level dreams that will come true in the next 10-20 years. Most companies, however, have a planning horizon of 2 years, at most 5 years. Peter Bilello from CIMdata presented one of their survey results at the PDT conference in 2014, shown below:

Technology and vision are way ahead of reality. Even in the areas where the leaders focus, the distance between technology/vision and reality gets bigger. The PLM focus should be more down-to-earth: not on what we are able to do, but on what would be the next logical step for our company to progress towards the future.

System of Record and System of Engagement

At the PLMx conference I will share my experiences related to PLM transformations with the audience. A year and a half ago we started talking about the bi-modal approach, and now I see more and more companies adopting bi-modal concepts related to PLM. Still, most organizations struggle with the assumption that their PLM should be tied to one PLM system or one PLM vendor, where I believe we should conclude that there are two PLM modes at this moment. And this does not imply there need to be only one or two systems – it will become a federated infrastructure.

The current mode could be an existing PLM backbone focused on capturing engineering data: the classical PLM system serving as a system of record. The second, newly growing PLM-related infrastructure will be a digital, most likely federated, platform where modern customer-centric PLM processes will run. As the digital platform provides real-time interaction, it can be considered a system of engagement, complementary to the system of record.

It will be the system of engagement that should excite the board members as here new ways of working can be introduced and mastered. As there are no precise blueprints for this approach, this is the domain where innovative thinking needs to take place.

That’s why I hope that neutral PLM conferences will focus less on WHAT can be done. Discussions about MBSE, Digital Thread, Digital Twin, and Virtual/Augmented Reality are all beautiful to watch. However, let’s focus first on WHY and HOW. For me, besides the PLMx Hamburg conference, other upcoming events like PDT 2018 (this time in both the US and Europe) are interesting; currently the PDT call for papers is open, and hopefully we will find speakers who can teach and inspire.

CIMdata together with Eurostep are organizing these events in May (US) and October (Europe). The theme for the CIMdata roadmap conference will be “Charting the Course to PLM Value together – Expanding the value footprint of PLM and Tackling PLM’s Persistent Pain Points”, where PDT will focus on “Collaboration in the Engineering Supply Chain – the extended digital thread”. These themes need to be addressed before jumping into the future. Looking forward to meeting you there.

 

Conclusions

In the world of PLM, we are most of the time busy explaining WHAT we (can/will) do. Like a cult group, we sometimes do not understand why others do not see the value or beauty of our PLM concepts. PLM dialogues and conferences should therefore focus more on WHY and HOW. Don’t worry: the PLM vendors/implementers will always help you with WHAT they can do and WHY it is different.

 


If you have followed my blog over the past 10 years, I hope you realize that I am always trying to bring sense to the nonsense while still looking into the future, where new opportunities are imagined. Perhaps this is due to my Dutch background (our motto: try to be normal – do not stand out) and the influence of working with Israelis (a country where almost everyone is a startup).

Given this background, I enjoy the current discussion with Oleg Shilovitsky related to potential PLM disruptions. We worked together for many years at SmarTeam, a PDM/PLM disruptor at that time, in the previous century. Oleg has continued his passion for introducing potentially disruptive solutions (Inforbix / OpenBOM), whereas I became more and more intrigued by human behavior related to PLM. For that reason, I have the human brain in my logo.

Recently we started our “The death of ….” dialogue, with the following episodes:

Jan 14th – How to democratize PLM knowledge and disrupt traditional consulting experience

Jan 21st – The death of PLM Consultancy

Jan 22nd – Why PLM consultants are questioning new tools and asking about cloud exit strategy?

Here is episode 4 – PLM Consultants are still alive and have an exit strategy

Where we agree

We agreed that traditional consultancy practices related to PLM ranking and selection processes are outdated. The Forrester Wave publication was the cause of our discussion, for two reasons:

  1. All major PLM systems cover for 80 percent the same functionality. Therefore, there is no need to build, send and evaluate lengthy requirements lists for all potential candidates and then recommend the preferred vendor. A waste of time, as besides the requirements there is much more to evaluate than just performing a tool selection.
  2. Many major consultancy firms have PLM practices, most of the time related to the major PLM providers. Selecting one of the major vendors is usually not a problem for your reputation, hence the importance of these rankings. Consultancy firms will almost never recommend disruptive tool-sets.

PLM business transformation

At this point, we are communicating at a different wavelength. Oleg talks about PLM business transformation as follows:

Cloud is transforming PLM business. Large on-premise PLM projects require large capital budget. It is a very good foundation for existing PLM consulting business. SaaS subscription is a new business model and it can be disruptive for lucrative consulting deals. Usually, you can see a lot of resistance when somebody is disrupting your business models. We’ve seen it in many places and industries. It happened with advertising, telecom and transportation. The time is coming to change PLM, engineering and manufacturing software and business.

I consider new business models less relevant compared to the need for a PLM practice transformation. Tools like Dropbox, perhaps disruptive for PDM systems, implement previous-century methodology (document-driven / file-based models). We are moving from an item-centric towards a model-driven future.

The current level of PLM practices is related to an item-centric approach, the domain where OpenBOM is also bringing disruption.
The future, however, is about managing complex products, where products are actually systems: a combination of hardware and software. Hardware and software have completely different lifecycles, and all major PLM vendors are searching for an overall solution concept to incorporate both. If you cannot manage software in the context of hardware in the future, you are at risk. Each PLM vendor has a different focus area due to its technology history. I will address this topic during the upcoming PLMx conference in Hamburg. For a model-driven enterprise, I do not yet see a working combination of disruptors.

Cloud security and Cloud exit strategy

Oleg does not really see the impact of the cloud as related to the potential death of PLM consulting as you can read here:

I agree, cloud might be still not for everyone. But the adoption of cloud is growing and it is becoming a viable business model and technology for many companies. I wonder how “cloud” problem is related to the discussion about the death of PLM consulting. And…  here is my take on this. It is all about business model transformation.

I am not convinced that in PLM the cloud is the only viable business model, compared to a rigid on-premise PLM system. Part of the cloud-based implementation benefits comes from low upfront costs and scalable IT. However, the cloud also pushes companies towards a no-customization strategy: configuration of the user interface only. This is a “secret” benefit for cloud PLM vendors, as they can say “NO” to the end users, of course within given usability constraints. Saying “NO” to the customer is lesson one for every current PLM implementation, as everyone knows the problem of costly upgrades later.

Also, make a 5-10 year cost evaluation of your solution and take the risk of rising subscription fees into account. No vendor will drop its price unless forced by the outside world. The initial benefits will be paid back later because of this other business model.

Cloud exit strategy and standards

When you make a PLM assessment (and usually experienced PLM consultants do this), there is a need to consider an exit strategy. What happens if your current PLM cloud vendor(s) cease to exist or migrate to a new generation of technology and data modeling? Every time new technology was introduced, we thought it was going to be THE future. The future is unpredictable. However, I can predict that 10 years from now we will live with different PLM concepts.

There will be changes and migrations, and cloud PLM vendors will never promote standardized export methods (unless forced) to liberate the data in their systems. Export tools could be a niche market for PLM partners who understand data standards. Håkan Kårdén, no finder’s fee required; Eurostep has the experience in-house.

 

Free downloads – low barriers to start

A significant difference in opinion between Oleg and me is Oleg’s belief in bottom-up, DIY PLM as part of PLM democratization, versus my belief in top-down business transformation supported by PLM. When talking about Aras, Autodesk, and OpenBOM, Oleg states:

All these tools have one thing in common. You can get the tool or cloud services for free and try it by yourself before buying. You can do it with Aras Innovator, which can be downloaded for free using enterprise open source. You can subscribe for Autodesk Fusion Lifecycle and OpenBOM for trial and free subscriptions. It is different from traditional on-premise PLM tools provided by big PLM players. These tools require months and sometimes even years of planning and implementation including business consulting and services.

My experience with SmarTeam might influence this discussion. SmarTeam was also a disruptive PDM solution, thanks to its easy data modeling and Microsoft-based customization capabilities, like Aras. Customers and implementers could build what they wanted; you only needed to know Visual Basic. As I supported the field, mitigating installed SmarTeam implementations, the problem often was that SmarTeam had been implemented as a system replicating/automating current practices.

Here Henry Ford’s statement as shown below applies:

Implementations became troublesome when SmarTeam provided new and similar business logic. Customers needed to decide whether to use OOTB features and de-customize, or to forgo the benefits of new standard capabilities. SmarTeam had an excellent business model for service providers and IT hobbyists/professionals in companies. Upgradeable SmarTeam implementations were those that remained close to the core, but meanwhile we were 5-8 years further down the line.

I believe we still need consultants to tell and coach companies towards new ways of working related to the current digitization. Twenty-year-old concepts won’t work anymore. Consultants need a digital mindset and must think holistically. Fitting technology and tools will be there in the future.

Conclusion

The discussion is not over, and as I have already passed 1000 words, I will stop. Too many words for a modern pitch, not enough for a balanced debate. Oleg and I will continue in Hamburg, and we both hope others will chime in, providing balanced insights into this discussion.

To be continued …..?

 

In my earlier post, PLM 2018 my focus, your input, I invited you to send PLM-related questions that would spark a dialogue. By coincidence, Oleg Shilovitsky wrote a post with the catchy title: Why traditional PLM ranking is dead. PLM ranking 2.0. Read this post and the comments if you want to follow this dialogue.

In this post, Oleg reacts to the discussion that had started around the Forrester Wave ranking of PLM vendors, which on its own is a challenging topic. I know from my experience that these rankings depend very much on a mix of functions and features, but they are also profoundly influenced by the slideware and marketing power of the PLM vendors. Oleg also quotes Joe Barkai’s post, Ranking PLM Vendors, to illustrate that this kind of ranking does not bring much value, as there is so much commonality between these systems.

I agree with Oleg and Joe. PLM rankings do not help companies select a PLM solution. They are more an internal PLM show, useful for the organizing consultancy companies to conduct, but in the end it is a discussion about who has the biggest and most effective button. Companies need to sell themselves and differentiate.

Do we need consultancy?

We started a dialogue in the comments on Oleg’s blog post, where I mentioned that PLM is not about selecting a solution from a vendor; there are many other facets related to a PLM implementation. First of all, the industry your company is active in: no solution fits all industries.

But before selecting a solution, you first need to understand what the company wants to achieve in the future. What is the business strategy, and how can PLM support this business strategy?

In most cases, a strategy is future-oriented and not about consolidating the current status quo. Therefore I believe a PLM implementation is always done in the context of a business transformation, which is most of the time not only related to PLM – it is about People, Processes and then the tools.

Oleg suggests that this complexity is created by the consulting business, as he writes:

Complex business and product strategies are good for consulting business you do. High level of complexity with high risk of failure for expensive PLM projects is a perfect business environment to sell consulting. First create complexity and then hire consulting people to explain how to organize processes and build business and product strategy. Win-win

Enterprise and engineering IT are hiring consulting to cover their decision process. That was a great point made by Joe Barkai- companies are buying roadmaps and long-term commitments, but rarely technologies. Technologies can be developed, and if even something is missed, you can always acquire independent vendors or technology later – it was done many times by many large ISVs in the past.

Here I agree with part of the comments. If you hire consultancy firms just for the decision process, it does not make sense. The decision process needs to be owned by the company. Do not let a consultancy company prescribe your (PLM) strategy, as there might be mixed interests. However, when it comes to technologies, they are derived from the people and process needs.

So when I write in the comment:

We will not change the current status quo and ranking processes very soon. Technology is an enabler, but you need a top-down push to work differently (at least for those organizations that read vendor rankings).

Oleg states:

However, the favorite part of your comments is this – “We will not change the current status quo and ranking processes very soon.” Who are “we”???? Management consulting people?

With “we” I do not mean the consulting people. In general, the management of companies is more conservative than consultants are. It is our human brain that is change-averse and pushes people to stay in a kind of mainstream mode. In that context, the McKinsey article How biases, politics, and egos derail business decisions is a fascinating read about company dynamics. CIMdata has also published a slide in the past illustrating the gap between the vision, real capabilities, and what companies are really aiming at.

There is such a big gap between where companies are and what is possible. Software vendors describe the ideal world but do not have a migration path. One of the uncomfortable discussions when evaluating a cloud solution is not necessarily security (topic #1) but: what is your exit strategy? Have you ever thought about what happens to your data in a cloud solution when the vendor raises prices or no longer has a viable business model? These are discussions that need to take place too.

Oleg also quotes CIMdata cloud PLM research on how companies are looking for solutions as they are “empowered” by the digital world. Oleg states:

In a digital world, companies are checking websites, technologies, watching YouTube and tried products available online. Recent cloud PLM research published by CIMdata tells that when companies are thinking about cloud PLM, the first check they do is independent software providers recommendations and websites (not business process consultants).

I wonder about the value of this graph. The first choice is independent software recommendations/websites. Have you ever seen independent software recommendations?

Yes, when it comes to consumer tools: “I like software A because it gives me the freedom to do what I want” or “Software B has so many features for such a low price – a great price/value ratio.”

These are the kind of reviews you find on the internet for consumers. Don’t try to find answers on a vendor website as there you will get no details, only the marketing messages.

I understand that software vendors, including Oleg’s company OpenBOM, need to differentiate by explaining that the others are too complex. It is the same message you hear from all the relative PLM newcomers: Aras, Autodesk, and others.

All these newcomers provide marketing stories and claim successes because of their tools, whereas in reality the tool is secondary to the success. First, the company needs a vision and a culture that matches this tool. Look at the old Gartner picture (the hockey-stick projection) of what happens when all is aligned. The impact of the tool is minimal.

Conclusion

Despite the democratization of information, PLM transformations will still need consultants or a well-educated workforce inside your company. Consultants have the advantage of collected experience, which is often not the case when you work inside a company. We should all agree that in the end it is about the business first (human beings are complex) and then the tools (here you can shop on the internet for what matches the vision).

Although this post seems like a ping-pong match of arguments, I challenge you to take part in this discussion. Tell us where you agree or disagree, combined with your argumentation, as we should realize the argumentation is the most valuable part.
Your thoughts?

Happy New Year to all of you. A new year traditionally comes with good intentions for the upcoming year. I would like to share my PLM intentions for this year with you and look forward to your opinion. I shared some of my 2017 thoughts in my earlier post: Time for a Break. This year I will focus on the future of PLM in a digital enterprise, current PLM practices, and how to be ready for the future.

Related to these activities, I will zoom in on people-related topics, like organizational change, business impact and PLM justification in an enterprise. When it comes up during the year, or based on your demands, I will zoom in on architectural stuff and best practices.

The future of PLM

Accenture – Digital PLM

At this moment, digital transformation is at the top of the hype curve, and the impact of course varies per industry. For sure, C-level managers will be convinced they have the right vision and that the company is on the path to success.

Statements like: “We will be the first digital industrial enterprise” or “We are now a software company” impress the outside world and often investors in the beginning.

 

Combined with investments in customer-related software platforms, a new digital world facing the outside world is created relatively fast. And small pilots are celebrated as significant successes.

What we do not see is that to show and reap the benefits of digital transformation companies need to do more than create a modern, outside facing infrastructure. We need to be able to connect and improve the internal data flow in an efficient way to stay competitive. Buzzwords like digital thread and digital twin are relevant here.

To my understanding, we are still in the early phases of discovering the ideal architecture and practices for a digital enterprise. PLM vendors and technology companies show us the impressive potential as if the future already exists now. For a reality check, read Marc Halpern (Gartner) in this article on engineering.com: Digital Twins: Beware of Naive Faith in Simplicity.

I will focus this year on future PLM combined with reality, hopefully with your support for real cases.

Current PLM practices

Although my curiosity is focused on future PLM, there is still a journey to go for companies that have just started with PLM.  Before even thinking of a digital enterprise, there is first a need to understand and implement PLM as an infrastructure outside the engineering department.

Many existing PLM implementations are actually more (complex) document management systems supporting engineering data, instead of using all the available capabilities of a modern PLM system. Topics like Systems Engineering, multidisciplinary collaboration, Model-Based Enterprise, EBOM-MBOM handling, and non-intelligent numbering are all relevant for current and future PLM.

Not exploring and understanding them in your current business will make the gap towards the future even bigger. Therefore, keep on sending your questions and when time allows I will elaborate. For example, see last year’s PLM dialogue – you find these posts here: PLM dialogue and PLM dialogue (continued). Of course I will share my observations in this domain too when I bump into them.

 

To be ready for the future

The most prominent challenge for most companies, however, is how to transform their existing business towards a modern digital business, where new processes and business opportunities need to be implemented inside an existing enterprise. These new processes and business opportunities are not just simple extensions of the current activities; they need new ways of working, like delivering incremental results through agile and multidisciplinary teams, combined with never-before-existing interactivity with the market and the customer.

How do you convince management that these changes are needed and will not happen without their firm support? It is easier to do nothing and push for small incremental changes. But will this be fast enough? Probably not, as you can read from research done by strategic consultancy firms. There is a lot of valuable information available if you invest time in research. But spending time is a challenge for management.

I hope to focus on these challenges too, as all my clients are facing them. Will I be able to help them? I will share successes and pitfalls with you, combined with supporting information that might be relevant for others.

Your input?

A blog is a modern way of communicating with anyone connected in the world. What I would like to achieve this year is to be more interactive. Share your questions – there are no stupid questions as we are all learning. By sharing and learning we should be able to make achievable steps and become PLM winners.

Best wishes to us all and be a winner not a tweeter …..

 

 

 

For those who have followed my blog over the years, it must be clear that I am advocating for a digital enterprise, explaining the benefits of a data-driven approach where possible. In the past month, an old topic came to my attention with new insights: intelligent Part Numbers, Yes or No? Or do we mean Product Numbers?

 

 

What’s the difference between a Part and a Product?

In a PLM data model, you need to have support for both Parts and Products, and there is a significant difference between these two types of business objects. A Product is an object facing the outside world, which can be company- (B2B) or customer- (B2C) related. Examples of B2C products are the Apple iPhone 8, the famous IKEA Billy, my Garmin 810 and my Dell OptiPlex 3050 MFXX8. Examples of B2B products are the ABB synchronous motor AMZ 2500 and the FESTO standard cylinder DSBG. Products have a name, and if there are variants of the product, they also have an additional identifier.

A Part represents a physical object that can be purchased or manufactured. A combination of Parts appears in a BOM. If these Parts are not yet resolved for manufacturing, this BOM might be the Engineering BOM or a generic Manufacturing BOM. If the Parts are resolved for a specific manufacturing plant, we talk about the MBOM.
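The Part/Product distinction and the BOM variants can be sketched in a few lines of Python. This is only my own illustration of the data model described above; all class and field names (and the example parts) are invented, not taken from any PLM system:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass(frozen=True)
class Part:
    """Internal business object: something purchased or manufactured."""
    uid: str          # unique identifier, ideally meaningless
    description: str

@dataclass
class BOM:
    """Parts with quantities. Not resolved for manufacturing:
    an EBOM or generic MBOM; resolved for a plant: an MBOM."""
    plant: str = ""   # empty -> not resolved for a specific plant
    lines: List[Tuple[Part, int]] = field(default_factory=list)

    def add(self, part: Part, qty: int) -> None:
        self.lines.append((part, qty))

@dataclass
class Product:
    """Outside-facing object a customer refers to, with a variant id."""
    name: str
    variant_id: str
    mboms: List[BOM] = field(default_factory=list)  # may differ per plant

bolt = Part(uid="8f2c1e", description="M6 bolt, steel")
panel = Part(uid="a91d07", description="Side panel, birch veneer")

ebom = BOM()                 # engineering view, plant-independent
ebom.add(bolt, 12)
ebom.add(panel, 2)

billy = Product(name="Billy", variant_id="002.638.50", mboms=[ebom])
```

Note how the Product carries the outward-facing name and variant identifier, while Parts only carry internal unique IDs, exactly the separation argued for in this post.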

I have discussed the relation between Parts and Products in an earlier post, Products, BOMs and Parts, which was a follow-up on my LinkedIn post, The importance of a PLM data model. Although both posts were written more than two years ago, the content is still valid. In the upcoming year, I will address this topic of products further, including software and services moving to solutions/experiences.

Intelligent number for Parts?

As Parts are company-internal business objects, I would state that if a company is serious about becoming a digital enterprise, Parts should have meaningless unique identifiers. Unique identifiers are the link between discipline- or application-specific data sets. See for example the image below, where I imagined attribute sets for a Part, based on engineering and manufacturing data sets.

Apart from the unique ID, there might be a common set of attributes that will be exposed in every connected system. For example, a description, a classification and one or more status attributes might be needed.

Note 1: A revision number is not needed when you create a new unique ID for every new version of the Part. This practice is already common in the electronics industry. In the old mechanical domain, we are used to having revisions, in particular for make parts, based on Form-Fit-Function rules.

Note 2: The description might be generated automatically, based on a concatenation of some key attributes.
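Both notes can be illustrated with a small Python sketch. This is my own example, not a prescription; the attribute names are invented for illustration:

```python
import uuid

def new_part_id() -> str:
    """A meaningless, globally unique identifier: no revision, no semantics."""
    return uuid.uuid4().hex

def make_description(shape: str, material: str, size_mm: int) -> str:
    """Note 2: generate the description by concatenating key attributes."""
    return f"{shape}, {material}, {size_mm} mm"

# Note 1: a new version is simply a new unique ID; no shared base number
# or revision suffix links the two versions together.
version_1 = new_part_id()
version_2 = new_part_id()
assert version_1 != version_2

print(make_description("Hex bolt", "steel", 30))  # Hex bolt, steel, 30 mm
```

Any link between versions then lives in explicit relationships in the data model, not in the format of the number itself.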

Of course, if you are aiming for a fully digital enterprise, and I think you should, do not waste time fixing the past. In some situations, I learned that an external consultant had recommended the company rename their old meaningful part numbers to the new non-intelligent part-numbering scheme. There are two mistakes here. Renumbering is too costly, as all referenced information would have to be updated. And secondly, as long as the old part numbers are unique IDs for the enterprise, there is no need to change them. The connectivity of information should not depend on how the unique ID is formatted.

Read more if you want here: The impact of Non-Intelligent Part Numbers

Intelligent numbers for Products?

If the world were 100% digital and connected, we could work with non-intelligent product numbers. However, this is a stage beyond my current imagination. For Products we will still need a number that customers can refer to when they communicate with their supplier, vendor or service provider. For many high-tech products, the product name and type might be enough. When I talk about the Samsung S5 G900F 16G, the vendor knows which configuration I am referring to. Still, it is important to realize that behind these specifications different MBOMs might exist, due to different manufacturing locations or times.

However, when I refer to the IKEA Billy, there are too many options to describe the right one consistently in words; therefore, you will find a part number on the website, e.g., 002.638.50. This unique ID connects directly to a single sellable configuration. Behind this unique ID, different MBOMs might also exist, for the same reason as for the Samsung telephone. The number is a connection to the sales configuration and should not be too complicated, as people need to be able to read and recognize it when they go to a warehouse.
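As a sketch of this idea (the configuration description and plant names below are hypothetical; only the 002.638.50 number comes from the text), the readable product number is just a key to one sellable configuration, behind which several MBOMs may exist:

```python
# The short, human-readable number resolves to exactly one sell-able
# configuration; the MBOMs behind it may vary per plant or over time.
catalog = {
    "002.638.50": {
        "configuration": "Billy bookcase, one specific variant",
        "mboms": ["MBOM-plant-A", "MBOM-plant-B"],  # hypothetical plants
    },
}

def resolve(product_number: str) -> dict:
    """Look up the sell-able configuration behind a product number."""
    return catalog[product_number]

cfg = resolve("002.638.50")
assert len(cfg["mboms"]) == 2  # one sales item, several manufacturing BOMs
```

The customer only ever sees the short number; which MBOM is used is an internal decision the number deliberately does not encode.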

Conclusion

There is a big difference between Product and Part numbers because of the intended scope of these business objects. Parts will soon exist in connected, digital enterprises and therefore no longer need a meaningful number. Products need to be identified by consumers anywhere around the world, who are not yet able or willing to have a digital connection with their vendors. Therefore, short and understandable numbers will remain necessary to support exact communication between consumer and vendor.

When I started working with SmarTeam Corp. in 1999, the company had several product managers, each responsible for the whole lifecycle of a component or technology. The Product Manager was the person who defined the features for the new release and provided the justification for these new features internally inside R&D. In addition, the Product Manager had the external role of visiting customers, understanding their needs for future releases, and building and explaining a coherent vision to the outside and internal world. The product manager had a central role, connecting all stakeholders.

In the ideal situation, the Product Manager was THE person who could speak in R&D language about the implementation of features, could talk with marketing and documentation teams to explain the value and expected behavior, and could talk with the customer to describe the vision, meanwhile verifying the product's vision and roadmap based on their inputs.

All these expected skills make the role of a product manager challenging. If the person is too "techy", he/she will enjoy working with R&D but have a hard time understanding customer demands. From the other side, if the Product Manager is excellent at picking up customer and market feedback, he/she might not be heard by R&D and not get the expected priorities. For me, it has always been clear that in the software world a "bi-directional" Product Manager is crucial to success.

Where are the Product Managers in the Manufacturing Industry?

Approximately four years ago, new concepts related to digitalization for PLM became more evident. How could digital continuity connect the various disciplines around the product lifecycle and thereby provide end-to-end visibility and traceability? When speaking of end-to-end visibility, most of the time companies talked about the way they designed and delivered products; visibility of what is happening usually stopped after manufacturing. The diagram to the left, showing a typical Build-To-Order organization, illustrates the classical way of thinking. There is an R&D team working on innovation, typically a few engineers, while most of the engineers work in Sales Engineering and Manufacturing Preparation to define and deliver a customer-specific order. In theory, once the order is delivered, none of the engineers will be further involved, and it is up to the Service Department to react to what is happening in the field.

A classical process in the PLM domain is the New Product Introduction process for companies that deliver products in large volumes to the market, most of the time configurable to address various customer or pricing segments. This process is usually linear and is described either in one stream or in two parallel streams. In the latter case, the R&D department develops new concepts and prepares the full product for the market, while the operational department starts in parallel, initially involved in strategic sourcing, and later scales up manufacturing, disconnected from R&D.

I described these two processes because they both illustrate how disconnected the source (R&D / Sales) is from the final result in the field, which in both cases is managed by the service department. A typical story I learned from many manufacturing companies is that, in the end, it is hard to get a full picture of what is happening across the whole lifecycle. How external feedback (market & customers) can influence any stage is undefined. I used the diagram below before companies were even talking about a customer-driven digital transformation. Just understanding end-to-end what is happening with a product along the lifecycle is already a challenge for a company.

Putting the customer at the center

Modern business is about having customer or market involvement in the whole lifecycle of the product. And as products become more and more a combination of hardware and software, it is the software that allows the manufacturer to provide incremental innovation to their products. However, to innovate in a manner that matches or even exceeds customer demands, information from the outside world needs to travel as fast as possible through an organization. If this is done in isolated systems and documents, the journey will be cumbersome and too slow to allow a company to act fast enough. Here digitization comes in, making information directly available as data elements instead of documents with their own file formats and authoring systems. The ultimate dream is a digital enterprise where data "flows", advocated by some manufacturing companies for several years already.

In the previous paragraph I talked about the need to have an infrastructure in place for people in an organization to follow the product along the complete lifecycle, to be able to analyze and improve the customer experience. However, you also need to create a role in the organization for a person to be responsible for combining insights from the market and to lead various disciplines in the organization, R&D, Sales, Services. And this is precisely the role of a Product Manager.

This role is very common in the world of software development but not yet recognized in manufacturing companies. If a product manager role already exists in your organization, he/she can tell you how complicated it currently is to get an overall view of the product, and which benefits a digital infrastructure would bring to the job. Once the product manager is well-supported and recognized in the organization, the right skill set to prioritize or discover actions/features will make the products more attractive for consumers. Here the company will benefit.

Conclusion

If your company does not have the role of a product manager in place, your business is probably not yet engaged well enough in the customer journey. There will be broken links and costly processes preventing a fast response to the market. Consider the role of a Product Manager, as it has emerged in the software business.

NOTE 1: Just before publishing this post, I read an interesting post from Jan Bosch: Structure Eats Strategy. It fits well in this context.

NOTE 2: The existence of a Product Manager might be a digital maturity indicator for a company, just as the handling of the MBOM (PDM/PLM/ERP) gives insight into the classical PLM maturity of a company.

Related to the MBOM, please read: The Importance of a PLM data model – EBOM and MBOM


This post is a rewrite of an article I wrote on LinkedIn two years ago, modified to my current understanding. If you are following my blog, in particular the posts related to the business change needed to transform a company towards a data-driven digital enterprise, you know that one of the characteristics of digital is the real-time availability of information. This has an impact on everyone working in such an organization. My conversations are in the context of PLM (Product Lifecycle Management); however, I assume my observations are valid for other domains too.

Real-time visibility is going to be the big differentiator for future businesses, and in particular, in the PLM domain, this requires a change from document-centric processes towards data-driven processes.

Documents have a lot of disadvantages. They lock information in a particular format, and document handling results in sequential processes, where one person / one discipline at a time is modifying or adding content. I described the potential change in my blog post: From a linear world to fast and circular?

From a linear world to fast and circular

In that post, I described that a more agile and iterative approach to bringing products and new enhancements to the market should have an impact on current organizations. A linear organization, where products are pushed to the market from concept to delivery, is based on working in silos and will be too slow to compete against future, modern digital enterprises. This is because departmental structures with their own hierarchy block the fast movement of information, and often these silos filter or deform the information. It becomes hard to have a single version of the truth, as every department and its management will push for their own measured truth.

A business model matching the digital enterprise is a matrix model, where multidisciplinary teams work together to achieve their mission. This approach is well known in the software industry, where parallel and iterative work is crucial to continuously deliver incremental benefits.

Image:  21stcenturypublicservant.wordpress.com/

In a few of my projects, I discovered this correlation with software methodology that I wanted to share. One of my clients was in the middle of moving from a document-centric approach toward a digital information backbone, connecting the RFQ phase and conceptual BOM through design, manufacturing definition, and production. The target was to have end-to-end data continuity as much as possible, meanwhile connecting the quality and project tasks combined with issues to this backbone.

The result was that each individual had a direct view of their current activities, which could be a significant number for people engaged in multiple projects. Just being able to measure these numbers already led to more insight into an individual's workload. When we discussed the conceptual dashboard for an individual with the implementation team, it led to questions like: "Can the PLM system escalate tasks and issues to the relevant manager when needed?" and "Can this escalation be done automatically?"

And here we started the discussion. "Why do you want to escalate to a manager?" Escalation will only give more disruption and stress for the persons involved. Isn't the person qualified enough to decide what is important?

One of the conclusions of the discussion was that currently, due to a lack of visibility of what needs to be done, when, and with which urgency, people accept that things get overlooked. So the burning issues get most of the attention, and the manager's role is to make things burn to get them done.

When discussing further, it was clear that thanks to the visibility of data, the truly critical issues will appear at the top of an individual's dashboard. The relevant person can immediately see what can be achieved and, if something cannot, take action. Of course, there is the temptation to work on the easy tasks only and to ignore the tough ones (human behavior); however, the dashboard reveals everything that needs to be done: visibility. Therefore, if a person learns to manage their priorities, there is no need for a manager to push anymore, saving time and stress.
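A minimal sketch of this dashboard idea. The scoring rule, task names and criticality scale are assumptions for illustration: tasks are sorted so the truly critical, nearly-due ones surface at the top, without a manager having to escalate anything.

```python
from datetime import date

tasks = [
    {"title": "Review ECO-1042", "due": date(2017, 11, 20), "criticality": 2},
    {"title": "Fix issue on project X", "due": date(2017, 11, 10), "criticality": 5},
    {"title": "Update drawing", "due": date(2017, 12, 1), "criticality": 1},
]

def priority(task, today=date(2017, 11, 12)):
    # Hypothetical rule: criticality weighs heaviest; among equally
    # critical tasks, the one closest to (or past) its due date wins.
    days_left = (task["due"] - today).days
    return (-task["criticality"], days_left)

dashboard = sorted(tasks, key=priority)
# The critical, overdue issue appears first; no escalation needed.
```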

The ultimate conclusion of our discussion was: implementing a modern PLM environment first of all brings almost 100 % visibility, the single version of the truth. This new capability breaks down silos; a department can no longer hide activities behind its departmental wall. Digital PLM allows horizontal, multidisciplinary collaboration without the need to go through the management hierarchy.

It would mean Power to the People, provided they are stimulated to do so. And this was the message to the management: "You have to change too: empower your people."

What do you think – will this happen? This was my question in 2015.  Now two years later I can say some companies have seen the potential of the future and are changing their culture to empower their employees working in multidisciplinary teams. Other companies, most of the time with a long history in business, are keeping their organizational structure with levels of middle management and maintain a culture that consolidates the past.

Conclusion

A digital enterprise empowers individuals, allowing companies to become more proactive and agile instead of working within optimized silos. In silos, it appears that middle management does not trust individuals to prioritize their work. The culture of a company and its ability to change are crucial for the empowerment of individuals. Over the last two years, there has been progress in understanding the value of empowered multidisciplinary teams.

Is your company already empowering people? Let us know!

Note: After speaking with Simon, one of my readers who always gives feedback from reality, we agreed that multidisciplinary teams are very helpful for organizations. However, you will still need a layer of strategic people securing standard and future ways of working, as the project teams might be too busy doing their job. We agreed this is the role of modern middle management.
DO YOU AGREE?

Last week I posted my first review of the PDT Europe conference. You can read the details here: The weekend after PDT Europe (part 1).  There were some questions related to the abbreviation PDT. Understanding the history of PDT, you will discover it stands for Product Data Technology. Yes, there are many TLA’s in this world.

Microsoft’s view on the digital twin

Now back to the conference. Day 2 started with a remote session from Simon Floyd. Simon is Microsoft's Managing Director for Manufacturing Industry Architecture Enterprise Services and a frequent speaker at PDT. Simon shared with us Microsoft's viewpoint of a Digital Twin, the strategy to implement a Digital Twin, the maturity status of several of their reference customers, and the areas these companies are focusing on. From these customers, it was clear most companies focused on retrieving data related to maintenance, providing analytics and historical data. Futuristic scenarios, like using the digital twin for augmented reality or design validation, were not yet the focus. As I discussed in the earlier post, this relates to my observations, where creating a digital thread between products in operation is considered a quick win. Establishing an end-to-end relationship between products in operation and their design requires many more steps. Read my post: Why PLM is the forgotten domain in digital transformation.

When discussing the digital twin architecture, Simon made a particular point about the standards required to connect the results from products in the field. Connecting a digital twin in a vendor-specific framework will create legacy, vendor lock-in, and a less open environment for using digital twins. A point that I also raised in my presentation later that day.

Simon concluded with a great example of potential future Artificial Intelligence, where an asset, based on its measurements, predicts a failure before the scheduled maintenance stop and therefore requests to run at a lower performance so it can reach the maintenance stop without disruption.

Closing the lifecycle loop

Sustainability and the circular economy have been themes at PDT for some years now. In his keynote speech, Torbjörn Holm from Eurostep took us through the global megatrends (Hay Group 2030) and the technology trends (Gartner 2018) and mapped out that technology would be a good enabler for addressing several of the global trends.

Next Torbjörn took us through the reasons and possibilities (methodologies and tools) for product lifecycle circularity developed through the ResCoM project in which Eurostep participated.

The ResCoM project (Resource Conservative Manufacturing) was a project co-funded by the European Commission and recently concluded. More info at www.rescom.eu

Torbjörn concluded by discussing the necessary framework for the Digital Twin and Digital Thread(s), which should be based on a Model-Based Definition, where ISO 10303 is the best candidate.

Later in the afternoon, there were three sessions in a separate track related to design optimization for value, circularity and re-use, followed by a panel discussion. Unfortunately, I participated in another track, so I still have to digest the provided materials. Speakers in that track were Ola Isaksson (Chalmers University), Ingrid de Pauw & Bram van der Grinten (IDEAL&CO) and Michael Lieder (KTH Sweden).

Connecting many stakeholders

Rebecca Ihrfors, CIO of the Swedish Defense Material Administration (FMV), shared her plans for transforming the IT landscape to harmonize the currently existing environments and to become a broker between industry and the armed forces (FM). As many of the assets now come with their own data sets and PDM/PLM environments, the overhead of keeping up all these proprietary environments is too expensive and fragmented. FMV wants to harmonize the data they retrieve from industry and the way they offer it to the armed forces in a secure way. There is a need for standards and interoperability.

The positive point of this presentation was that several companies in the audience delivering products to the Swedish Defense could start to share and adapt their viewpoints on how they could contribute.

Later in the afternoon, there were three sessions in a separate track related to standards for MBE interoperability and openness that would fit very well in this context. Brian King (Koneksys), Adrian Murton (Airbus UK) and Magnus Färneland (Eurostep) provided various inputs, and as I did not attend these parallel sessions, I will dive deeper into their presentations at a later time.

PLM something has to change – bimodal and more

In my presentation, which you can download from SlideShare here: PLM – something has to change, my main point was that companies apparently seem to understand that something needs to happen to really benefit from a digital enterprise. The rigidness of large enterprises and their inhibitors to transform are more related to human issues and incompatibility with the future.

How to deal with this incompatibility was also the theme for Martin Eigner’s presentation (System Lifecycle Management as a bimodal IT approach) and Marc Halpern’s closing presentation (Navigating the Journey to Next Generation PLM).

Martin Eigner’s consistent story was about creating an extra layer on top of the existing (Mode 1) systems and infrastructure, which he illustrated by a concept developed based on Aras.

By providing a new digital layer on top of the existing enterprise, companies can start evolving to a modern environment, where, in the long-term, old Mode 1 systems will be replaced by new digital platforms (Mode 2). Oleg Shilovitsky wrote an excellent summary of this approach. Read it here: Aras PLM  platform “overlay” strategy explained.

Marc Halpern closed the conference describing his view on how companies could navigate to the Next Generation PLM by explaining in more detail what the Gartner bimodal approach implies. Marc’s story was woven around four principles.

Principle 1 The bimodal strategy as the image shows.

Principle 2 was about Mode 1 thinking in an evolutionary model. Every company has to go through maturity states in its organization, starting from ad hoc, via departmental and enterprise-based, to harmonized and fully digitally integrated. These maturity steps also have to be taken into account when planning future steps.

Principle 3 was about organizational change management, a topic often neglected or underestimated by product vendors or service providers as it relates to a company culture, not easy to change and navigate in a particular direction.

Finally, Principle 4 was about Mode 2 activities. Here an organization should pilot (in a separate environment), certify (make sure it is a realistic future), adopt (integrate it in your business) and scale (enable this new approach to exist and grow for the future).

Conclusions

This post concludes my overview of PDT Europe 2017. Looking back, there was a quite aligned view of where we are all heading with PLM and related topics. There is the hype and there is reality, and I believe this conference was about reality, giving all the attendees good feedback on what is really happening and understood in the field. And of course, there is the human factor, which is hard to influence.

Share your experiences and best practices related to moving to the next generation of PLM (digital PLM?)!


PDT Europe is over, and this year it was a surprisingly aligned conference, showing that ideas and concepts for modern PLM align more and more. Håkan Kårdén opened the conference mentioning the event was fully booked: about 160 attendees from over 19 countries. With a typical attendance of approximately 120 participants, this showed the theme of the conference, Continuous Transformation of PLM to support the Lifecycle Model-Based Enterprise, was very attractive and real. You can find a history of tweets by following the hashtag #pdte17.

Setting the scene

Peter Bilello from CIMdata kicked off by bringing some structure to the various Model-Based areas and the Digital Thread. Peter started by mentioning that technology is the least important issue, as organizational culture, changing processes and adapting people's skills are more critical factors for a successful adoption of modern PLM. Something that would repeatedly be confirmed by other speakers during the conference.

Peter presented a nice slide bringing the Model-Based terminology together on one page. Next, Peter took us through various digital threads in the different stages of the product lifecycle. Peter concluded with the message that we are still in a learning process redefining optimal processes for PLM, using Model-Based approaches and Digital Threads and thanks (or due) to digitalization these changes will be rapid. Ending with an overall conclusion that we should keep in mind:


It isn’t about what we call digitalization; It is about delivering value to customers and all other stakeholders of the enterprise

Next, Marc Halpern busted the myth of Digital Twins (according to his session title) and looked into planning realistically for them. I am not sure if Marc smashed any myths, although it is sure the Digital Twin is at the top of the hype cycle and we are all starting to look for practical implementations. A digital twin can have many appearances, depending on its usage. For sure, it is not just a 3D virtual model.

There are still many areas to consider when implementing a digital twin for your products. Depending on what you connect between the virtual and the physical model, and how, you have to consider where your vendor really is in maturity and avoid lock-in on their approach. In particular, in these early stages, you are not sure which technology will last longer, and data ownership and confidentiality will play an important role. And as opposed to going for quick wins, make sure your digital twin is open and uses open standards as much as possible to stay open for the future, which also means keep aiming to work with multiple vendors.

Industry sessions

Next, we had industry-focused sessions related to a lifecycle Model-Based enterprise and later in the afternoon a session from Outotec with the title: Managing Installed Base to Unlock Service opportunities.

The first presentation, from Väino Tarandi, professor in IT in Construction at KTH Sweden, presented his findings related to BIM and GIS in the context of the lifecycle, a test bed where PLCS meets IFC. Interesting, as I have been involved in BIM Level 3 discussions in the UK, which were already an operational challenge for stakeholders in the construction industry, now extended with the concept of the lifecycle. So far these projects are at the academic level, and I am still waiting for companies to push and discover the full benefits of an integrated approach.

Concepts for the industrial approach could be learned from Outotec, as you might understand later in this post. Of course, the difference is that Outotec is aiming for data ownership along the lifecycle, whereas in the construction industry, each silo is often handled by a different contractor.

Fredrik Ekström from the Swedish Transport Administration shared his challenges of managing assets for both road and railway transport – see the image on the left. I have worked around this domain in the Netherlands, where asset management for road infrastructure and asset management for rail infrastructure are handled by two different organizations. I believe Fredrik (and similar organizations) could learn from the concepts in other industries. Again, Outotec's example is also about having relevant information to increase service capabilities, whereas the Swedish Transport Administration is aiming to have the right data for their services. When you look at the challenges reported by Fredrik, I assume he can find the answers in other industry concepts.

Outotec's presentation on managing the installed base to unlock service opportunities, given by Sami Grönstrand and Helena Guiterrez, was, besides entertaining, easy to digest and well-paced. Without being academic, they explained the challenges of a company with existing systems in place moving towards concepts of a digital twin and the related data management and quality issues. Their practical example illustrated that if you have a clear target, in this case understanding a customer-specific environment better in order to sell better services, it can be achieved by rational thinking and doing, a typical Finnish approach. This all included the "bi-modal approach" and people change management.

Future Automotive

Ivar Hammarstadt, Senior Analyst Technology Intelligence for Volvo Cars Corporation, entertained us with a projection toward the future based on 160 years of automotive industry. Interesting, as electric did not seem to be the only way to go for a sustainable future, depending on operational performance demands.


Next, Jeanette Nilsson and Daniel Adin from Volvo Group Trucks shared their findings from an evaluation project of more than one year, in which they evaluated the major PLM vendors (Dassault Systemes / PTC / Siemens) on their out-of-the-box capabilities related to 3D product documentation and manufacturing.

They concluded that none of the vendors was able to support the full Volvo Truck complexity in an out-of-the-box manner. It was also a good awareness project for the Volvo Trucks organization to understand that a common system for 3D geometry reduces the need for data transfers and manual data validation. Cross-functional iterations can start earlier, and more iterations can be performed. This will support a shortening of lead time and improve product quality. Personally, I believe this was a rather expensive approach to create awareness for such a conclusion, pushing PLM vendors into a competitive pre-sales position for so much detail.

Future Aerospace

Kenny Swope from Boeing talked us through Boeing's potential journey towards a Model-Based Enterprise. Boeing has always challenged themselves and their partners to deliver environments close to what is possible. Look at the Boeing journey and you can see that already in 2005 they were aiming for an approach that most current manufacturing enterprises cannot meet. And now they are planning their future state.

To approach the future state, Boeing aims to align their business with a single architecture for all aspects of the company. Starting by collecting capabilities (over 400 in 6 levels) and defining value streams (strategic/operational), the next step is mapping the capabilities to the value streams. Part of the process is to look at the components of a value stream and see if they could be fulfilled by a service. In this way, you design your business for a service-oriented architecture, still independent from any system constraints. As Kenny stated, the aerospace and defense industry has a long history and is therefore slow to change, as its culture is rooted in the organization. It will be interesting to hear from Kenny next year how much progress towards a model-based enterprise has been achieved and which values have been confirmed.

Gearing up for day 2

Martin Eigner took us in high-speed mode through his vision and experience of working in a bimodal approach with Aras, supporting legacy environments plus a modern federated layer to handle the complexity of a digital enterprise where the system architecture is leading. I will share more details on these concepts in my next post, as during day 2 of PDT Europe both Marc Halpern and I talked about this topic, and I will combine it into a more extended story.

The last formal presentation of day one was from Nigel Shaw of Eurostep Ltd, who took us through the journey of challenges for a model-based enterprise. As there will not be a single model that defines all, it is clear that various models and derived models will exist for a product/system. Interesting was Nigel's slide showing the multiple models different disciplines can have of an airplane (1948). Similar to the famous "swing" cartoon, it illustrates that every single view can be entirely different from the purpose of the product.

The next challenge is whether these models are consistent and still describe the same initially specified system. On top of that, even the use of various modeling techniques and tools will lead to differences in the system. And the final challenge on top is managing the change over the system's lifecycle. From here, Nigel stepped into the need for digital threads to govern the relations between the various views per discipline and lifecycle stage, not only for the physical and the virtual twin. Comparing the needs of a model-based enterprise through its lifecycle, Nigel concluded that using PLCS as a framework provides an excellent fit to manage such complexity.

Finally, after a panel discussion, which was more a collection of opinions as the target was not necessarily to align in such a short time, it was time for the PDT dinner, always an excellent way to share thoughts and verify them with your peers.

Conclusion

Day 1 was over before you knew it, without any moment of boredom, and I hope the same holds for this post. Next week I will close my review of the PDT conference with some more details about my favorite topics.


As I am preparing my presentation for the upcoming PDT Europe 2017 conference in Gothenburg, I have been reading about relevant experiences with a data-driven approach. During the PDT Europe conference, we will share and discuss the continuous transformation of PLM to support the Lifecycle Model-Based Enterprise.

One of the direct benefits is that a model-based enterprise allows information to be shared without documents having to be converted to a particular format, therefore saving costs and resources and bringing unprecedented speed of information availability, like what we are used to having in a modern digital society.

For me, a modern digital enterprise relies on data coming from different platforms/systems and the data needs to be managed in such a manner that it can serve as a foundation for any type of app based on federated data.

This statement implies some constraints. It means that data coming from various platforms or systems must be accessible through APIs / Microservices or interfaces in an almost real-time manner. See my post Microservices, APIs, Platforms and PLM Services. Also, the data needs to be reliable and understandable for machine interpretation. Understandable data can lead to insights and predictive analysis. Reliable and understandable data allows algorithms to execute on the data.
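As a simple illustration of this federation idea (the part IDs, fields and stubbed responses are invented for this sketch), an app could combine data from the PLM and ERP platforms on the fly, keyed by the part's unique ID, instead of copying documents between systems:

```python
# Imagine each platform exposes its master data through an API or
# microservice. Here the responses are stubbed as plain dicts.
plm_response = {"PART-001": {"description": "Bolt M8x40", "revision": "B"}}
erp_response = {"PART-001": {"stock": 1200, "supplier": "ACME"}}

def federated_view(part_id: str) -> dict:
    """Combine data from several sources into one app-ready record."""
    record = {"id": part_id}
    for source in (plm_response, erp_response):
        record.update(source.get(part_id, {}))
    return record

view = federated_view("PART-001")
# One record for the app, assembled in near real-time from two systems,
# with no document conversion in between.
```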

Classical ECO/ECR processes can become highly automated when the data is reliable and the company's strategy is captured in rules. In a data-driven environment, there will be much more granular data requiring some kind of approval status. We cannot do this manually anymore, as it would kill the company: too expensive and too slow. Therefore the need for algorithms.
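A hedged sketch of what "strategy captured in rules" could look like; the thresholds, field names and route names below are my own assumptions, not any standard: an algorithm decides which changes can be approved automatically and which still need a human.

```python
def route_change(change: dict) -> str:
    """Decide the approval route for a change, based on company rules."""
    # Hypothetical rules: safety-related or costly changes go to the
    # change board, small documentation-only changes are auto-approved.
    if change.get("safety_related"):
        return "change-board"
    if change.get("cost_impact", 0) > 10_000:
        return "change-board"
    if change.get("scope") == "documentation":
        return "auto-approve"
    return "engineering-review"

route_change({"scope": "documentation", "cost_impact": 0})
# A documentation-only, zero-cost change never waits for a meeting.
```

The point is not the specific thresholds but that the routing decision operates on granular, reliable data, so thousands of small approvals no longer pass through human hands.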

What is understandable data?

I have tried to avoid academic language as long as possible, but now we have to be more precise as we enter the domain of master data management. I was triggered by this recent post from Gartner: Gartner Reveals the 2017 Hype Cycle for Data Management. There are many topics in the hype cycle, and it was interesting to see that Master Data Management is starting to be taken seriously after going through the phases of inflated expectations and disillusionment.

This was interesting, as two years ago we had a one-day workshop preceding PDT Europe 2015, focusing on Master Data Management in the context of PLM. The attendees at that workshop, coming from various companies, agreed that there was no real MDM for the engineering/manufacturing side of the business. MDM had more or less been hijacked by SAP and other ERP-driven organizations.

Looking back, it is clear to me why MDM was not a real topic in the PLM space at that time. We were, and still are, focusing too much on information stored in files and documents. The only areas touched by MDM were the BOM and Part definitions, as these objects also touch the ERP and After Sales domains.

Actually, there are various MDM concepts, and I found an excellent presentation from Christopher Bradley explaining the different architectures on SlideShare: How to identify the correct Master Data subject areas & tooling for your MDM initiative. In particular, I liked the slide below, as it comes close to my experience in the process industry.

Here we see two MDM architectures: the one on the left driven from ERP, while the one on the right could be based on the ISO 15926 standard, as the process industry has worked for over 25 years to define a global exchange standard and data dictionary. The process industry was able to reach such a maturity level due to the need to support assets across a lifecycle of many years and its relatively stable environment. Other sectors are less standardized or so dependent on new concepts that it would be hard to have an industry-specific master.

PLM as an Application Specific Master?

If you were to start an MDM initiative in your company today and look for providers of MDM solutions, you would discover that their value propositions are based on technology capabilities: bringing data together from different enterprise systems in the way the customer thinks it should be organized. More a toolkit approach than an industry approach. And in the cases where there is an industry approach, it is rarely related to manufacturing companies. Remember my observation from 2015: manufacturing companies do not have MDM activities related to engineering/manufacturing because it is too complicated, too diverse, too many documents instead of data.

Now, with modern digital PLM, there is a need for MDM to support the full digital enterprise. Therefore, when I combined the previous observations with a recent post on Engineering.com from Tom Gill, PLM Initiatives Take On Master Data Transformation, I came to a new hypothesis:

For companies with a model-based approach that have no MDM in place, the implementation of their Product Innovation Platform (modern PLM) should be based on the data definitions specific to their industry.

Tom Gill explains in his post the business benefits and value of using PLM as the source for an MDM approach. In particular, in modern PLM environments, the PLM data model is no longer based only on the BOM. PLM now encompasses the full lifecycle of a product instead of, as initially, mainly an engineering view. Modern PLM systems, or as CIMdata calls them, Product Innovation Platforms, manage a complex data model based on a model-driven approach. The entities in this data model are used across the whole lifecycle and could therefore be the best start for an industry-specific MDM approach. Now only the industries have to follow….
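To illustrate what "entities used across the whole lifecycle" might mean, the sketch below models a few such entities as simple dataclasses. The entity and field names are my own illustrative assumptions; a real industry-specific master would follow the industry's own data dictionary (for instance ISO 15926 in the process industry).

```python
# Hedged sketch: lifecycle-spanning entities a Product Innovation Platform
# might manage. Names and fields are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class Part:
    part_id: str
    revision: str
    description: str

@dataclass
class Requirement:
    req_id: str
    text: str
    satisfied_by: list = field(default_factory=list)  # list of part_ids

@dataclass
class PhysicalAsset:
    serial_number: str
    as_built_part: str  # part_id of the part actually installed

# The same master definition (a part id) is referenced from design
# (Requirement) through to operations (PhysicalAsset): one entity
# serving the whole lifecycle, which is what makes it master data.
req = Requirement("R-1", "Max weight 2 kg", satisfied_by=["P-100"])
asset = PhysicalAsset("SN-0001", as_built_part="P-100")
```

The design point is that master data is not a separate copy: design, manufacturing, and service records all reference the same controlled definitions.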

Once data is able to flow, there will be another discussion: who is responsible for which attributes? Bjørn Fidjeland from plmPartner recently wrote: Who owns what data when …?  The content of his post is relevant; I would only change the title to "Who is responsible for what data when", as I believe that in a modern digital enterprise there is no ownership anymore; it is about sharing and responsibilities.

 

Conclusion

Where MDM in the past did not really focus on engineering data due to the classical document-driven approach, in modern PLM implementations the Master Data Model might be based on industry-specific data elements, managed and controlled from the PLM data model.

 

Do you follow my thoughts / agree?
