You are currently browsing the tag archive for the ‘Product lifecycle management’ tag.

I was happy to present and participate at the 3DEXPERIENCE User Conference held this year in Paris on 14-15 March. The conference was an evolution of the previous ENOVIA User conferences; this time, it was a joint event by the ENOVIA and NETVIBES brands.

The conference was, for me, like a reunion. Having worked for over 25 years in the SmarTeam, ENOVIA and 3DEXPERIENCE ecosystem, I was now meeting people I have worked with and had not seen for over fifteen years.

My presentation: Sustainability Demands Virtualization – and it should happen fast was based on explaining the transformation from a coordinated (document-driven) to a connected (data-driven) enterprise.

There were 100+ attendees at the conference, mainly from Europe. Most of the presentations came from customers, while the breakout sessions gave the attendees a chance to dive deeper into the Dassault Systèmes portfolio.

Here are some of my impressions.

 

The power of ENOVIA and NETVIBES

I had a traditional view of the 3DEXPERIENCE platform based on my knowledge of ENOVIA, CATIA and SIMULIA, as many of my engagements were in the domain of MBSE or a model-based approach.

However, at this conference, I discovered the data intelligence side that Dassault Systèmes is bringing with its NETVIBES brand.

I would classify the ENOVIA part of the 3DEXPERIENCE platform as a traditional System of Record infrastructure (see Time to Split PLM?).

I discovered that by adding NETVIBES on top of the 3DEXPERIENCE platform and other data sources, the potential scope had changed significantly. See the image below:

As we can see, the ontology and knowledge graph layer makes it possible to make sense of all the indexed data below, including the data from the 3DEXPERIENCE platform, and provides a modern data-driven layer for its consumers and apps.
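As a minimal illustration of what such a layer enables (a toy sketch in Python; all part numbers, predicates and severities are invented, and this is of course not how NETVIBES is implemented), a knowledge graph stores facts from different indexed sources as subject-predicate-object triples and can then answer questions that span several of those sources:

```python
# Minimal knowledge-graph sketch: facts from different indexed sources
# are merged as (subject, predicate, object) triples and queried together.
# All identifiers and predicates are invented for illustration.

triples = [
    # from the PLM platform index
    ("part:4711", "hasCADModel", "cad:wheel_v3"),
    ("part:4711", "usedIn", "product:ev-platform"),
    # from a quality database index
    ("part:4711", "hasIssue", "issue:Q-102"),
    ("issue:Q-102", "severity", "high"),
]

def objects(subject, predicate):
    """Return all objects linked to a subject by a predicate."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Cross-source question: which products are affected by high-severity issues?
affected = set()
for s, p, issue in triples:
    if p == "hasIssue" and "high" in objects(issue, "severity"):
        affected.update(objects(s, "usedIn"))

print(affected)  # {'product:ev-platform'}
```

The point of the sketch is that neither the PLM index nor the quality index alone can answer the question; only the connected layer on top of both can.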

The applications on top of this layer, standard or developed, can be considered Systems of Engagement.

My curiosity now: will Dassault Systèmes keep supporting the “old” system of record approach – often based on BOM structures (see also my post: The Rise and Fall of the BOM) combined with the new data-driven environment? In that case, you would have both approaches within one platform.

 

The Virtual Twin versus the Digital Twin

It is interesting to note that Dassault Systèmes consistently differentiates between the definition of the Virtual Twin and the Digital Twin.

According to the 3DS.com website:

Digital Twins are simply a digital form of an object, a virtual version.

Unlike a digital twin prototype that focuses on one specific object, Virtual Twin Experiences let you visualize, model and simulate the entire environment of a sophisticated experience. As a result, they facilitate sustainable business innovation across the whole product lifecycle.

Understandably, Dassault Systèmes makes this differentiation. With the implementation of the Unified Product Structure, they can connect CAD geometry as datasets to other non-CAD datasets, like EBOM and MBOM data.

The Unified Product Structure was not the topic of this event but is worth noting.

 

REE Automotive

The presentation by Steve Atherton from REE Automotive was interesting because here we saw an example of an automotive startup that decided to go all-in on the cloud.

REE Automotive is an Israeli technology company that designs, develops, and produces electric vehicle platforms. Their mission is to provide a modular and scalable electric vehicle platform that can be used by a wide range of industries, including delivery and logistics, passenger cars, and autonomous vehicles.

Steve Atherton is the PLM 3DEXPERIENCE lead for REE at the Engineering Centre in Coventry in the UK, where most of their designers are based. REE also has an R&D center in Tel Aviv, with offshore support from India and satellite offices in the US.

REE decided from the start to implement its PLM backbone in the cloud, a logical choice for such a globally spread company.

The cloud was also one of the conference's central themes, and it was interesting to see a startup like REE pushing for an end-to-end solution based on the cloud. So often, you see startups choosing traditional systems because their senior members bring their (legacy) PLM knowledge from previous companies.

The current challenge for REE is implementing the manufacturing processes (EBOM-MBOM) and complying as much as possible with the out-of-the-box best practices to make their cloud implementation future-proof.

 

Groupe Renault

Olivier Mougin, Head of PLM at Groupe Renault, talked about their Renaulution Virtual Twin (RVT) program. Renault has always been a strategic partner of Dassault Systèmes.

 

I remember them as one of the first references for the ENOVIA V6 backbone.

The Renaulution Virtual Twin ambition: from engineering to enterprise platform, is enormous, as you can see below:

Each of the three pillars has transformational aspects beyond traditional ways of working. For each pillar, Olivier explained the business drivers, expected benefits, and why a new approach is needed. I will not go into the details in this post.

However, you can see the transformation from an engineering backbone to an enterprise collaboration platform – The Renaulution!

Ahmed Lguaouzi, head of marketing at NETVIBES, reinforced the extended power of data intelligence on top of an engineering landscape as the target architecture.

Renault’s ambition is enormous – the ultimate dream of digital transformation for a company with a great legacy. The mission will challenge Renault and Dassault Systèmes to implement this vision, which can become a lighthouse for others.

 

3DS PLM Journey at MIELE

An exciting session close to my heart was the digital transformation story from Miele, explained by André Lietz, head of IT Products PLM at Miele. As a long-time Miele dishwasher owner, I was curious to learn about their future.

Miele has been a family-owned business since 1899, making high-end domestic and commercial equipment. They are a typical example of the power of German mid-market companies. Moreover, being family-owned gives them stability and the opportunity to develop a multi-year transformation roadmap without being distracted by investor demands every few years.

André, with his team, is responsible for developing the value chain inside the product development process (PDP), the operation of nearly 90 IT applications, and the strategic transformation of the overarching PLM Mission 2027+.

As the slide below illustrates, the team is working on four typical transformation drivers:

  • Providing customers with connected, advanced products (increasing R&D complexity)
  • Providing employees with a modern, digital environment (the war for digital talent)
  • Providing sustainable solutions (addressing the whole product lifecycle)
  • Improving internal end-to-end collaboration and information visibility (PLM digital transformation)

André talked about their DELMIA pilot plant/project and its benefits in connecting the EBOM and MBOM in the 3DEXPERIENCE platform. From my experience, this is a challenging topic, particularly in German companies, where SAP has dominated the BOM for over twenty years.

I am curious to learn more about their progress in the upcoming years. The vision is there, and the transformation is significant, but they have the time to succeed! This can become another digital transformation example.

 

 

And more …

Besides some educational sessions by Dassault Systèmes (Laurent Bertaud – NETVIBES data science), there were also other interesting customer testimonials from Fernando Petre (IAR80 – Fly Again project), Christian Barlach (ISC Sustainable Construction) and Thelma Bonello (Methode Electronics – end-to-end BOM infrastructure). All sessions helped to give a better understanding of what is possible and what is being done in the PLM domain.

 

Conclusion

I learned a lot during these days, particularly about the virtual twin strategy and the related data intelligence capabilities. As the event was also a reunion with many people from my network, I discovered that we all aim at digital transformation. We have a mission and a vision. The upcoming years will be crucial for implementing the mission and realizing the vision. It will be the early adopters like Renault pushing Dassault Systèmes to deliver. I hope to stay tuned. You too?

NOTE: Dassault Systèmes covered some of the expenses associated with my participation in this event but did not in any way influence the content of this post.

 

 

 

 

July and August are the quiet summer months here in Europe, when companies slow down to allow people to recharge.

However, the pace and hecticness are not the same everywhere, nor is the recharging time. I will be entering a six-week thinking break, assembling thoughts to explore after the summer break. Here are some topics – and you may note – they are all connected.

The MBOM discussion

Although my German is not as good as my English, I was intrigued by a post from Prof. Dr. Jörg W. Fischer.

He claims there is no meaning to the MBOM and that, therefore, the "expensive" PLM concept of the MBOM has to disappear – read the original post here.

Jörg gives three reasons why we should not speak about the MBOM – here are the Google-translated quotes, where I left out some details to leave room for the thoughts, not yet the answer:

  1. The MBOM as the structure for deriving the assembly drawings. No BOM! (here, I fully agree)
  2. The structure that comes out as a result when planning the assembly. Again, no BOM. (here, I tend to agree – however, we could extend this structure to an MBOM)
  3. The MBOM as the classic parts list in the ERP, the one with which the MRP run is performed. Is that an MBOM? Until recently, I thought so. But it isn’t. So again, no MBOM. (here, I tend to agree – however, we could extend this structure to an MBOM)
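To make the distinction behind these three points concrete, here is a toy sketch (all part numbers, operations and quantities are invented) showing that a design structure (EBOM) and an assembly-planning structure can consume exactly the same parts while being structurally different things – which is why calling the planning structure a "BOM" is debatable:

```python
# Sketch: the same set of parts, seen as a design structure (EBOM)
# and as an assembly-planning structure ordered by operations.
# All part numbers and operations are invented for illustration.

ebom = {                       # design view: grouped by function
    "BIKE-100": ["FRAME-10", "DRIVE-20"],
    "DRIVE-20": ["CHAIN-21", "CRANK-22"],
}

assembly_plan = [              # planning view: ordered operations, not a hierarchy
    ("OP10", "mount crank to frame", ["FRAME-10", "CRANK-22"]),
    ("OP20", "fit chain", ["CHAIN-21"]),
]

def leaf_parts(bom, root):
    """Collect all leaf components below a root item."""
    children = bom.get(root, [])
    if not children:
        return [root]
    parts = []
    for child in children:
        parts.extend(leaf_parts(bom, child))
    return parts

# Both views consume the same parts, but the structures differ:
ebom_parts = sorted(leaf_parts(ebom, "BIKE-100"))
plan_parts = sorted(p for _, _, used in assembly_plan for p in used)
print(ebom_parts == plan_parts)  # True: same parts, different structure
```

Whether you then call the planning structure an MBOM, or extend it into one, is exactly the methodology question raised above.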

The topic initiated an interesting exchange of viewpoints on LinkedIn here. I am quite aligned with Martin Eigner's comment. It is a pity that this type of discussion is hidden in a LinkedIn environment and in the German language. It would be great to discuss such a topic at a PLM conference. For example, the CIMdata PLM Roadmap conference has had several Multiview BOM discussions coming from the Aerospace and Defense action groups.

Perhaps comparing these two viewpoints – preferably in English – could lead to a better understanding for all of us. Currently, the communication language and system dependencies might blur the methodology discussion.

Cheryl Peck (CIMdata PLM Roadmap organizer) / Jörg W. Fischer, are you open to this suggestion? BOM discussions have always been popular.

PLM Roadmap & PDT 2022

The good news is that the upcoming PLM Roadmap & PDT 2022 event is scheduled as an in-person event on the 18th and 19th of October in Gothenburg, Sweden. Let's hope no new corona variant will destroy this plan. I am confident I will be there, as the Swedish COVID-19 approach has kept society open as much as possible.

Therefore, I am collecting my topics to discuss and preparing my luggage and presentation to be there.

The theme of the conference, "Digital Transformation and PLM – a call for PLM Professionals to redefine and re-position the benefits and value of PLM", is close to my experience.

New PLM paradigms are coming up, while at the same time, we are working on solidifying existing concepts, like the Multiview BOM. The PDT part of the conference always brought interesting sessions related to sustainability and, often, the circular economy.

I am curious to see the final agenda. Hakan Karden already gave us some insights into why it is good to be curious – read it here.

Sustainability

Talking and learning about sustainability at PDT Europe is not a luxury. Just recently, we experienced an unforeseen heatwave in western Europe, reminding us that climate change is not slowing down. On the contrary, rapid climate change caused by human influence is becoming more and more visible.

Unfortunately, the people who suffer from droughts, bushfires, and famine are not the ones who can be held responsible for these effects. It is a global crisis, and the strongest shoulders must carry the weight of addressing these issues.

In that context, we had an internal meeting with the PLM Global Green Alliance core team members to plan our activities for the rest of the year.

Besides interviews with PLM vendors and technology solution providers, we want to create opportunities for PGGA members to discuss PLM technology, methodology or change topics of interest, moderated by one of our core team members.

One of our observations is that awareness of the need for a more sustainable society exists. In polls all around the world, the majority of people mention their concerns.

However, where to start? What matters, and how can individuals influence companies? We also need to learn what is real and what is greenwashing. Therefore, we want to schedule open discussions with PGGA members (are you already a member?) to share knowledge and thoughts about a topic. More about the agenda after the summer break.

Discussions & Podcasts

I remain open for discussions, and those who have contacted me with a direct message on LinkedIn will acknowledge there is always a follow-up.

Whenever I have time – most of the time, I target Fridays for ad-hoc discussions – I am happy to schedule a Zoom session to learn and discuss a particular topic without obligations. It will be a discussion, not a consultation.

During Covid-lockdowns, I learned to appreciate podcasts. While making the daily walk through the same environment, the entertainment came from listening to an interesting podcast.

I learned a lot about history, mysteries, and human behavior. Of course, I was also looking for PLM-related podcasts, and the major vendors have found their way to podcasts too. However, I think they are often too slick, only highlighting a vision instead of discussing what really happens in the field.

Several PLM-related podcasts have started, and I want to highlight three of them.

The Share PLM podcast, with 11 episodes, started promisingly in 2020. After the initial start, it became difficult to deliver new content continuously.

Currently, I am talking with the Share PLM team to see how we can build this continuity and extend the content. There are so many interesting persons in our network who have valuable opinions about PLM to share. More after the summer.

The Peer Check podcast from CoLab is not a typical PLM podcast; it focuses more on what engineering leaders should know. They started in 2022 and have already published ten episodes. I am in the process of listening to all of them, and I find them very refreshing.

This week, I was happy to join Adam Keating, founder of CoLab, in a discussion related to Systems of Record and Systems of Engagement. More news after the summer.

The Change Troubleshooter podcast from Nina Dar, with already 34 episodes, is not purely focused on PLM. Although Nina has a background in coaching PLM implementations, her episodes revolve around A Human Approach to Innovation and Change. You can imagine it is quite aligned with my area of interest.

In particular, Nina and I are having some side discussions about sustainability and (the lack of) human behavior to address climate change. You might hear more from Nina through our PGGA community.

More podcasts?

I am curious to learn if similar podcasts exist related to the topics I mentioned in this post. If so, provide a link in the comments. With enough feedback, I will publish a top-ten list at the end of this year.

 

Conclusion

In a society that seems to behave as if everything is black and white, to be solved by a tweet, we need people who can build a nuanced opinion. Conferences, discussions and podcasts can help you remain curious and keep learning. It must be extremely boring if you already know everything.

Have a great summertime.

 

In my previous posts dedicated to PLM education, I shared my PLM bookshelf, spoke with Peter Bilello from CIMdata about their education program and talked with Helena Gutierrez from SharePLM about their education mission.

In that last post, I promised this post would be dedicated to PLM education before s**t hits the fan. This statement came from my conversation with John Stark when we discussed where proper PLM education starts (before it hits the fan).

John is a well-known author of many books. You might have read my post about his book: Products2019: A project to map and blueprint the flow and management of products across the product lifecycle: Ideation; Definition; Realisation; Support of Use; Retirement and Recycling. A book with a very long title reflecting the complexity of a PLM environment.

John is also a long-time PLM consultant, known in the early PLM community for his 2PLM e-zine, an information newsletter he published between 1998 and 2017 – before blogging and social media – updating everyone in the PLM community with the latest news. You were probably subscribed to this e-zine if you are my age.

So, let’s learn something more from John Stark

John Stark

John, first of all, thanks for this conversation. We have known each other for a long time. First of all, can you briefly introduce yourself and explain where your passion for PLM comes from?

The starting point for my PLM journey was that I was involved in developing a CAD system. But by the 1990s, I had moved on to being a consultant. I worked with companies in different industry sectors, with very different products.

I worked on application and business process issues at different product lifecycle stages – Ideation; Definition; Realization; Support of Use; Retirement and Recycling.

However, there was no name for the field I was working in at that time. So, I decided to call it Product Lifecycle Management and came up with the following definition:
‘PLM is the business activity of managing, in the most effective way, a company’s products all the way across their lifecycles; from the very first idea for a product, all the way through until it is retired and disposed of’.

PLM is the management system for a company’s products. It doesn’t just manage one of its products. It manages all of its parts and products and the product portfolio in an integrated way.’

I put that definition at the beginning of a book, 'Product Lifecycle Management: Paradigm for 21st Century Product Realization', which was published in 2004 and has since become the most cited book about PLM. I included my view of the five phases of the product lifecycle

and created the PLM Grid to show how everything (products, applications, product data, processes, people, etc.) fits together in PLM.

From about 2012, I started giving a blended course, The Basics of PLM, with the PLM Institute in Geneva.

As for the passion, I see PLM as important for Mankind. The planet’s 7 billion inhabitants all rely on products of various types, and the great majority would benefit from faster, easier access to better products. So PLM is a win-win for us all.

That’s interesting. I also had a nice definition picture I used in my early days. x

PI London 2011

and I had my view of the (disconnected) lifecycle.

PI Apparel London 2014

The education journey

John, as you have been active in PLM education for more than twenty years, do you feel that PLM Education and Training have changed?

PLM has only existed for about twenty years. Initially, it was so new that there was just one approach to PLM education and training, but that’s changed a lot.

Now there are specific programs for each of the different types of people interested or involved with PLM. So, for example, now there are specific courses for students, PLM application vendor personnel, PLM Managers, PLM users, PLM system integrators, and so on. Each of these groups has a different need for knowledge and skills, so they need different courses.

Another big change has been in the technologies used to support PLM Education and Training. Twenty years ago, the course was usually a deck of PowerPoint slides and an overhead projector. The students were in the same room as the instructor.

These days, courses are often online and use various educational apps to help course participants learn.

Who should be educated?

I have read several of your books; they are very structured and academic. Therefore, they will probably never be read by people at the C-level of an organization. Who are you targeting with your books, and why?

Initially, I wasn’t targeting anybody. I was just making my knowledge available. But as time went by, I found that my books were mainly used in further education and ongoing education courses.

So now, I focus on a readership of students in such organizations. For example, I’ve adapted some books to have 15 chapters to fit within a 15-week course.

Students make up a good readership because they want to learn to pass their exams. In addition, and it’s a worldwide market, the books are used in courses in more than twenty countries. Also, these courses are sufficiently long, maybe 150 hours, for the students to learn in-depth about PLM. That’s not possible with the type of very short PLM training courses that many companies provide for their employees.

PLM education

Looking at publicly available PLM education, what do you think we can do better to get PLM out of the framing of an engineering solution and make it a point of discussion at the C-level?

Even today, PLM is discussed at C-level in some companies. But in general, the answer is to provide more education about PLM. Unfortunately, that will take time, as PLM remains very low profile for most people.

For example, I’m not aware of a university with a Chair of Product Lifecycle Management. But then, PLM is only 20 years old, that’s very young.

It often takes two generations for new approaches and technologies to become widely accepted in the industry.

So another possibility would be for leading vendors of PLM applications to make the courses they offer about PLM available to a wider audience.

A career with PLM?

Educating students is a must, and like you and me, many institutions have specialized PLM courses. However, I also noticed that a PLM expert at C-level in an organization is an exception; most of the time, people with a financial background get promoted. So, is PLM bad for your career?

No, people can have a good career in PLM, especially if they keep learning. There are many good master’s courses if they want to learn more outside the PLM area. I’ve seen people with a PLM background become a CIO or a CEO of a company with thousands of employees. And others who start their own companies, for example, PLM consulting or PLM training. And others become PLM Coaches.

PLM and Digital Transformation

A question I ask in every discussion. What is the impact of digital transformation on your area of expertise? In this case, how do you see PLM Education and Training looking in 2042, twenty years in the future?

I don’t see digital transformation really changing the concept of PLM over the next twenty years. In 2042, PLM will still be the business activity of managing a company’s products all the way across their lifecycles.

So, PLM isn’t going to disappear because of digital transformation.

On the other hand, the technologies and techniques of PLM Education and Training are likely to change – just as they have over the last twenty years. And I would expect to see some Chairs of Product Lifecycle Management in universities, with more students learning about PLM. And better PLM training courses available in companies.

I see digital transformation making it possible to have an entire connected lifecycle without a lot of overhead.

Digital Transformation – platforms working together

Want to learn more?

My default closing question is always about giving the readers pointers to more relevant information. Maybe overkill, looking at your oeuvre as a writer. Still, the question is: where can readers of this blog learn more?

Three suggestions:

What I learned

By talking with John and learning his opinion, I see the academic approach as defining PLM in a more scientific way, creating a space for the PLM professional.

In the past (2017), we had some blog/LinkedIn interaction related to the question: Should PLM become a Profession?

When I search on LinkedIn, I find 87,000 persons with the "PLM Consultant" tag. From those I know in my direct network, I am aware there is a great variety in the skills these PLM Consultants have. However, I believe it is too late to establish a PLM Professional role definition.

John’s focus is on providing students and others interested in PLM a broad fundamental knowledge to get into business. In their day-to-day jobs, these people will benefit from knowing the bigger context and understanding the complexity of PLM.

This is also illustrated in Products2019, where the focus is on the experience – company culture and politics.

Due to the diversity of PLM, we will never be able to define the PLM Professional job role as precisely as that of the Configuration Manager. Both disciplines are crucial and needed for a sustainable, profitable enterprise.

Conclusion

In this post, we explored a third dimension of PLM education, focusing on a foundational approach that targets, in particular, students to get educated on all the aspects of PLM. John is not the only publisher of educational books; I have several others in my network who have described PLM in their own wording and often in their own language. Unfortunately, there is no central point of reference, and I believe we are too late for that due to the tremendous variety in PLM.

Next week, I will talk with a Learning & Development leader from a company providing PLM consultancy – let's learn how they enable their employees to support their customers.

After a short summer break with almost no mention of the word PLM, it is time to continue this series of posts exploring the future of "connected" PLM. For those who also started with a cleaned-up memory, here is a short recap:

In part 1, I rushed through more than 60 years of product development, starting from vellum drawings and ending with the current PLM best practice for product development, the item-centric approach.

In part 2, I painted a high-level picture of the future, introducing the concept of digital platforms, which, if connected wisely, could support the digital enterprise in all its aspects. The five platforms I identified are the ERP and CRM platform (the oldest domains).

Next come the MES and PIP platforms (modern domains to support manufacturing and product innovation in more detail) and the IoT platform (needed to support connected products and customers).

In part 3, I explained what data-driven means and how it is closely connected to a model-based approach. Here we abandon documents (electronic files) as active information carriers. Documents will remain, however, as reports, baselines, or information containers. In that post, I ended up with seven topics related to data-driven, which I will discuss in upcoming posts.

Hopefully, by describing these topics – and for sure, there are more related topics – we will better understand the connected future and make decisions to enable the future instead of freezing the past.

 

Topic 1 for this post:

Data-driven does not imply that there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it will be about managing connected datasets in a federated way. It is no longer about owning the data; it is about access to reliable data.
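A minimal sketch of what "access instead of ownership" could look like (all platform names, references and part data are invented; real platforms would use authenticated APIs, not in-memory dictionaries): the consuming application holds only references and resolves the actual values from the owning source at read time, instead of copying the data into its own database.

```python
# Sketch: federated datasets - each platform owns its data, and consumers
# hold references that are resolved on demand instead of copying values.
# Platform names, references and data are invented for illustration.

plm_platform = {"part:4711": {"revision": "B", "mass_kg": 1.2}}
erp_platform = {"part:4711": {"cost_eur": 14.50, "supplier": "ACME"}}

sources = {"plm": plm_platform, "erp": erp_platform}

def resolve(reference):
    """Resolve a 'source/key' style reference to live data in the owning system."""
    source_name, _, key = reference.partition("/")
    return sources[source_name].get(key, {})

# The app works with connected references, not with copied data:
part_view = {**resolve("plm/part:4711"), **resolve("erp/part:4711")}
print(part_view["revision"], part_view["cost_eur"])  # B 14.5
```

Because the view is assembled at read time, a revision change in the owning platform is immediately visible to every consumer, which is the essence of reliable, connected data.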

 

Platform or a collection of systems?

One of the first (marketing) hurdles to take is understanding the difference between a data platform and a collection of systems that work together, sold as a platform.

In 2017, CIMdata published an excellent whitepaper positioning the PIP (Product Innovation Platform): Product Innovation Platforms: Definition, Their Role in the Enterprise, and Their Long-Term Viability. CIMdata's definition is extensive and covers the full scope of product innovation. Of course, you can find platforms that start from a more focused process.

For example, look at OpenBOM (focus on BOM collaboration), OnShape (focus on CAD collaboration) or even Microsoft 365 (historical, document-based collaboration).

The idea behind a platform is that it provides basic capabilities connected to all stakeholders, inside and outside your company. In addition, to avoid these capabilities being limited, a platform should be open and able to connect with other data sources that might be either locally or centrally available.

From these characteristics, it is clear that the underlying infrastructure of a platform must be based on a multitenant SaaS infrastructure, still allowing local data to be connected and shielded for performance or IP reasons.

The picture below describes the business benefits of a Product Innovation Platform, as imagined by Accenture in 2014.

Link to CIMdata’s 2014 commentary of Digital PLM HERE

Sometimes vendors sell their suite of systems as a platform. This is a marketing trick, because when you want to add functionality to your PLM infrastructure, you need to install a new system and create or use interfaces with the existing systems – not really a scalable environment.

In addition, sometimes, the collaboration between systems in such a marketing platform is managed through proprietary exchange (file) formats.

This is a practice we have seen in the construction industry before cloud connectivity became available. However, a so-called end-to-end solution that works on PowerPoint requires a lot of human intervention when implemented in real life.

 

Not a single environment

There has always been the debate:

“Do I use best-in-class tools, supporting the end-user of the software, or do I provide an end-to-end infrastructure with more generic tools on top of that, focusing on ease of collaboration?”

In the system approach, the focus was most of the time on the best-in-class tools, with PLM systems providing the data governance. A typical example is the item-centric approach. It reflects the current working culture: people working in their optimized silos, exchanging information between disciplines through (neutral) files.

The platform approach makes it possible to deliver an optimized user interface to the end-user through a dedicated app, assuming the data needed for such an app is accessible from the current platform or through other systems and platforms.

It might be tempting as a platform provider to add as many imaginable data elements as possible to the platform infrastructure. The challenge with this approach is whether all data should be stored in a central data environment (preferably cloud) or federated. And what about filtering IP?

In my post PLM and Supply Chain Collaboration, I described the concept of having an intermediate hub (ShareAspace) between enterprises to facilitate real-time data sharing, while carefully filtering which data is shared in the hub.
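Conceptually, such a hub receives only a filtered projection of each partner's data. Here is a toy sketch of that idea (attribute names and the allow-list are invented; this is not ShareAspace's actual mechanism): each company publishes its part records through an allow-list filter so internal IP never leaves the company.

```python
# Sketch: before publishing part data to a shared collaboration hub,
# filter it against an allow-list so internal IP never reaches the hub.
# Attribute names and the allow-list are invented for illustration.

SHAREABLE_ATTRIBUTES = {"part_number", "revision", "mass_kg", "material"}

def publish_to_hub(part_record):
    """Return only the attributes that are allowed to be shared."""
    return {k: v for k, v in part_record.items() if k in SHAREABLE_ATTRIBUTES}

internal_record = {
    "part_number": "4711",
    "revision": "B",
    "mass_kg": 1.2,
    "material": "AlMg3",
    "cost_price_eur": 9.80,   # internal IP - must not reach the hub
    "design_notes": "...",    # internal IP - must not reach the hub
}

shared = publish_to_hub(internal_record)
print(sorted(shared))  # ['mass_kg', 'material', 'part_number', 'revision']
```

The design choice is that the filter sits on the publishing side: the hub only ever sees the projection, so a misconfigured consumer cannot leak what was never shared.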

It may be clear that storing everything in one big platform is not the future. As I described in part 2, in the end, a company might implement a maximum of five connected platforms (CRM, ERP, PIP, IoT and MES). Each of the individual platforms could contain a core data model relevant to its part of the business. This does not imply there might be no other platforms in the future; platforms focusing on supply chain collaboration, like ShareAspace or OpenBOM, will have a value proposition too. In the end, the long-term future is all about realizing a digital thread of information within the organization.

Will we ever reach a perfectly connected enterprise or society? Probably not – not because of technology, but because of politics and human behavior. The connected enterprise might be the most efficient architecture, but will it be social, supporting all of humanity? Predicting the future is impossible, as Yuval Harari described in his book 21 Lessons for the 21st Century. Worth reading – still a collection of ideas.

 

Proprietary data model or standards?

So far, when you are a software vendor developing a system, there is no restriction on how you internally manage your data. In the PLM domain, this has meant that every vendor has its own proprietary data model and behavior.

I have learned from my 25+ years of experience with systems that the original design of a product, combined with the vendor's culture, defines the future roadmap. So even if a PLM vendor were to rewrite all its software to become data-driven, the ways of working and the assumptions would still be based on past experiences.

This makes it hard to arrive at unified data models and methodologies valid for the PLM domain. However, large enterprises like Airbus and Boeing and the major automotive suppliers have always pushed for standards, as they benefit the most from standardization.

The recent PDT conferences were an example of this, mainly the 2020 Fall conference. Several Aerospace & Defense PLM Action groups reported their progress.

You can read my impression of this event in The weekend after PLM Roadmap / PDT 2020 – part 1 and The next weekend after PLM Roadmap PDT 2020 – part 2.

It would be interesting to see a Product Innovation Platform built upon a data model aligned as much as possible with existing standards. It probably won't happen, as a software vendor does not make money from being open and complying with standards. Still, companies should push their software vendors to support standards, as this is the only way to get larger connected eco-systems.

I do not believe in the toolkit approach, where every company can build its own data model based on its current needs. I have seen this flexibility with SmarTeam in the early days. However, it became an upgrade risk when new, overlapping capabilities were introduced that did not match the past.

In addition, a flexible toolkit still requires a robust data model design done by experienced people who have learned from their mistakes.

The benefit of using standards is that they contain the learnings from many people involved.

 

Conclusion

I did not like writing this post so much, as my primary PLM focus lies on people and methodology. Still, understanding future technologies is an important point to consider. Therefore, this time a not-so-exciting post. There is enough to read on the internet related to PLM technology; see some of the recent articles below. Enjoy

 

Matthias Ahrens shared:  Integrated Product Lifecycle Management (Google translated from German)

Oleg Shilovitsky wrote numerous articles related to technology –
in this context:
3 Challenges of Unified Platforms and System Locking and
SaaS PLM Acceleration Trends

In March 2018, I started a series of blog posts related to model-based approaches. The first post was:  Model-Based – an introduction.  The reactions to this series of posts can be summarized in two bullets:

  • Readers believed that the term model-based focused on the 3D CAD model. A logical association, as PLM is often associated with 3D CAD-model data management (actually PDM), and in many companies, the 3D CAD model is not (yet) a major information carrier.
  • Readers were telling me that a model-based approach is too far from their day-to-day life. I have to agree here. I was active in some advanced projects where the product’s behavior depends on a combination of hardware and software. However, most companies still work in a document-driven, siloed discipline manner merging all deliverables in a BOM.

More than three years later, I feel that model-based approaches have become more and more visible to companies. One of the primary reasons is that companies have started to collaborate in the cloud and realize the differences between a coordinated and a connected manner of working.

Initiatives such as Industry 4.0 and concepts like the Digital Twin demand a model-based approach. This post is a follow-up to my recent post, The Future of PLM.

History has shown that it is difficult for companies to change engineering concepts. So let’s first look back at how concepts slowly changed.

The age of paper drawings

In the sixties of the previous century, the drawing board was the primary “tool” to specify a mechanical product. The drawing on its own was often a masterpiece drawn on special paper, with perspectives, details, cross-sections.

All these details were needed to transfer the part or assembly information to manufacturing. The drawing set should contain all information as there were no computers.

Making a prototype was not always easy, depending on the complexity of the product, the interpretation of the drawings and the manufacturability of the design. After a first release, further modifications to the product definition were often marked on the manufacturing drawings using a red pencil. Terms like blueprint and redlining come from the age of paper drawings.

There are still people talking nostalgically about these days as creating and interpreting drawings was an important skill. However, the inefficiencies with this approach were significant.

  • First, updating drawings because there was redlining in manufacturing was often not done – too much work.
  • Second, drawing reuse was almost impossible; you had to start from scratch.
  • Third, and most importantly, you needed to be very skilled in interpreting a drawing set, in particular when dealing with suppliers that might not have the same skillset or the knowledge of which drawing version was the current one.

However, paper was and still is the cheapest neutral format to distribute designs. The last time I saw companies still working with paper drawings was at the end of the previous century.

Curious to learn if they are now extinct?

The age of electronic drawings (CAD)

With the introduction of AutoCAD and personal computers around 1982, more companies started to look into drafting with the computer. There was already the IBM drafting system in 1965, but it was Autodesk that pushed the 2D drafting business with their slogan:

“80 percent of the functionality for 20 percent of the price (Autodesk 1982)”

A little later, I started to work for an Autodesk distributor/reseller. People would come to the showroom to see how a computer drawing could, in the end, be plotted in the finest quality. But, of course, the original draftsmen did not like the computer, as the screen was too small.

However, the enormous value came from making changes, the easy way of sharing drawings and the ease of reuse. The picture on the left is me in 1989, demonstrating AutoCAD with a custom-defined tablet and PS/2 computer.

The introduction of electronic drawings was not a disruption, more an optimization of the previous ways of working.

The exchange with suppliers and manufacturing could still be based on plotted drawings – the most neutral format. And thanks to the filename, there was better control of versions between all stakeholders.

Aren’t we all happy?

The introduction of mainstream 3D CAD

In 1995, 3D CAD became available for the mid-market, thanks to SolidWorks, Solid Edge and, a little later, Inventor. Before that, working with 3D CAD was only possible for companies that could afford expensive graphic stations, provided by IBM, Silicon Graphics, DEC and SUN. Where are they nowadays? The PC is an example of disruptive innovation, purely based on technology. See Clayton Christensen's famous book: The Innovator's Dilemma.

The introduction of 3D CAD on PCs in the mid-market did not lead directly to new ways of working. Designing a product in 3D was much more efficient if you mastered the skills. 3D brought a better understanding of the product dimensions and shape, reducing the number of interpretation errors.

Still, (electronic) drawings were the contractual deliverable when interacting with suppliers and manufacturing.  As students were more and more trained with the 3D CAD tools, the traditional art of the draftsman disappeared.

3D CAD introduced some new topics to solve.

  • First of all, a 3D CAD Assembly in the system was a collection of separate files, subassemblies, parts, and drawings that relate to each other with a specific version. So how to ensure the final assembly drawings were based on the correct part revisions? Companies were solving this by either using intelligent filenames (with revisions) or by using a PDM system where the database of the PDM system managed all the relations and their status.
  • The second point was that the 3D CAD assembly also introduced a new feature: the product structure, or the "Bill of Materials". This logical structure of the assembly closely resembled the Bill of Material of the product. You could even browse deeper levels, which was not possible in the traditional Bill of Material on a drawing.

Note: The concept of EBOM and MBOM was not known in most companies. People were talking about the BOM as a one-level definition of parts or subassemblies in the assembly. See my Where is the MBOM? post from July 2008, when this topic was still under discussion.

  • The third point, which would have a more significant impact later, is that parts and assemblies could be reused in other products. This introduced the complexity of configuration management. For example, a 3D CAD part or assembly file could contain several configurations, where only one configuration would be valid for the given product. Managing this in the 3D CAD system led to higher productivity for the designer; however, downstream, when it came to data management with PDM systems, it became a nightmare.
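As a rough sketch of the first point, here is how a PDM database manages the relations between an assembly and its part revisions, instead of encoding revisions in intelligent filenames. All part numbers, revisions and statuses below are hypothetical, and real PDM systems are far richer than this:

```python
# Hypothetical sketch: a PDM-style data model where the database, not the
# filename, knows which part revision an assembly references and whether
# that revision is released.

class PartRevision:
    def __init__(self, number, revision, status="In Work"):
        self.number = number      # e.g. "P-1001"
        self.revision = revision  # e.g. "A", "B"
        self.status = status      # "In Work" or "Released"

class Assembly:
    def __init__(self, number, revision):
        self.number = number
        self.revision = revision
        self.uses = []            # managed relations instead of file references

    def add(self, part_revision):
        self.uses.append(part_revision)

    def unreleased_parts(self):
        """The PDM question: is this assembly based on released revisions only?"""
        return [p for p in self.uses if p.status != "Released"]

bracket = PartRevision("P-1001", "B", "Released")
housing = PartRevision("P-1002", "A")             # still in work
asm = Assembly("A-2000", "A")
asm.add(bracket)
asm.add(housing)

print([p.number for p in asm.unreleased_parts()])  # → ['P-1002']
```

The point of the sketch is the `uses` relation: because the database manages it, the "which revision is actual?" question becomes a query rather than a filename convention.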

I experienced these issues a lot when discussing with companies and implementers, mainly around implementations of SmarTeam combined with SolidWorks and Inventor. Where to manage the configuration constraints? In the PDM system or inside the 3D CAD system?

These environments were not friends (image above), and even if they came from the same vendor, it felt like a discussion between tribes.

The third point also covered another topic. So far, CAD had been the first step for the detailed design of a product. However, companies now had an existing Bill of Material in the system thanks to the PDM systems. It could be a Bill of Material of a sub-assembly that is used in many other products.

Configuring a product no longer started from CAD; it started from a product or Bill of Material structure. Sales and engineering identified the changes needed in the BOM, keeping released information untouched as much as possible. This led to a new best practice.

The item-centric approach

Around 2005, five years after introducing the term Product Lifecycle Management, slowly, a new approach became the standard. Product Lifecycle Management was initially introduced to connect engineering and manufacturing, driven by the automotive and aerospace industry.

It was with PLM that concepts such as EBOM and MBOM became visible.

In particular, the EBOM was closely linked to engineering practices, i.e., modularity and reuse. The EBOM and its related information represented the product as it was specified. It is essential to realize that the parts in the EBOM could be generically specified purchase parts, to be resolved when producing the product, or Make parts specified by drawings.

At that time, the EBOM was often used as the foundation for the ERP system – see image above. The BOM was restructured and organized according to the manufacturing process specifying materials and resources needed in the ERP system. Therefore, although it was an item-like structure, this BOM (the MBOM) always had a close relation to the Bill of Process.

For companies with a single manufacturing site, the distinction between EBOM and MBOM was not that significant, as the ERP system would be the source of the MBOM. However, complexity arose when companies had several manufacturing sites. That was when a generic MBOM in the PLM system made more sense, centralizing all product information in a single system.
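To illustrate the restructuring described above, here is a minimal, hypothetical sketch of deriving site-specific MBOMs from one EBOM, where a generically specified purchase part is resolved per manufacturing site. The part numbers, site names and substitution rule are all invented for illustration:

```python
# Illustrative sketch: one EBOM (the product as specified), restructured into
# site-specific MBOMs (the product as manufactured at each site).

ebom = {
    "PUMP-100": ["HOUSING-10", "SEAL-GENERIC", "MOTOR-20"],
}

# Per-site resolution of generic purchase parts to local supplier parts.
site_substitutions = {
    "plant-eu": {"SEAL-GENERIC": "SEAL-EU-55"},
    "plant-us": {"SEAL-GENERIC": "SEAL-US-77"},
}

def derive_mbom(ebom, site):
    """Derive a site-specific MBOM by resolving generic parts for that site."""
    subs = site_substitutions[site]
    return {
        parent: [subs.get(child, child) for child in children]
        for parent, children in ebom.items()
    }

print(derive_mbom(ebom, "plant-eu"))
# → {'PUMP-100': ['HOUSING-10', 'SEAL-EU-55', 'MOTOR-20']}
```

In reality, an MBOM also regroups levels according to the Bill of Process and adds materials and resources; this sketch only shows why one EBOM can legitimately map to several MBOMs.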

The EBOM-MBOM approach has become more and more a standard practice since 2010. As a result, even small and medium-sized enterprises realized a need to manage the EBOM and the MBOM.

This EBOM-MBOM approach introduced two disadvantages.

  • First, the EBOM and the MBOM, as information structures, require a lot of administrative maintenance if the information always needs to be correct (and that is the CM target). Some try to simplify this by keeping the EBOM part the same as the MBOM part, meaning the EBOM specification already targets a single supplier or manufacturer.
  • Second, making every item in the BOM behave like a part creates inefficiencies in modern environments. Products are a mix of hardware (parts) and software (models/behavior). This BOM-centric view does not provide the proper infrastructure for a data-driven approach, as part specifications are still done in drawings. To specify a product that contains hardware and software, we need 3D annotated models related to all kinds of other behavior and physics models.

A new paradigm is needed to manage this mix efficiently. As the enabling foundation for Industry 4.0 and efficient Digital Twins, there is a need for a model-based approach based on connected data elements.

More next week.

Conclusion

The age of paper drawings: 1960 – now dead
The age of electronic drawings: 1982 – potentially dead in 2030
Mainstream 3D CAD: 1995 – evolving through MBD and MBSE to the future – not dead any time soon
The item-centric approach: 2005 – evolving to a connected model-based approach – not dead any time soon

Another episode of “The PLM Doctor is IN“. This time a question from Ilan Madjar, partner and co-founder of XLM Solutions. Ilan is my co-moderator at the PLM Global Green Alliance for sustainability topics.

All these activities resulted in the following question(s) related to the Digital Twin. Now sit back and enjoy.

PLM and the Digital Twin

Is it a new concept? How to implement and certify the result?

Relevant topics discussed in this video

Conclusion

I hope you enjoyed the answer and look forward to your questions and comments. Let me know if you want to be an actor in one of the episodes.


The main rule: A (single) open question that is puzzling you related to PLM.

One of my favorite conferences is the PLM Road Map & PDT conference. Probably because in the pre-COVID days, it was the best PLM conference to network with peers focusing on PLM practices, standards, and sustainability topics. Now the conference is virtual, and hopefully, after the pandemic, we will meet again in the conference space to elaborate on our experiences further.

Last year's fall conference was special because we had three days filled with a generic PLM update and several A&D (Aerospace & Defense) working group updates, reporting their progress and findings: sessions related to the Multiview BOM research, Global Collaboration, and several aspects of model-based practices: Model-Based Definition, Model-Based Engineering & Model-Based Systems Engineering.

All topics that I will elaborate on soon. You can refresh your memory through these two links:

This year, it was a two-day conference with approximately 200 attendees discussing how emerging technologies can disrupt the current PLM landscape and reshape the PLM Value Equation. During the first day of the conference, we focused on technology.

On the second day, we looked, in addition, at the impact new technology has on people and organizations.

Today’s Emerging Trends & Disrupters

Peter Bilello, CIMdata's President & CEO, kicked off the conference by sharing CIMdata's observations of the market: an increasing number of technology capabilities, like cloud, additive manufacturing, platforms, digital thread, and digital twin, all with the potential to realize a connected vision. Meanwhile, companies evolve at their own pace, illustrating that the gap between the leaders and the followers becomes bigger and bigger.

Where is your company? Can you afford to be a follower? Is your PLM ready for the future? Probably not, Peter states.

Next, Peter walked us through some technology trends and their applicability for a future PLM, like topological data analytics (TDA), the Graph Database, Low-Code/No-Code platforms, Additive Manufacturing, DevOps, and Agile ways of working during product development. All capabilities should be related to new ways of working and updated individual skills.

I fully agreed with Peter's final slide – we have to actively rethink and reshape PLM – not by calling it something different, but by learning, experimenting, and discussing in the field.

Digital Transformation Supporting Army Modernization

An interesting viewpoint related to modern PLM came from Dr. Raj Iyer, Chief Information Officer for IT Reform of the US Army. Raj walked us through some of the US Army's challenges and gave us some fantastic statements to think about. Although an army cannot be compared with a commercial business, its target remains to always be ahead of the competition and be aware of the competition.

Where we would say "data is the new oil", Raj Iyer said: "Data is the ammunition of the future fight – as fights will more and more take place in cyberspace."

The US Army is using a lot of modern technology – as the image below shows. The big difference here with regular businesses is that it is not about ROI but about winning fights.

Also for the US Army, the cloud becomes the platform of the future. Due to the wide range of assets the US Army has to manage, the importance of product data standards is evident. Raj mentioned their contribution and adherence to the ISO 10303 STEP standard as crucial for interoperability. It was an exciting insight into the US Army's current and future challenges. Their primary mission remains to stay ahead of the competition.

Joining up Engineering Data without losing the M in PLM

Nigel Shaw's (Eurostep) presentation was somewhat philosophical but precisely to the point regarding the current dilemma in the PLM domain. Through an analogy with the internet, explaining that we live in a world of HTTP(S) links, he showed how we create new ways of connecting information. The link becomes an essential artifact in our information model.

While it is apparent that links are crucial for managing engineering data, Nigel pointed out some of the significant challenges of this approach, as you can see from his (compiled) image below.
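As a toy illustration of the link as a managed artifact (this is my own sketch, not Eurostep's implementation), the snippet below stores typed links between data elements as first-class, queryable records, with a validity flag so superseded links remain traceable:

```python
# Toy sketch: links themselves are managed artifacts - typed, flagged for
# validity, and queryable - instead of living implicitly inside documents.
from collections import namedtuple

Link = namedtuple("Link", ["source", "relation", "target", "valid"])

links = [
    Link("Requirement-R12", "verified_by", "SimModel-S3", True),
    Link("SimModel-S3", "derived_from", "CADModel-C7", True),
    Link("Requirement-R12", "verified_by", "SimModel-S2", False),  # superseded
]

def traceability(node):
    """Follow the valid links downstream from a data element."""
    return [l for l in links if l.source == node and l.valid]

for l in traceability("Requirement-R12"):
    print(l.source, "->", l.relation, "->", l.target)
# prints: Requirement-R12 -> verified_by -> SimModel-S3
```

Even this toy example hints at the challenges Nigel raised: who owns a link that spans two systems, and what happens to traceability when one end changes?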

I will not discuss this topic further here as I am planning to come back to this topic when explaining the challenges of the future of PLM.

As Nigel said, they have a debate with one of their customers about whether to replace the existing PLM tools or enhance them. The challenge of moving from coordinated information towards connected data is a topic that we as a community should study.

Integration is about more than Model Format.

This was the presentation I had been waiting for. Mark Williams from Boeing had built the story together with Adrian Burton from Airbus. Nigel Shaw, in the previous session, had already pointed to the challenge of managing linked information. Mark elaborated further on the model-based approach for system definition.

All content was related to the understanding that we need a model-based information infrastructure for the future, because storing information in documents (the coordinated approach) is no longer viable for complex systems. Mark's slide below says it all.

Mark stressed the importance of managing model information in context, which has become a challenge.

Mark mentioned that 20 years ago, IDC (International Data Corporation) measured Boeing's performance and estimated that each employee spent 2½ hours per day searching for information. In 2018, IDC estimated that this number had grown to 30% of an employee's time and could go up to 50% when adding the effort of reusing and duplicating data.

The consequence would be that a full-service enterprise, having engineering, manufacturing and services connected, probably loses 70% of its information because it cannot be found – an impressive number, calling for "clever" ways to find the correct information in context.

It is not just about a fully indexed search of the data, as some technology geeks might think. It is also about describing and standardizing the metadata that describes the models. In that context, Mark walked through a list of existing standards, all with their pros and cons, ending with the recommendation to use the ISO 10303-243 (MoSSEC) standard.

MoSSEC stands for Modelling and Simulation information in a collaborative Systems Engineering Context; the standard manages and connects the relationships between models.

MoSSEC and its implications for future digital enterprises are interesting, considering the importance of a model-based future. I am curious how PLM vendors and tools will support and enable the standard for future interoperability and collaboration.

Additive Manufacturing
– not as simple as paper printing – yet

Andreas Graichen from Siemens Energy closed the day, coming back to the topic of new technologies: Additive Manufacturing or, in common language, 3D printing. Andreas shared their Additive Manufacturing experiences, matching the famous Gartner Hype Cycle. His image shows that real work needs to be done to understand the technology and its use cases once the first excitement of the hype is over.

Material knowledge was one of the important topics to study when applying additive manufacturing. Understanding material behaviors and properties in an Additive Manufacturing process is probably a new area for most companies.

The ultimate goal for Siemens Energy is to reach an “autonomous” workshop anywhere in the world where gas turbines could order their spare parts by themselves through digital warehouses. It is a grand vision, and Andreas confirmed that the scalability of Additive Manufacturing is still a challenge.

For rapid prototyping or small series of spare parts, Additive Manufacturing might be the right solution. The success of your Additive Manufacturing process depends a lot on whether your company's management has realistic expectations and the budget available to explore this direction.

Conclusion

Day 1 was enjoyable and educational, starting and ending with a focus on disruptive technologies. The middle part, related to the data management concepts needed for a digital enterprise, contained the most exciting topics to follow up on, in my opinion.

Next week, I will follow up with a review of day 2 and share my conclusions. The PLM Road Map & PDT Spring 2021 conference confirmed that there is work to do to understand the future (of PLM).

 

Last summer, I wrote a series of blog posts grouped by the theme “Learning from the past to understand the future”. These posts took you through the early days of drawings and numbering practices towards what we currently consider the best practice: PLM BOM-centric backbone for product lifecycle information.

You can find an overview and links to these posts on the page Learning from the past.

If you have read these posts, or if you have gone through this journey yourself, you will realize that all steps were made more or less evolutionarily. There were no disruptions. Affordable 3D CAD systems, new internet paradigms (the interactive internet), global connectivity and mobile devices all introduced new capabilities for the mainstream. As described in these posts, the new capabilities sometimes created friction with old practices. Probably the most popular topics are the Form-Fit-Function interpretation and the discussion related to meaningful part numbers.

What is changing?

In the last five to ten years, a lot of new technology has come into our lives. The majority of these technologies are related to dealing with data. Digital transformation in the PLM domain means moving from a file-based/document-centric approach to a data-driven approach.

A Bill of Material on the drawing has become an Excel-like table in a PLM system. However, an Excel file is still used to represent a Bill of Material in companies that have not implemented PLM.

Another example: the specification document has become a collection of individual requirements in a system. Each requirement is a data object with its own status and content. The specification becomes a report combining all valid requirement objects.
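A minimal sketch of this idea, with purely illustrative attribute names and requirement texts: each requirement is a data object carrying its own status, and the "specification document" is generated as a report over the approved objects:

```python
# Simplified sketch of a data-driven specification: requirements are
# individual data objects with their own status; the specification is
# just a report combining the valid (approved) ones.

requirements = [
    {"id": "REQ-001", "text": "Pump shall deliver 30 l/min", "status": "Approved"},
    {"id": "REQ-002", "text": "Noise below 60 dB",           "status": "Approved"},
    {"id": "REQ-003", "text": "Use brass fittings",          "status": "Obsolete"},
]

def specification_report(reqs):
    """Combine all approved requirement objects into a document-like report."""
    return "\n".join(
        f"{r['id']}: {r['text']}" for r in reqs if r["status"] == "Approved"
    )

print(specification_report(requirements))
# REQ-001: Pump shall deliver 30 l/min
# REQ-002: Noise below 60 dB
```

The difference with a document is that each requirement object can be versioned, linked and reported on individually, while the document view can always be regenerated.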

Related to CAD, the 2D drawing is no longer the deliverable as a document; the 3D CAD model with its annotated views becomes the information carrier for engineering and manufacturing.

And most important of all, traditional PLM methodologies have been based on a mechanical design and release process. Meanwhile, modern products are systems where the majority of capabilities are defined by software. Software has an entirely different configuration and lifecycle approach conflicting with a mechanical approach, which is too rigid for software.

The last two aspects, from 2D drawings to 3D Models and Mechanical products towards Systems (hardware and software), require new data management methods.  In this environment, we need to learn to manage simulation models, behavior models, physics models and 3D models as connected as possible.

I wrote about these changes three years ago:  Model-Based – an introduction, which led to a lot of misunderstanding (too advanced – too hypothetical).

I plan to revisit these topics in the upcoming months again to see what has changed over the past three years.

What will I discuss in the upcoming weeks?

My first focus is on participating in and contributing to the upcoming PLM Roadmap & PDT Spring 2021 conference. Here, speakers will discuss the need to reshape the PLM Value Equation due to new emerging technologies – a topic that contributes perfectly to the future of PLM series.

My contribution will focus on the fact that technology alone cannot disrupt the PLM domain. We also have to deal with legacy data and legacy ways of working.

Next, I will discuss with Jennifer Herron from Action Engineering the progress made in Model-Based Definition, which fits best practices for today – a better connection between engineering and manufacturing. We will also discuss why Model-Based Definition is a significant building block required for realizing the concepts of a digital enterprise, Industry 4.0 and digital twins.

Another post will focus on the difference between the digital thread and the digital thread. Yes, it looks like I am writing the same words twice. However, you will see that, based on its interpretation, one definition hangs on to the past while the other targets the future. Again, the differentiation is crucial when a maintainable Digital Twin is required.

Model-Based Systems Engineering (MBSE) in all its aspects needs to be discussed too. MBSE is crucial for defining complex products and is seen as a discipline to design products. Understanding data management related to MBSE will be the foundation for understanding data management in a Model-Based Enterprise. For example, how to deal with configuration management in the future?

 

Writing Learning from the past was an easy job as explaining with hindsight is so much easier if you have lived it through. I am curious and excited about the outcome of “The Future of PLM”. Writing about the future means you have digested the information coming to you, knowing that nobody has a clear blueprint for the future of PLM.

There are people and organizations working on this topic more academically; for example, read this post from Lionel Grealou related to the Place of PLM in the Digital Future. The challenge is that an academic future might be disrupted by unpredictable events, like COVID, or by disruptive technologies combined with an opportunity to succeed. Therefore, I believe it will be a learning journey for all of us, where we need to learn to give technology a business purpose. Business first – then technology.

 

No Conclusion

Normally, I close my post with a conclusion. At this moment, there is no conclusion, as the journey has just started. I look forward to debating and learning with practitioners in the field, working together on methodology and concepts that work in a digital enterprise. Join me on this journey. I will start sharing my thoughts in the upcoming months.

 

 

 

Regularly, (young) individuals approach me looking for advice to start or boost their PLM career – one of the questions "The PLM Doctor is IN" could quickly answer.

Before going further on this topic, there is also the observation that many outspoken PLM experts are "old". Meanwhile, all kinds of new disruptive technologies are coming up.

Can these old guys still follow and advise on all trends/hypes?

My consultant’s answer is: “Yes and No” or “It depends”.

The answer illustrates the typical nature of a consultant. It is almost impossible to give a binary answer; still, many of my clients are looking for binary answers. Generalizing further, you could claim: “Human beings like binary answers”, and then you understand what is happening now in the world.

The challenge for everyone in the PLM domain is to keep an open mindset and avoid becoming binary. Staying non-binary means spending time digesting what you see, read or hear. Always ask yourself the question: Is it really that simple? Try to imagine how the content you read fits in the famous paradigm: People, Processes and Tools. You should consider all these aspects.

Learning by reading

I was positively surprised by Helena Gutierrez's post on LinkedIn: The 8 Best PLM Blogs to Follow. First of all, by Helena's endorsement, explaining the value of having non-academic PLM information available as a foundation for her learning in PLM.

And indeed, perhaps I should have written a book about PLM. However, it would be a book about the past. Currently, PLM is not stable; we are learning every day to use new technologies and new ways of working – for example, the impact and meaning of the model-based enterprise.

However, the big positive surprise came from the number of likes within a few days, showing how valuable this information is for many others on their PLM journey. I am aware there are more great blogs out in the field, sometimes with the disadvantage that they are not in English and therefore have a limited audience.

Readers of this post, look at the list of 8 PLM blogs and add your recommended blog(s) in the comments.

Learning by reading (non-binary) is a first step in becoming or staying up to date.

Learning by listening

General PLM conferences have been an excellent way to listen to other people’s experiences in the past. Depending on the type of conference, you would be able to narrow your learning scope.

This week, I started my preparation for the upcoming PLM Roadmap and PDT conference. Here, various speakers will provide their insights related to "disruption", all in the context of disruptive technologies for PLM.

Good news: people and business aspects will also be part of the conference.

Click on the image for the agenda and registration

In my presentation, titled DISRUPTION – EXTINCTION or still EVOLUTION?, I will address all these aspects. We have entered a decisive decade in which to prove we can disrupt our old habits to save the planet for future generations.

It is challenging for a virtual conference to be as interactive as a physical one; it is mainly a conference to get inspired or guided in your thinking about new PLM technologies and potential disruption.

Learning by listening and storing the content in your brain is the second step in becoming or staying up to date.

Learning by discussing

One of the best learnings comes from having honest discussions with other people who all have different backgrounds. To be part of such a discussion, you need to have at least some basic knowledge about the topic. This avoids social media-like discussions where millions of “experts” have an opinion behind the keyboard. (The Dunning-Kruger effect)

There are two upcoming discussions I want to highlight here.

1. Book review: How to Avoid a Climate Disaster.

On Thursday, May 13th, I will moderate a PLM Global Green Alliance panel discussion on Zoom to discuss Bill Gates’ book “How to Avoid a Climate Disaster”. As you can imagine, Bill Gates is not known as a climate expert, but rather as a philanthropist and technology geek. However, the reviews are good.

What can we learn from the book as relevant for our PLM Global Green Alliance?

If you want to participate, read all the details on our PGGA website.

The PGGA core team members, Klaus Brettschneider, Lionel Grealou, Richard McFall, Ilan Madjar and Hannes Lindfred, have read the book.

 

2. The Modular Way Questions & Answers

In my post PLM and Modularity, I announced the option for readers of “The Modular Way” to ask the authors (Björn Eriksson & Daniel Strandhammar) questions or provide feedback on the book together with a small audience. This session is also planned to take place in May and will be scheduled based on the participants’ availability. At this moment, there are still a few open places. Therefore, if you have read the book and want to participate, send an email to tacit@planet.nl or info@brickstrategy.com.

Learning by discussing is the best way to enrich your skills, particularly if you have active listening skills, which are crucial for a good discussion.

 

Conclusion

No matter where you are in your career, in the world of PLM, learning never stops. Twenty years of experience have no value if you haven’t seen the impact of digitalization coming. Make sure you learn by reading, by listening and by discussing.

Another episode of “The PLM Doctor is IN“. This time a question from Rob Ferrone. Rob is one of the founders of QuickRelease, a passionate, no-nonsense PDM/PLM consultancy company focusing on process improvement.

Now sit back and enjoy.

PLM and Digital Plumbing
What’s inside the digital plumber’s toolbox?

Relevant topic discussed in this video

In this video, you see a slide from Marc Halpern (Gartner) depicting the digital thread, shown during the last PLM Roadmap – PDT conference in fall 2020. This conference is THE place for more serious content, and I am happy to announce my participation in, and anticipation of, the next upcoming PLM Roadmap – PDT conference on May 19-20.

The theme: DISRUPTION—the PLM Professionals’ Exploration of Emerging Technologies that Will Reshape the PLM Value Equation.

Looking forward to seeing you there.

Conclusion

I hope you enjoyed the answer and look forward to your questions and comments. Let me know if you want to be an actor in one of the episodes.
The main rule: A single open question that is puzzling you related to PLM.
