
In my previous post, “My PLM Bookshelf,” on LinkedIn, I shared some of the books that influenced my thinking related to PLM. As you can see in the LinkedIn comments, other people added their recommendations for PLM-related books that can inspire you or make you more knowledgeable.

 

Reading a book is a personal activity; now I want to share with you how to get educated on PLM in a more interactive manner. In this post, I talk with Peter Bilello, President & CEO of CIMdata. If you haven’t heard about CIMdata and you are active in PLM, you can learn more on their website HERE. Now let us focus on Education.

CIMdata

Peter, knowing CIMdata from its research relevant to the whole PLM community, I am curious to learn what kind of training CIMdata typically provides to its customers.

Jos, throughout much of CIMdata’s existence, we have delivered educational content to the global PLM industry. With a core business tenet of knowledge transfer, we began offering a rich set of PLM-related tutorials at our North American and pan-European conferences starting in the early 1990s.

Since then, we have expanded our offering to include a comprehensive set of assessment-based certificate programs covering PLM in a broader sense, for example, systems engineering and digital transformation-related topics. In total, we offer more than 30 half-day classes, all of which can be delivered in person as a custom configuration for a specific client or through public virtual-live or in-person classes. We have certified more than 1,000 PLM professionals since the introduction of this PLM Leadership offering in 2009.

Based on our experience, we recommend that an organization’s professional education strategy and plans address the organization’s specific processes and enabling technologies. This helps ensure that education drives the appropriate and consistent operation of those processes and technologies.

For that purpose, we expanded our consulting offering to include a comprehensive and strategic digital skills transformation framework. This framework provides an organization with a roadmap that can define the skills an organization’s employees need to possess to ensure a successful digital transformation.

In turn, this framework can be used as an efficient tool for the organization’s HR department to define its training and job progression programs that align with its overall transformation.

 

The success of training

We are both promoting the importance of education to our customers. Can you share with us an example where Education really made a difference? Can we talk about ROI in the context of training?

Jos, I fully agree. Over the years, we have learned that education and training are often minimized (i.e., sub-optimized). This is unfortunate and has usually led to failed or partially successful implementations.

In our view, both education and training are needed, along with strong organizational change management (OCM) and a quality assurance program during and after the implementation.

In our terms, education deals with the “WHY” and training with the “HOW”. Why do we need to change? Why do we need to do things differently? And then “HOW” to use new tools within the new processes.

We have seen far too many failed implementations where sub-optimized decisions were made due to a lack of understanding (i.e., a clear lack of education). We have also witnessed training and education being done too early or too late.

This leads to a reduced Return on Investment (ROI).

Therefore, a well-defined skills transformation framework is critical for any company that wants to grow and thrive in the digital world. Finally, a skills transformation framework needs to be tied directly to an organization’s digital implementation roadmap and structure, its process state, and its technology maturity to maximize success.

 

Training for every size of company?

When CIMdata conducts PLM training, is there a difference, for example, when working with a big global enterprise or a small and medium enterprise?

You might think the complexity is similar; however, the amount of internal knowledge might differ. So how are you dealing with that?

We basically find that the amount of training/education required mostly depends on the implementation scope, meaning the scope of the proposed digital transformation and the current maturity level of the impacted user community.

It is important to measure the current maturity and establish appropriate metrics to measure the success of the training (e.g., are people, once trained, using the tools correctly).

CIMdata has created a three-part PLM maturity model that allows an organization to understand its current PLM-related organizational, process, and technology maturity.

The three-part PLM maturity model

The PLM maturity model provides an important baseline for identifying and/or developing the appropriate courses for execution.

This also allows us, when we are supporting the definition of a digital skills transformation framework, to understand how the level of internal knowledge might differ within and between departments, sites, and disciplines. All of which help define an organization-specific action plan, no matter its size.

 

How is CIMdata’s training different?

Most of the time, PLM implementers also offer training to their prospects or customers. So, how is CIMdata’s training different?

 

For this, it is important to differentiate between education and training. CIMdata provides education (the why), as well as training and education strategy development and planning.

We don’t provide training on how to use a specific software tool. We believe that is best left to the systems integrator or software provider.

While some implementation partners can develop training plans and educational strategies, they often fall short in helping an organization to effectively transform its user community. Here we believe training specialists are better suited.

 

Digital Transformation and PLM

One of my favorite topics is the impact of digitization in the area of product development. CIMdata introduced the Product Innovation Platform concept to differentiate from traditional PDM/PLM. Who needs to get educated to understand such a transformation, and what does CIMdata contribute to this understanding?

We often start by describing the difference between digitalization and digitization. It is crucial that an organization’s management team understands this difference. In addition, management must understand that digitalization is an enterprise initiative.

It isn’t just about product development, sales, or enabling a new service experience. It is about maximizing a company’s ROI in applying and leveraging digital as needed throughout the organization. The only way an organization can do this successfully is by taking an end-to-end approach.

The Product Innovation Platform is focused on end-to-end product lifecycle management. Therefore, it must work within the context of other enterprise processes that are focused on the business’s resources (i.e., people, facilities, and finances) and on its transactions (e.g., purchasing, paying, and hiring).

As a result, an organization must understand the interdependencies among these domains. If they don’t, they will ultimately sub-optimize their investment. It is these and other important topics that CIMdata describes and communicates in its education offering.

The Product Innovation Platform in a digital enterprise

More than Education?

As a former teacher, I know that a one-time course, a good book or a slide deck is not enough to get educated. How does CIMdata provide a learning path or coaching path to its customers?

Jos, I fully agree. The long-term sustainability of a change and/or an improved way of working is key to true and maximized ROI. Here I am referring to the sustainability of the transformation, which can take years.

With this, organizational change management (OCM) is required. OCM must be an integral part of a digital transformation program and be embedded into a program’s strategy, execution, and long-term usage. That means training, education, communication, and reward systems all have to be managed and executed on an ongoing basis.

For example, OCM must be executed alongside an organization’s digital skills transformation program. Our OCM services focus on strategic planning and execution support. We have found that most companies understand the importance of OCM but often don’t fully follow through on it.

 

A model-based future?

During the CIMdata Roadmap & PDT conferences, we have often discussed the importance of the Model-Based Systems Engineering methodology as a foundation of a model-based enterprise. What do you see? Is it only the big Aerospace and Defense companies that can afford this learning journey, or should other industries also invest? And if yes, how should they start?

Jos, here I need to step back for a minute. All companies have to deal with increasing complexity for their organization, supply chain, products, and more.

So, to optimize its business, an organization must understand and employ systems thinking and system optimization concepts. Unfortunately, most people think of MBSE as an engineering discipline. This is unfortunate because engineering is only one of the systems of systems that an organization needs to optimize across its end-to-end value streams.

The reality is that all companies can benefit from MBSE, as long as they consider optimization across their specific disciplines, in the context of their products and services and their position within the value chain.

MBSE is not just for Aerospace and Defense companies, although a lot can be learned from what has already been done there. Leading automotive companies are also implementing and using MBSE to design and optimize semi- and highly-automated vehicles (i.e., systems of systems).

The starting point is understanding your systems of systems environment and where bottlenecks exist.

There should be no doubt: education is needed on MBSE and on how MBSE supports the organization’s Model-Based Enterprise requirements.

Published work from the CIMdata-administered A&D PLM Action Group can be helpful, as can various MBE and systems engineering maturity models, such as the one CIMdata utilizes in its consulting work.

Want to learn more?

Thanks, Peter, for sharing your insights. Are there any specific links you want to provide to get educated on the topics discussed? Perhaps some books to read or conferences to visit?

Jos, as you already mentioned:

  • The CIMdata Roadmap & PDT conferences have provided a wealth of insight into this market for more than 25 years.
    [Jos: Search for my blog posts starting with the text: “The weekend after ….”]
  • In addition, there are several blogs, like yours, that are worth following, and websites, like CIMdata’s education pages and other resources, which are filled with downloadable reading material.
  • Additionally, there are many user conferences from PLM solution providers and third-party conferences, such as those hosted by the MarketKey organization in the UK.

These conferences have taken place in Europe and North America for several years. Many of these events offer information exchange as well as formal training and education. Additionally, they provide an excellent opportunity for networking and professional collaboration.

What I learned

Talking with Peter made me again aware of a few things. First, it is important to differentiate between education and training. Where education is a continuous process, training is an activity that must take place at the right time. Unfortunately, we often mix those two terms and believe that people are educated after having followed a training course.

Secondly, investing in education is as crucial as investing in hard- or software. As Peter mentioned:

We often start by describing the difference between digitalization and digitization. It is crucial that an organization’s management team understands this difference. In addition, management must understand that digitalization is an enterprise initiative.

System Thinking is not just an engineering term; it will be a mandate for managing a company, a product and even a planet into the future.

Conclusion

This time a quote from Albert Einstein, supporting my PLM coaching intentions:

“Education is not the learning of facts
but the training of the mind to think.”

 

After two quiet weeks of spending time with my family in slow motion, it is time to start the year.

First of all, I wish you all a happy, healthy, and positive 2022, as we need energy and positivism together. Then, of course, a good start is always cleaning up your desk and leaving only the things relevant for work.

Still, I have some books at arm’s length, either physical or on my e-reader, that I want to share with you – first, the non-obvious ones:

The Innovator’s Dilemma

A must-read book by Clayton Christensen explaining how new technologies can overthrow big, established companies within a very short period. The term Disruptive Innovation comes up here. Companies need to remain aware of what is happening outside and be ready to adapt their business. There are many examples, even recently, of big established brands that disappeared or diminished within a short period.

In his book, he wrote about DEC (Digital Equipment Corporation), the market leader in minicomputers, not having seen the threat of the PC. Later came Blockbuster (from video rental to streaming), Kodak (from analog photography to digital imaging) or, as a double example, NOKIA (from paper to market leader in mobile phones, in turn killed by the smartphone).

The book always inspired me to stay alert for new technologies, however simple they might look, as simplicity is the answer in the end. I wrote about it in 2012: The Innovator’s Dilemma and PLM, where I believed cloud, search-based applications and Facebook-like environments could disrupt the PLM world. None of this happened as a disruption; these technologies are now, most of the time, integrated by the major vendors, whose businesses have not really been disrupted. Newcomers still have a hard time conquering market space.

In 2015, I wrote again about this book in The Innovator’s dilemma and Generation change – image above. At that time, I understood that disruption would not happen in the PLM domain. Instead, I predicted a more evolutionary process, which I would later call: From Coordinated to Connected.

Future ways of working address the new skills needed for the future. You need to become a digital native, as COVID-19 pushed many organizations to become. But being a digital native alone does not bring success. We need new ways of working, which are more difficult to implement.

Sapiens

The book Sapiens by Yuval Harari made me realize the importance of storytelling in the domain of PLM and business transformation. In short, Yuval Harari explains why the human race became so dominant: we were able to align large groups around an abstract theme. The abstract theme can be related to religion, the power of a race or nation, the value of money, or even a brand’s image.

The myth (read: a simplified and abstract story) hides complexity and inconsistencies. It allows everyone to get motivated to work towards one common goal. As Yuval says: “Fiction is far more powerful because reality is too complex”.

Too often, I have seen well-analyzed PLM projects that were “killed” by management because they were considered too complex. I wrote about this in 2019 in PLM – measurable or a myth?, claiming that the real benefits of PLM are hard to predict and that we should not look at PLM in isolation.

My 2020 follow-up post, The PLM ROI Myth, alludes to that topic. However, even if you have a sound business case at the management level, the myth might still be decisive in justifying the investment.

That’s why PLM vendors are always working on their myths: the most cost-effective solution, the most visionary solution, the solution most used by your peers and many other messages to influence your emotions, not your factual thinking. So just read the myths on their websites.

If you have no time to read the book, look at the 2015 TED talk above to grasp the concept and use it with a PLM-twisted mind.

Re-use your CAD

In 2015, I read this book during a summer holiday (meanwhile, there is a second edition). Although it was not a PLM book, it helped me understand the transition effort from a classical document-driven enterprise towards a model-based enterprise.

Jennifer Herron‘s book helps companies to understand how to break down the (information) wall between engineering and manufacturing.

At that time, I contacted Jennifer to see if others like her and Action Engineering could explain Model-Based Definition comprehensively, for example, in Europe – with no success.

The Model-Based Enterprise is becoming more and more the apparent future for companies that want to be competitive or benefit from the various Digital Twin concepts. For that reason, I contacted Jennifer again last year in my post: PLM and Model-Based Definition.

As you can read, the world has improved, there is a new version of the book, and there is more and more information to share about the benefits of a model-based approach.

I am still referencing Action Engineering and their OSCAR learning environment for my customers. Unfortunately, many small and medium enterprises do not have the resources and skills to implement a model-based environment.

Instead, these companies stay on their customers’ lowest common denominator: the 2D drawing. For me, Model-Based Definition is one of the first steps to master if your company wants to provide digital continuity of design and engineering information towards manufacturing and operations. Digital twins do not run on documents; they require model-based environments.

The book is still on my desk, and all the time, I am working on finding the best PLM practices related to a Model-Based enterprise.

It is a learning journey to deal with a data-driven, model-based environment, not only for PLM but also for CM experts, as you might have seen from my recent dialogue with CM experts: The future of Configuration Management.

Products2019

This book was an interesting novelty published by John Stark in 2020. John is known for his academic and educational books related to PLM. However, during the early days of the COVID-pandemic, John decided to write a novel. The novel describes the learning journey of Jane from Somerset, who, as part of her MBA studies, is performing a research project for the Josef Mayer Maschinenfabrik. Her mission is to report to the newly appointed CEO what happens with the company’s products all along the lifecycle.

Although it is not directly a PLM book, it illustrates the complexity of PLM. It is about people and culture: many different processes, often disconnected, and everyone focused on their own discipline as the center of importance. If you believe PLM is all about the best technology only, read this book and learn how many other aspects are also relevant.

I wrote about the book in 2020 in Products2019 – a must-read if you are new to PLM, where you can find more details. An important point to pick up from this book is that it is not about PLM but about doing business.

PLM is not a magical product. Instead, it is a strategy to support and improve your business.

System Lifecycle Management

Another book, published a little later and motivated by the extra time we all got during the COVID-19 pandemic, was Martin Eigner‘s book System Lifecycle Management.

A 281-page journey from the early days of data management towards what Martin calls System Lifecycle Management (SysLM). He was one of the first to talk about System Lifecycle Management instead of PLM.

I always enjoyed Martin’s presentations at various PLM conferences where we met. In many ways, we share similar ideas. However, during his time as a professor at the University of Kaiserslautern (2003-2017), he explored new concepts with his students.

I briefly mentioned the book in my series The road to model-based and connected PLM (Part 5) when discussing SLM or SysLM. His academic research and analysis make this book very valuable. It takes you in a very structured way through the era in which mechatronics became important, and then the era in which systems (hardware and software) became important.

We discussed in 2015 the applicability of the bimodal approach for PLM. However, as many enterprises are locked in their highly customized PDM/PLM environments, their legacy blocks the introduction of modern model-based and connected approaches.

Where John Stark’s book might miss the PLM details, Martin’s book brings you everything in detail and with all its references.

It is an interesting book if you want to catch up with what has happened in the past 20 years.

More Books …..

More books on my desk have helped me understand the past or have helped me shape the future. As this is a blog post and I am reaching my 1500 words, I will not discuss more books this time.

Still, these books are worthwhile to read – click on their images to learn more:

I discussed this book twice last year: an introduction in PLM and Modularity and a discussion with the authors and some readers of the book in The Modular Way – a follow-up discussion.


A book I read this summer contributed to a better understanding of sustainability. I mentioned this book in my presentation for the Swedish CATIA Forum in October last year – slide 29 of The Challenges of model-based and traditional plm. So you could see it as an introduction to System Thinking from an economic point of view.

System Thinking becomes crucial for a sustainable future, as I addressed in my post PLM and Sustainability.

Sustainability is my area of interest at the PLM Green Global Alliance, an international community of professionals working with Product Lifecycle Management (PLM) enabling technologies and collaborating for a more sustainable decarbonized circular economy.

Conclusion

There is a lot to learn. Tell us something about your PLM bookshelf – which books would you recommend? In the upcoming posts, I will further focus on PLM education. So stay tuned and keep on learning.

As promised in my early November post – The road to model-based and connected PLM (part 9 – CM), I come back with more thoughts and ideas related to the future of configuration management. Moving from document-driven ways of working to a data-driven and model-based approach fundamentally changes how you can communicate and work efficiently.

Let’s be clear: configuration management’s target is first of all risk management – ensuring your company’s business remains sustainable, efficient, and profitable.

By providing the appropriate change processes and guidance, configuration management avoids costly mistakes and iterations during all phases of the product lifecycle and guarantees the quality of the product and its information to ensure safety.

Companies that have not implemented CM practices probably have not observed these issues. Or they have not realized that the root cause of these issues is a lack of CM.

Similar to what is often said about PLM in smaller companies, CM is seen as overhead, as employees believe they thoroughly understand their products. In addition, CM is seen as a hurdle to innovation because of the standardization of practices. So yes, they think it is normal that there are sometimes problems. That’s life.

I already wrote about this topic in 2010 PLM, CM and ALM – not sexy 😦 – where ALM means Asset Lifecycle Management – my focus at that time.

Hear it from the experts

To shape the discussion related to the future of Configuration Management, I had a vivid discussion with three thought leaders in this field: Lisa Fenwick, Martijn Dullaart and Maxime Gravel. A short introduction of the three of them:

Lisa Fenwick, VP Product Development at CMstat, a leading company in Configuration Management and Data Management software solutions and consulting services for aviation, aerospace & defense, marine, and other high-tech industries. She has over 25 years of experience with CM and Deliverables Management, including both government and commercial environments.

Ms. Fenwick has achieved CMPIC SME, CMPIC CM Assessor, and CMII-C certifications. Her experience includes implementing CM software products, CM-related consulting and training, and participation in the SAE and IEEE standards development groups.

Martijn Dullaart is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Institute for Process Excellence (IPX) Congress. Martijn has his own blog, mdux.net, and you might have seen him recently during the PLM Roadmap & PDT Fall conference in November – his thoughts about the CM future can be found on his blog here.

Maxime Gravel, Manager Model-Based Engineering at Moog Inc., a worldwide designer, manufacturer, and integrator of advanced motion control products. Max has been the director of the model-based enterprise at the Institute for Process Excellence (IPX) and Head of Configuration and Change Management at Gulfstream Aerospace which certified the first aircraft in a 3D Model-Based Environment.

What we discussed:

We had an almost one-hour discussion related to the following points:

  • The need for Enterprise Configuration Management – why and how
  • The needed change from document-driven to model-based – the impact on methodology and tools
  • The “neural network” of data – connecting CM to all other business domains, a similar view as from the PLM domain.

What I kept from our discussion is the importance of planning – as seen in the CMstat image on the left.

Plan which data you need to manage and how you will manage that data. How often do you do this in your company’s projects?

Next, all participants stressed the importance of education and training on this topic – get educated. Configuration Management is not a topic that is taught at schools. Early next year, I will come back to education, as its benefits are often underestimated. Not everything can be learned by “googling.”

Conclusion

The journey towards a model-based and data-driven future is not a quick one to be realized by new technologies alone. However, it is interesting to learn that the future of connected data (the “neural network”) allows organizations to implement both CM and PLM in a similar manner, using graph databases and automation. When executed at the enterprise level, the result will be that CM and PLM become natural practices instead of siloed, system-related disciplines.

Most of the methodology is there; making the implementation smooth and embedded in organizations will be the topic to learn. Join us in discussing and learning!

 

This week I attended the PLM Roadmap & PDT Fall 2021 conference with great expectations, based on my enthusiasm last year. Unfortunately, the excitement was less this time, and I will explain why in my conclusions. It was again a virtual event, which makes it hard to be interactive – something I realize I miss a lot.

Over two hundred attendees connected for the two days, and you can find the agenda here. Typically, I would discuss the relevant sessions one by one; this time, I want to group some of them by theme, as these sessions contained complementary information.

Disruption

Again, like in the spring, the theme was DISRUPTION. The word disruption can give you an uncomfortable feeling when you are not in power. It is more fun to disrupt than to be disrupted, as I mentioned in my spring presentation. Read The week after PLM Roadmap & PDT Spring 2021.

In his keynote speech, Peter Bilello (CIMdata) kicked off with The Critical Dozen: 12 familiar, evolving trends and enablers of digital transformation that you cannot or should not live without.

You can see them on the slide below:

I believe many of them should be familiar to you, as these themes have been “in the air” for quite some time already. Vendors investigate them first, and slowly companies follow when relevant. You will find many of them back in my recent series The road to model-based and connected PLM, where I explored the topics that would cross your path on that journey.

Like Peter said: “For most of the topics you cannot pick and choose as they are all connected.”

Another interesting observation was that we are moving away from the concept of related structures (the digital thread) towards connected datasets (the digital web). Marc Halpern first introduced this topic at the 2020 conference, and it has become an excellent image to frame what we should imagine in a connected world.

The digital web also has to do with the rise of the graph database, mentioned by Peter Bilello as a potentially disruptive technology during the fireside chat. Relational databases can be seen as rigid, associated with PLM structures. On the other hand, graph databases can be associated with flexible relations between different types of data – the image of the digital web.
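
To make that contrast tangible, here is a minimal sketch in Python (using the networkx library; all dataset identifiers and relation names are hypothetical) of a digital web where datasets of different types are connected through flexible, typed relations:

```python
import networkx as nx

# A "digital web": datasets of different types, connected by typed relations.
web = nx.DiGraph()
web.add_node("REQ-001", kind="requirement")
web.add_node("PART-100", kind="part")
web.add_node("TEST-042", kind="test case")
web.add_node("DOC-777", kind="certificate")

web.add_edge("PART-100", "REQ-001", relation="satisfies")
web.add_edge("TEST-042", "REQ-001", relation="verifies")
web.add_edge("DOC-777", "PART-100", relation="certifies")

# Unlike a rigid structure, any dataset can be the starting point of a traversal:
for source, _, data in web.in_edges("REQ-001", data=True):
    print(f"{source} {data['relation']} REQ-001")
```

The point is not the library but the shape of the data: relations are first-class and typed, so new kinds of connections can be added without redesigning a rigid schema.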

Where Peter was mainly telling WHAT was happening, two presentations caught my attention because of the HOW.

First of all, Dr. Rodney Ewing’s (Cummins) session A Balanced Strategy to Reap Continuous Business Value from Digital PLM was a great story of a transformational project. It combined keeping the continuous delivery of business value in mind while moving to the connected enterprise.

As Rodney mentioned, the contribution of TCS was crucial here, which I can imagine. It is hard for a company to understand what is happening in the outside (PLM) world and to apply it to your own company. Their transformation roadmap is an excellent example of keeping the long-term vision in mind while delivering value during the transformation.

Talking about the right partner and synergy, the second presentation I liked in this context of disruption was Ian Quest’s presentation (Quick Release): Open-source Disruption in Support of Audacious Goals. As a sponsor of the conference, they had ten minutes to pitch their area of expertise.

After Ian’s presentation, focused on audacious goals (for non-English natives: “brave” goals), there was only one word that stuck in my mind: pragmatic.

Instead of discussions about the complexity, Ian gave examples of where a pragmatic data-centric approach could lead to great benefits, as you can see from one of the illustrated benefits below:

Standards

A characteristic of this conference is that we always talk about standards. Torbjörn Holm (Eurostep) gave an excellent overview of where standards have led to significant benefits. For example, containerization has dramatically improved the transportation of goods (we all benefit) while killing proprietary means of transport (types of trains, ships and unloading). See the image below:

Torbjörn rightfully expanded this story to the current situation in the construction industry and the challenges for asset operators. Unfortunately, in these practices, many content suppliers remain focused on their unique capabilities, neglecting the demand for interoperability across the whole value chain.

It is a topic Marc Halpern also mentioned last year as an outcome of their Gartner PLM benefits survey. Gartner’s findings:

Time to Market is not so much improved by using PLM as the inefficient interaction with suppliers is the impediment.

Like transport before containerization, the exchange of information is not standardized and designed for digital exchange. Torbjörn believes that more and more companies will insist on exchange standards – like CFIHOS, an ISO 15926-derived exchange standard in the process industry. It is a user-driven standard – the best kind of standard.

In this context, the presentation from Kenny Swope (Boeing) and Jean Yves Delaunay (Airbus) The Business Value of Standards-based Information Interoperability for Aerospace & Defense illustrated this fact.

Although they work for competitors, the Aerospace industry understands the criticality of standards for becoming more efficient and less vendor-dependent. They discuss these themes in the Aerospace & Defense PLM Action Group, and last year’s 2020 Fall sessions showed the results. You can read their publications here.

The A&D PLM action group uses the following framework when evaluating standards – as you can see on the image below:

The result – a combined exercise of many participating experts from the field – is the recommendation below:

To conclude:
People often complain – a framing promoted by vendors of proprietary data formats – that standards lead to a rigid environment, blocking agility.

In reality, standards allow companies to be more agile, as the (proprietary) data flow is less of an issue. Remember the containerization example.

Sustainability and System Thinking

This conference has always been known for its attention to the circular economy and green thinking. In the past, these topics might have been considered disconnected from our PLM practices; now, they have become a part of everyone’s mission.

Two presentations stood out on this topic for me. First, Ken Webster’s keynote speech In the future, you will own nothing and you will be happy was a significant overview of how we as consumers are currently disconnected from the circular economy. His plea, as shown below, for making manufacturers responsible for the legal ownership of the materials in the products they deliver would impact consumer behavior.

Product as a Service (PaaS) and new ways to provide a service are becoming essential. For example, buildings as power stations, as they are a place to collect solar or wind energy.

His thoughts are aligned with what is happening in Europe related to the European Green Deal (not in his presentation). There is a push for a PaaS model for all products as this would be an excellent stimulant for the circular economy.  PaaS combined with a Digital Product Passport – more on that next year.

Making upgrades to your products has less impact on the environment than creating new products to sell (and creating waste from the old product). Ken Webster made an interesting statement about changing the economy: do we want to own products, or do we want to benefit from a product and leave the legal ownership to the manufacturer?

A topic I discussed at the PLM Roadmap & PDT Conference Spring 2021 – look here at slide 11.

Patrick Hillberg’s presentation Rising to the challenge of engineering and optimizing . . . what? was the one closest to my heart. We discussed Sustainability and Systems Thinking with Patrick in our PLM Global Green Alliance, being pretty aligned on this topic. Patrick started by explaining the difference between Systems Engineering and Systems Thinking. Looking at an organization’s product go-to-market involves more than the traditional V-model. Economic pressure and culture will push people to deviate from the ideal technological plan due to other priorities.

Expanding on this observation, Patrick stated that there are limits to growth, a topic discussed by many people involved in the sustainable economy. Unlimited economic growth is impossible on a limited planet, and we have to take more dimensions into account. Patrick gave some examples of that, including issues related to the infamous Boeing 737 MAX.

For Patrick, the COVID pandemic marks the end of the old Second Industrial Revolution and a push for a new Fourth Industrial Revolution, which is not only technical, as the slide below indicates.

Like Patrick, I believe we are at a decisive moment to disrupt ourselves and reconsider many things we do and are used to doing. Even for PLM practitioners, this is a new path to take.

Data

There were two presentations related to digitization and the shift from document-based to a data-driven approach.

First, there was Greg Weaver (Gulfstream) with his presentation Indexing Content – Finding Your Needle in the Haystack. Greg explained that by indexing existing document-based information and combining it with a specific dashboard, they could provide fast access to information that would otherwise have remained hidden in numerous document or even paper archives.

It was a pragmatic solution, making me feel nostalgic seeing the SmarTeam profile cards. It was an excellent example of moving to a digital enterprise, and Gulfstream has always been a front runner on this topic.

Warning: Don’t use this by default at home (your company). The data in a regulated industry like Aerospace is expected to be of high quality due to the configuration management processes in place. If your company does not have a strong CM practice, the retrieved data might be inaccurate.

Martijn Dullaart’s (ASML) presentation The Next disruption, please….. was the next step into the future. With his statement “No CM = No Trust,” he made an essential point for data-driven environments.

There is a need for Configuration Management, and I touched on this topic in my last post: The road to model-based and connected PLM (part 9 – CM).

Martijn’s presentation can also be found on his blog here, and I encourage you to read it (saving me copy & paste text). It was interesting to see that Martijn improved his CM pyramid – as you can see, it is now more discipline- and activity-oriented instead of a system view. With Martijn and others, I will elaborate on this topic soon.

Conclusion

This has been an extremely long post, and thanks for reading until the end. Many interesting topics were presented at the conference. I was less excited this time because many of these topics are triggers for a discussion. Innovation comes from meeting people with different backgrounds. At a live conference, you would meet during the break or during the famous conference dinner. How can we ensure we follow up on all this interesting information?

Your thoughts? Contact me for a Corona Friday discussion.

When I started this series in July, I expected to talk mostly about new ways of working, enabled through a data-driven and model-based approach. However, when analyzing what is needed for such a future (part 3), it became apparent that many of these new ways of working are dependent on technology.

From coordinated to connected sounds like a business change; however, it all depends on technology. And here I have to thank Marc Halpern (Gartner’s Research VP, Engineering and Design Technologies) again, who came up with the brilliant scheme below:

So now it is time to address the last point from my starting post:

Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them into a single framework. Unfortunately, this cannot be the BOM framework due to the dynamics in software changes.

Configuration management at this moment

PLM and CM are often considered overlapping. My March 2019 post PLM and Configuration Management – a happy marriage? shares some thoughts related to this point.

Does having PLM or PDM installed mean you have implemented CM? There is confusion because revision management is considered the same as configuration management. Read my March 2020 post What the FFF is happening?, based on a vivid discussion launched by Yoann Maingon, CEO and founder of Ganister, an example of a modern, graph-database-based, flexible PLM solution.

To hear it from the CM side, I discussed it with Martijn Dullaart in my February 2021 post PLM and Configuration Management. In that post, we also zoomed in on CM2 as a methodology.

Martijn is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Institute for Process Excellence (IPX) Congress.

As mentioned before in a previous post (part 6), he will be speaking at the PLM Roadmap & PDT Fall conference starting this upcoming week.

In this post, I want to talk about the CM future. For understanding the current situation, you can find a broad explanation here on Wikipedia. Have a look at CM in the context of the product lifecycle, ensuring that the product As-Specified and As-Designed information matches the As-Built and As-Operated product information.

A mismatch or inconsistency between these artifacts can lead to costly errors, particularly in later lifecycle stages. CM originated in the Aerospace and Defense industry for that reason. However, companies in other industries might have implemented CM practices too, either due to regulations or thanks to the understanding that configuration mistakes can cause significant damage to the company.

Historically, configuration management addressed the needs of “slow-moving” products. For example, the design of an airplane could take years before manufacturing started. Tracking changes and ensuring consistency of all referenced datasets was often a manual process.

On purpose, I wrote “referenced datasets,” as the information was, most of the time, not connected in a single environment. The identifier of a dataset (an item or a document) was the primary information carrier, used for mentally connecting other artifacts to keep consistency.

The Institute for Process Excellence (IPX) has been one of the significant contributors to configuration management methodology. They have been providing (and still offer) CM2 training and certification.

As mentioned before, PLM vendors or implementers suggest that a PLM system could fully support Configuration Management. However, CM is more than change management, release management and revision management.

As the diagram from Martijn Dullaart shows, PLM is one facet of configuration management.

Of course, there are also (a few) separate CM tools focusing on the configuration management process. CMstat’s EPOCH CM tool is an example of such software. In addition, on their website, you can find excellent articles explaining the history and their future thoughts related to CM.

The future will undoubtedly be a connected, model-based, software-driven environment. Naturally, therefore, configuration management processes will have to change. (An impressive buzzword sentence – still, I hope you get the message.)

From coordinated to connected has a severe impact on CM. Let’s have a look at the issues.

Configuration Management – the future

The transition to a data-driven and model-based infrastructure has raised the following questions:

  • How to deal with the granularity of data – each dataset needs to be validated. For example, a document (a collection of datasets) needs to be validated in the document-based approach. How to do this efficiently?
  • The behavior of a product (or system) will depend more and more on software. Product CM practices have been designed for the hardware domain; now, we need a mix of hardware and software CM practices.
  • Due to the increased complexity of products (or systems) and the rapid changes caused by software versions, how do we guarantee that the As-Operated product still matches the As-Designed / As-Certified definitions?

I don’t have answers to these questions. I only share observations and trends I see in my actual world.

Granularity of data

The concept of datasets has been discussed in my post (part 6). Now it is about how to manage the right sets of connected data.

The image on the left, borrowed from Erik Herzog’s presentation at the PLM Roadmap & PDT Fall conference in 2020, is a good illustration of the challenge.

At that time, Erik suggested that OSLC could be the enabler of a digital CM backbone for an enterprise. Therefore, it was a pleasure to see Erik providing an update at the yearly OSLC Fest conference this week.

You can find the agenda and Erik’s presentation here on day 2.

OSLC as a framework seems to be a good candidate for supporting modern CM scenarios. It allows a company to build full traceability between all relevant artifacts (if digitally available). I can see the beauty of the technical infrastructure.

Still, it is about people and processes first. Therefore, I am curious to learn from my readers who believe in and experiment with such a federated infrastructure.

More software

Traditionally working companies might believe that software should be treated as part of the Bill of Materials. In this theory, you treat software code as a part, with a part number and revision. In this way, you might believe configuration management practices do not have to change. However, there are some fundamental reasons why we should decouple hardware and software.

First, for the same hardware solution, there might be a whole collection of valid software codes. Just like your computer. How many valid software codes, even from the same application, can you run on this hardware? Managing a computer system and its software through a Bill of Materials is unimaginable.

A computer, of course, is designed for running all kinds of software versions. However, modern products in the field, like cars, machines, electrical devices, all will have a similar type of software-driven flexibility.

For that reason, I believe that companies that deliver software-driven products should design a mechanism to check whether a combination of hardware and software is valid. For a computer system, a software mismatch might not be costly or painful; for an industrial system, it might be crucial to ensure invalid combinations cannot exist. Click on the image to learn more.
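
As an illustration only, here is a minimal sketch in Python of such a validity check against a maintained compatibility matrix; the product and firmware identifiers are hypothetical:

```python
# Hypothetical compatibility matrix: which firmware releases have been
# validated for which hardware revision of a product.
VALID_COMBINATIONS = {
    ("CTRL-UNIT-A", "rev.B"): {"fw 2.1", "fw 2.2", "fw 3.0"},
    ("CTRL-UNIT-A", "rev.C"): {"fw 3.0", "fw 3.1"},
}

def is_valid_configuration(hardware: str, revision: str, firmware: str) -> bool:
    """Return True only if this hardware/software combination has been validated."""
    return firmware in VALID_COMBINATIONS.get((hardware, revision), set())

# An upgrade service would refuse combinations that were never validated:
assert is_valid_configuration("CTRL-UNIT-A", "rev.B", "fw 2.2")
assert not is_valid_configuration("CTRL-UNIT-A", "rev.C", "fw 2.2")
```

The real challenge is, of course, maintaining such a matrix as a managed configuration item; the check itself is trivial.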

Tools like Configit or pure::variants might lead to a solution. In February 2021, I discussed the unique features of their solution with Henrik Hulgaard, the CTO of Configit, in PLM and Configuration Lifecycle Management.

I hope to have a similar post shortly with Pure Systems to understand their added value to configuration management.

Software change management is entirely different from hardware change management. The challenge is to have two different change management approaches under one consistent umbrella without creating needless overhead.

Increased complexity – the digital twin?

With the increased complexity of products and many potential variants of a solution, how can you validate a configuration? Perhaps we should investigate the digital twin concept, with a twin for each instance we want to validate.

Having a complete virtual representation of a product, including the possibility to validate the software behavior on the virtual product, would allow you to run (automated) validation tests to certify and later understand a product in the field.

No need for on-site inspection or test-and-fix upgrades in the physical world. Needed for space systems for sure, but why not for every system in the long term? When we are able to define and maintain a virtual twin of our physical product (on-demand), we can validate it virtually.

I learned about this concept at the 2020 Digital Twin conference in the Netherlands. Bart Theelen from Canon Production Printing explained that they could feed their simulation models with actual customer data to simulate and analyze the physical situation. In some cases, it is even impossible to observe the physical behavior. By tuning the virtual environment, you might understand what happens in the physical world.

An eye-opener and an advocate for the model-based approach. Therefore, I am looking forward to the upcoming PLM Roadmap & PDT Fall conference. Hopefully, Martijn Dullaart will share his thoughts on combining CM and working in a model-based environment. See you there?

Conclusion

Finally, we have reached the methodology part of this series, particularly the part related to configuration management and traceability in a very granular, digital environment.

After the PLM Roadmap & PDT fall conference, I plan to follow up with three thought leaders on this topic: Martijn Dullaart (ASML), Maxime Gravel (Moog) and Lisa Fenwick (CMstat).  What would you ask them?

My previous post introducing the concept of connected platforms created some positive feedback and some interesting questions. For example, the question from Maxime Gravel:

Thank you, Jos, for the great blog. Where do you see Change Management tool fit in this new Platform ecosystem?

is one of the questions I try to understand too. You can see my short comment in the comments here. However, while discussing with other experts in the CM domain, we should paint the path forward, because if we cannot solve this type of question, the value of connected platforms will be disputable.

It is essential to realize that a digital transformation in the PLM domain is challenging. No company or vendor has the perfect blueprint available to provide an end-to-end answer for a connected enterprise. In addition, I assume it will take 10 – 20 years till we will be familiar with the concepts.

It took a generation to move from drawings to 3D CAD. It will take another generation to move from a document-driven, linear process to data-driven, real-time collaboration in an iterative manner. Perhaps we can move faster, as the Automotive, Aerospace & Defense, and Industrial Equipment industries are not the most innovative industries at this time. Other industries or startups might lead us faster into the future.

Although I prefer discussing methodology, I believe I first need to clarify some more technical points before moving forward. My apologies for writing it in such a simple manner, but this information should be accessible to the majority of readers.

What does data-driven mean?

I often mention a data-driven environment, but what do I mean precisely by that? For me, a data-driven environment means that all information is stored in datasets, each containing a single aspect of information in a standardized manner, so it becomes accessible to outside tools.

A document is not a dataset, as it often includes a collection of datasets. Most of the time, the information it exposes is not standardized in such a manner that a tool can read and interpret the exact content. We will see that a dataset needs an identifier, a classification, and a status:

  • An identifier, to be able to create connections between datasets – traceability or, in modern words, a digital thread.
  • A classification, as the classification identifier will determine the type of information the dataset contains and potentially a set of mandatory attributes.
  • A status, to understand if the dataset is stable or still in work.
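
To make this concrete, here is a minimal sketch in Python of what such a dataset could look like; the names and attributes are illustrative only, not a standard:

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    identifier: str        # unique ID, the anchor for digital threads
    classification: str    # determines the meaning and mandatory attributes
    status: str            # e.g. "in work", "released", "obsolete"
    attributes: dict = field(default_factory=dict)

# Two datasets, each holding a single aspect of information:
part = Dataset("PART-100.A", "mechanical part", "released", {"material": "AlMg3"})
req = Dataset("REQ-001", "performance requirement", "in work",
              {"text": "Max. operating temperature 85 °C"})
```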

Examples of a data-driven approach – the item

The most common dataset in the PLM world is probably the item (or part) in a Bill of Material. The identifier is the item number (ID + revision if revisions are used). Next, the classification will tell you the type of part it is.

Part classification can be a topic on its own, and every industry has its taxonomy.

Finally, the status is used to identify if the dataset is shareable in the context of other information (released, in work, obsolete), allowing tools to expose only relevant information.

In a data-driven manner, a part can occur in several Bills of Materials – an example of a single definition consumed in other places.

When the part information changes, the accountable person has to analyze the relations to the part, which is easy in a data-driven environment. It is normal to find this functionality in a PDM or ERP system.
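
As a simplified sketch of why this analysis is easy in a data-driven environment, consider a basic “where-used” lookup (the item numbers are hypothetical):

```python
# Single part definitions consumed in several Bills of Materials.
boms = {
    "ASSY-200": ["PART-100", "PART-101"],
    "ASSY-300": ["PART-100", "PART-102"],
}

def where_used(part_id: str) -> list[str]:
    """Return every assembly that consumes the given part."""
    return [assembly for assembly, parts in boms.items() if part_id in parts]

# When PART-100 changes, the impacted assemblies are one query away:
print(where_used("PART-100"))  # ['ASSY-200', 'ASSY-300']
```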

If the part changes in a document-driven environment, the effort is much higher.

First, all documents where this part occurs need to be identified. Then the impact of the change needs to be managed in document versions, which will lead to other related changes if you want to keep the information correct.

Examples of a data-driven approach – the requirement

Another example illustrating the benefits of a data-driven approach is requirements management, where requirements become individual datasets. A product specification can often contain hundreds of requirements, addressing the needs of different stakeholders.

In addition, several combinations of requirements need to be handled by other disciplines, mechanical, electrical, software, quality and legal, for example.

As requirements need to be analyzed and ranked, a specification document would never be frozen. Trade-off analysis might lead to dropping or changing a single requirement. It is almost impossible to manage this all in a document, although many companies use Excel. The disadvantages of Excel are known, in particular in a dynamic environment.

The advantage of managing requirements as datasets is that they can be grouped. So, for example, they can be pushed to a supplier (as a specification).

Or requirements could be linked to test criteria and test cases, without the need to manage documents and to make sure you work with the last updated document.

As you can see, requirements also need to have an identifier (to manage digital relations), a classification (to allow grouping) and a status (in work / released / dropped).
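
A minimal sketch in Python, with hypothetical identifiers, of how individual requirement datasets can be grouped and linked on demand, without freezing a document:

```python
# Requirements as individual datasets: identifier, classification, status.
requirements = {
    "REQ-001": {"classification": "mechanical", "status": "released"},
    "REQ-002": {"classification": "electrical", "status": "in work"},
    "REQ-003": {"classification": "mechanical", "status": "dropped"},
}

# Digital relations to test cases - no document versions to keep in sync.
verified_by = {"REQ-001": ["TEST-042"], "REQ-002": ["TEST-043", "TEST-044"]}

# Group on demand, e.g. all released mechanical requirements for a supplier spec:
supplier_spec = [rid for rid, req in requirements.items()
                 if req["classification"] == "mechanical" and req["status"] == "released"]
print(supplier_spec)  # ['REQ-001']
```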

Data-driven and Models – the 3D CAD model

3D PDF Model

When I launched my series related to the model-based approach in 2018, the first comments I got came from people who believed that model-based equals the usage of 3D CAD models – see Model-based – the confusion. 3D Models are indeed an essential part of a model-based infrastructure, as the 3D model provides an unambiguous definition of the physical product. Just look at how most vendors depict the aspects of a virtual product using 3D (wireframe) models.

Although we use a 3D representation at each product lifecycle stage, most companies do not have a digital continuity for the 3D representation. Design models are often too heavy for visualization and field services support. The connection between engineering and manufacturing is usually based on drawings instead of annotated models.

I wrote about modern PLM and Model-Based Definition, supported by Jennifer Herron from Action Engineering – read the post PLM and Model-Based Definition here.

If your company wants to master a data-driven approach, this is one of the most accessible learning areas. You will discover that connecting engineering and manufacturing requires new technology, new ways of working and much more coordination between stakeholders.

Implementing Model-Based Definition is not an easy process. However, it is probably one of the best steps to get your digital transformation moving. The benefits of connected information between engineering and manufacturing have been discussed in the blog post PLM and Model-Based Definition.

It is essential to realize that all these exciting capabilities linked to Industry 4.0 require a data-driven, model-based connection between engineering and manufacturing.

If this is not the case, the projected game-changers will not occur as they become too costly.

Data-driven and mathematical models

To manage complexity, we have learned that we have to describe behavior in models to make logical decisions. This can be done in an abstract model, purely based on mathematical equations and relations. Look, for example, at climate models, weather models or COVID infection models.

We see that they all lead to discussions from so-called experts who believe a model should be 100% correct and that any exception shows the model is wrong.

It is not that the model is wrong; the expectations are false.

For less complex systems and products, we also use models in the engineering domain. For example, logical models and behavior models are all descriptive models that allow people to analyze the behavior of a product.

For example, how software code impacts the product’s behavior. Usually, we speak about systems when software is involved, as the software will interact with the outside world.

There can be many models related to a product, and if you want to get an impression, look at this page from the SEBoK wiki: Types of Models. The current challenge is to keep the relations between these models by sharing parameters.

These sharable parameters, again, should be datasets in a data-driven environment. Using standardized diagrams, like SysML or UML, enables the objects used in the diagrams to become datasets.

I will not dive further into the modeling details as I want to remain at a high level.

It is essential to realize that digital models should connect to a data-driven infrastructure by sharing relevant datasets.

What does data-driven imply?

 

I want to conclude this time with some statements to elaborate on further in upcoming posts and discussions:

  1. Data-driven does not imply there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it will be about managing connected datasets in a federated manner. It is no longer about owning the data; it is about access to reliable data (see the sketch after this list).
  2. Data-driven does not mean we do not need any documents anymore (read: electronic files acting as documents). Most likely, document sets will still be the interface to non-connected entities, such as suppliers and regulatory bodies. These document sets can be considered a configuration baseline.
  3. Data-driven means that we need to manage data in a much more granular manner. We have to look differently at data ownership. It becomes more about data accountability per role, as the data can be used and consumed throughout the product lifecycle.
  4. Data-driven means that you need to have an enterprise architecture, data governance and a master data management (MDM) approach. So far, the traditional PLM vendors have not been active in the MDM domain as they believe their proprietary data model is leading. Read also this interesting McKinsey article: How enterprise architects need to evolve to survive in a digital world
  5. A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as they cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.
  6. I don’t believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might again be a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora’s Box?
  7. Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them into a single framework. Unfortunately, this cannot be the BOM framework due to the dynamics of software changes. An interesting starting point for discussion can be found here: Configuration management of industrial products in PDM/PLM.
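
To illustrate point 1, here is a minimal sketch (Python, with hypothetical system names and URIs) of managing connected datasets in a federated manner: the data stays in its owning system, and consumers resolve references instead of holding copies.

```python
# Each system remains the home of its own datasets; nothing is duplicated.
FEDERATION = {
    "plm": {"EBOM-123": {"part": "housing", "revision": "C"}},
    "erp": {"MBOM-456": {"part": "housing", "plant": "NL-01"}},
}

def get_dataset(uri):
    """Resolve a reference like 'plm://EBOM-123' in the owning system."""
    system, dataset_id = uri.split("://")
    return FEDERATION[system][dataset_id]

# A digital thread is then a set of links across systems, not a single database.
thread = ["plm://EBOM-123", "erp://MBOM-456"]
for uri in thread:
    print(uri, "->", get_dataset(uri))
```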


Conclusion

Again, a long post, slowly moving into the future with many questions and points to discuss. Each of the seven points above could be a topic for another blog post, a further discussion and debate.

After my summer holiday break in August, I will follow up. I hope you will join me in this journey by commenting and contributing with your experiences and knowledge.

One of my favorite conferences is the PLM Road Map & PDT conference. Probably because in the pre-COVID days, it was the best PLM conference to network with peers focusing on PLM practices, standards, and sustainability topics. Now the conference is virtual, and hopefully, after the pandemic, we will meet again in the conference space to elaborate on our experiences further.

Last year’s fall conference was special because we had three days filled with a generic PLM update and several A&D (Aerospace & Defense) working group updates, reporting their progress and findings. There were sessions related to the Multiview BOM research, Global Collaboration, and several aspects of Model-Based practices: Model-Based Definition, Model-Based Engineering & Model-Based Systems Engineering.

All topics that I will elaborate on soon. You can refresh your memory through these two links:

This year, it was a two-day conference with approximately 200 attendees discussing how emerging technologies can disrupt the current PLM landscape and reshape the PLM Value Equation. During the first day of the conference, we focused on technology.

On the second day, we also looked at the impact new technology has on people and organizations.

Today’s Emerging Trends & Disrupters

Peter Bilello, CIMdata’s President & CEO, kicked off the conference by providing CIMdata’s observations of the market. He described an increasing number of technology capabilities, like cloud, additive manufacturing, platforms, digital thread, and digital twin, all with the potential of realizing a connected vision. Meanwhile, companies evolve at their own pace, illustrating that the gap between leaders and followers becomes bigger and bigger.

Where is your company? Can you afford to be a follower? Is your PLM ready for the future? Probably not, Peter states.

Next, Peter walked us through some technology trends and their applicability for a future PLM, like topological data analytics (TDA), the Graph Database, Low-Code/No-Code platforms, Additive Manufacturing, DevOps, and Agile ways of working during product development. All capabilities should be related to new ways of working and updated individual skills.

I fully agreed with Peter’s final slide – we have to actively rethink and reshape PLM – not by calling it something different but by learning, experimenting, and discussing in the field.

Digital Transformation Supporting Army Modernization

An interesting viewpoint related to modern PLM came from Dr. Raj Iyer, Chief Information Officer for IT Reform of the US Army. Raj walked us through some of the US Army’s challenges, and he gave us some fantastic statements to think about. Although an army cannot be compared with a commercial business, its target remains to always be ahead of the competition and aware of the competition.

Where we would say “data is the new oil”, Raj Iyer said: “Data is the ammunition of the future fight – as fights will more and more take place in cyberspace.”

The US Army is using a lot of modern technology – as the image below shows. The big difference here with regular businesses is that it is not about ROI but about winning fights.

Also, for the US Army, the cloud becomes the platform of the future. Due to the wide range of assets the US Army has to manage, the importance of product data standards is evident. Raj mentioned their contribution and adherence to the ISO 10303 (STEP) standard as crucial for interoperability. It was an exciting insight into the US Army’s current and future challenges. Their primary mission remains to stay ahead of the competition.

Joining up Engineering Data without losing the M in PLM

Nigel Shaw’s (Eurostep) presentation was somewhat philosophical but precisely to the point regarding the current dilemma in the PLM domain. Through an analogy with the internet, explaining that we live in a world of HTTP(S) linking, he showed that we create new ways of connecting information. The link becomes an essential artifact in our information model.

While it is apparent that links are crucial for managing engineering data, Nigel pointed out some of the significant challenges of this approach, as you can see from his (compiled) image below.
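
To give Nigel’s point some shape: below is a minimal sketch (Python, all attributes hypothetical) of a link treated as an artifact in its own right. As soon as the link carries meaning, it needs identity, a type and a lifecycle of its own, which is exactly where the management challenges begin.

```python
from dataclasses import dataclass

@dataclass
class Link:
    link_id: str     # the link itself has an identity
    source: str      # URI of the source dataset
    target: str      # URI of the target dataset
    link_type: str   # the semantics, e.g. "satisfies" or "derived_from"
    valid_from: str  # links evolve over time and need versioning too

req_to_model = Link("LNK-0007", "req://REQ-42", "cad://MDL-001/B",
                    "satisfies", "2021-05-01")
print(f"{req_to_model.source} -[{req_to_model.link_type}]-> {req_to_model.target}")
```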

I will not discuss this topic further here as I am planning to come back to this topic when explaining the challenges of the future of PLM.

As Nigel said, they are debating with one of their customers whether to replace the existing PLM tools or enhance them. The challenge of moving from coordinated information towards connected data is a topic that we as a community should study.

Integration is about more than Model Format.

This was the presentation I had been waiting for. Mark Williams from Boeing had built the story together with Adrian Burton from Airbus. Nigel Shaw, in the previous session, had already pointed to the challenge of managing linked information. Mark elaborated further on the model-based approach for system definition.

All content was related to the understanding that we need a model-based information infrastructure for the future because storing information in documents (the coordinated approach) is no longer viable for complex systems. Mark’s slide below says it all.

Mark stressed the importance of managing model information in context, which has become a challenge.

Mark mentioned that 20 years ago, IDC (International Data Corporation) measured Boeing’s performance and estimated that each employee spent 2 ½ hours per day searching for information. In 2018, IDC estimated that this number had grown to 30 % of the employee’s time and could go up to 50 % when adding the effort of reusing and duplicating data.

The consequence would be that a full-service enterprise, having engineering, manufacturing and services connected, probably loses 70 % of its information because it cannot be found. An impressive number, asking for “clever” ways to find the correct information in context.

It is not just about a fully indexed search of the data, as some technology geeks might think. It is also about describing and standardizing the metadata that describes the models. In that context, Mark walked through a list of existing standards, all with their pros and cons, ending up with the recommendation to use the ISO 10303-243 (MoSSEC) standard.

MoSSEC stands for Modelling and Simulation information in a collaborative Systems Engineering Context, a standard to manage and connect the relationships between models.
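
To illustrate the idea (explicitly not the actual MoSSEC schema, just the kind of metadata it addresses, with all field names hypothetical): once models carry standardized metadata, finding the right model in context becomes a metadata query instead of a full-text search.

```python
# Illustration only - NOT the MoSSEC schema itself, just the kind of metadata
# that allows a model to be found and connected in context.
model_metadata = {
    "model_id": "SIM-THERM-009",
    "description": "Thermal simulation of the housing",
    "discipline": "simulation",
    "used_in_context": "cooling_subsystem design review",
    "inputs": ["plm://EBOM-123", "param://max_operating_temp"],
    "results_for": "req://REQ-42",
}

def find_models(metadata_store, context):
    # Retrieval by described context, not by scanning file contents.
    return [m["model_id"] for m in metadata_store if context in m["used_in_context"]]

print(find_models([model_metadata], "cooling_subsystem"))
```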

MoSSEC and its implication for future digital enterprises are interesting, considering the importance of a model-based future. I am curious how PLM Vendors and tools will support and enable the standard for future interoperability and collaboration.

Additive Manufacturing
– not as simple as paper printing – yet

Andreas Graichen from Siemens Energy closed the day, coming back to the topic of new technologies: Additive Manufacturing or, in common language, 3D Printing. Andreas shared their Additive Manufacturing experiences, matching the famous Gartner Hype Cycle. His image shows that real work needs to be done to understand the technology and its use cases after the first excitement of the hype is over.

Material knowledge was one of the important topics to study when applying additive manufacturing. Understanding material behavior and properties in an Additive Manufacturing process is probably a new area for most companies.

The ultimate goal for Siemens Energy is to reach an “autonomous” workshop anywhere in the world where gas turbines could order their spare parts by themselves through digital warehouses. It is a grand vision, and Andreas confirmed that the scalability of Additive Manufacturing is still a challenge.

For rapid prototyping or small series of spare parts, Additive Manufacturing might be the right solution. The success of your Additive Manufacturing process depends a lot on whether your company’s management has realistic expectations and the budget available to explore this direction.

Conclusion

Day 1 was enjoyable and educational, starting and ending with a focus on disruptive technologies. The middle part, related to the data management concepts needed for a digital enterprise, contained, in my opinion, the most exciting topics to follow up on.

Next week, I will follow up with a review of day 2 and share my conclusions. The PLM Road Map & PDT Spring 2021 conference confirmed that there is work to do to understand the future (of PLM).


Last Friday, I discussed with several members of the PLM Global Green Alliance the book “How to Avoid a Climate Disaster”, written by Bill Gates. I was happy to moderate the session between Klaus Brettschneider, Rich McFall, Lionel Grealou, Ilan Madjar and Patrick Hillberg. From their LinkedIn profiles, you can see we are all active in the domain of PLM. And they had all read the book before the discussion.

I think the book addresses climate change in a tangible manner. Bill Gates brings structure to addressing climate change and encourages you to be active: what you can do as an individual, as a citizen. My only comment on the book would be that, as a typical nerd, Bill Gates pays little attention to human behavior, to the emotions that might lead people to non-logical behavior.

When you browse through the book’s reviews, for example, on Goodreads, you see the extremes, ratings from 1 to 5. Some people believe that Bill Gates, due to his wealth and way of living, is not allowed to write this book. Others like the transparent and pragmatic approach to discussing the related themes in the book.

Our perspective

Klaus, Rich, Lio, Ilan and Patrick did not have extreme points of view – so don’t watch the recording if you are looking for extreme positions. They reviewed How to Avoid a Climate Disaster from their own perspectives and discussed how it could be relevant for PLM practitioners. It became a well-balanced dialogue. You can watch or listen to the recording following this link:

Book discussion: How to avoid a climate disaster written by Bill Gates

Note: we will consolidate all content on our PLMGreenAlliances website to ensure nothing is lost – feel free to comment/discuss further.

More on sustainability

If you want to learn more about all sorts of disruption, not only disruption caused by climate change, have a look at the upcoming conference this week: DISRUPTION—the PLM Professionals’ Exploration of Emerging Technologies that Will Reshape the PLM Value Equation.

My contribution will be on day 2, where I combine disruptive technology with the need to become really sustainable in our businesses.

It will be a call for action from our PLM community. In the coming nine years, we have to change our business, become sustainable and use the relevant new technologies. This requires systems thinking – will mankind be able to deal with so many different parameters?

Conclusion

Start the dialogue with us, the PLM Global Green Alliance, by watching and reading the content on the website. Or become an active member, participating in discussion sessions related to any topic relevant to our alliance. More to come at the end of May – will you join too?

Regularly, (young) individuals approach me looking for advice on how to start or boost their PLM career – one of the questions the PLM Doctor is IN could quickly answer.

Before going further on this topic, there is also the observation that many outspoken PLM experts are “old.” Meanwhile, all kinds of new disruptive technologies are coming up.

Can these old guys still follow and advise on all trends/hypes?

My consultant’s answer is: “Yes and No” or “It depends”.

The answer illustrates the typical nature of a consultant. It is almost impossible to give a binary answer; still, many of my clients are looking for binary answers. Generalizing further, you could claim: “Human beings like binary answers”, and then you understand what is happening now in the world.

The challenge for everyone in the PLM domain is to keep an open mindset and avoid becoming binary. Staying non-binary means spending time digesting what you see, what you read or what you hear. Always ask yourself the question: is it really that simple? Try to imagine how the content you read fits into the famous paradigm: People, Processes and Tools. You should consider all these aspects.

Learning by reading

I was positively surprised by Helena Gutierrez’s post on LinkedIn: The 8 Best PLM blogs to follow. First of all, by Helena’s endorsement, explaining the value of having non-academic PLM information available as a foundation for her learning in PLM.

And indeed, perhaps I should have written a book about PLM. However, it would be a book about the past. Currently, PLM is not stable; we are learning every day to use new technologies and new ways of working. For example, the impact and meaning of model-based enterprise.

However, the big positive surprise came from the number of likes within a few days, showing how valuable this information is for many others on their PLM journey. I am aware there are more great blogs out in the field, sometimes with the disadvantage that they are not in English and therefore have a limited audience.

Readers of this post, look at the list of 8 PLM blogs and add your recommended blog(s) in the comments.

Learning by reading (non-binary) is a first step in becoming or staying up to date.

Learning by listening

General PLM conferences have been an excellent way to listen to other people’s experiences in the past. Depending on the type of conference, you would be able to narrow your learning scope.

This week, I started my preparation for the upcoming PLM Roadmap and PDT conference, where various speakers will provide their insights related to “disruption,” all in the context of disruptive technologies for PLM.

Good news: people and business aspects will also be part of the conference.

Click on the image for the agenda and registration

In my presentation, titled DISRUPTION – EXTINCTION or still EVOLUTION?, I will address all these aspects. We have entered a decisive decade in which to prove we can disrupt our old habits to save the planet for future generations.

It is challenging for a virtual event to be as interactive as a physical conference; it is mainly a conference to get inspired and guided in your thinking about new PLM technologies and potential disruption.

Learning by listening and storing the content in your brain is the second step in becoming or staying up to date.

Learning by discussing

One of the best learnings comes from having honest discussions with other people who all have different backgrounds. To be part of such a discussion, you need to have at least some basic knowledge about the topic. This avoids social media-like discussions where millions of “experts” have an opinion behind the keyboard. (The Dunning-Kruger effect)

There are two upcoming discussions I want to highlight here.

1. Book review: How to Avoid a Climate Disaster.

On Thursday, May 13th, I will moderate a PLM Global Green Alliance panel discussion on Zoom to discuss Bill Gates’ book “How to Avoid a Climate Disaster”. As you can imagine, Bill Gates is not known as a climate expert, more as a philanthropist and technology geek. However, the reviews are good.

What can we learn from the book as relevant for our PLM Global Green Alliance?

If you want to participate, read all the details on our PGGA website.

The PGGA core team members, Klaus Brettschneider, Lionel Grealou, Richard McFall, Ilan Madjar and Hannes Lindfred, have read the book.


2. The Modular Way Questions & Answers

In my post PLM and Modularity, I announced the option for readers of “The Modular Way” to ask the authors (Björn Eriksson & Daniel Strandhammar) questions or provide feedback on the book together with a small audience. This session is also planned to take place in May and will be scheduled based on the participants’ availability. At this moment, there are still a few open places. Therefore, if you have read the book and want to participate, send an email to tacit@planet.nl or info@brickstrategy.com.

Learning by discussing is the best way to enrich your skills, particularly if you have Active Listening skills – crucial for a good discussion.


Conclusion

No matter where you are in your career, in the world of PLM, learning never stops. Twenty years of experience has no value if you haven’t seen the impact of digitalization coming. Make sure you learn by reading, by listening and by discussing.

For a year now, we have been used to virtual events. PI PLMx 2020 in London was my last real event where I met people. Rereading my post about this event (the weekend after PI PLMx), I noticed I had written that it was not a technology festival. Many presentations were about business change and how to engage people in an organization.

The networking discussions during the event and evenings were the most valuable parts of the conference.

And then came COVID-19. ☹

Shortly after, in April 2020, I participated in the TECHNIA Innovation Forum, the first virtual conference with a setup like a real conference: a main stage with live sessions, virtual booths, and many prerecorded sessions related to various PLM topics.

You can read my experience related to the conference in two posts: the weekend after PLMIF and My four picks from PLMIF. A lot of content was available for 30 days. However, I was missing the social interaction, the people.

My favourite conference of 2020 was the CIMdata PLM Roadmap / PDT Fall 2020 conference in November. The PLM Roadmap / PDT conferences are not conferences for a novice audience; most of the time, you have to be skilled in the domain of PLM, and there is a strong presence from Aerospace and Defense companies.

The Fall 2020 theme: “Digital Thread—the PLM Professionals’ Path to Delivering Innovation, Efficiency, and Quality” might sound like a marketing term.

We hear the words Digital Thread and Digital Twin so many times. However, this conference featured speakers who are active practitioners from the field. I wrote about this conference in two posts: The weekend after PLM Roadmap / PDT 2020 – Part 1 and Part 2. I enjoyed the conference; however, I was missing the social interaction.

The Digital Twin

Beyond the marketing hype, there is still a lot to learn from and discuss with each other. First of all, it is not about realizing a digital twin for its own sake; a business need should be the driver to investigate the possibility of a digital twin.

I am preparing a longer blog post on this topic to share learnings from people in the field. For example, in November 2020, I participated in a Digital Twin Conference in the Netherlands, focusing on real-life cases.

Companies shared their visions and successes. It was clear that we are all learning to solve pieces of the big puzzle; there are small successes. Without the marketing language, this type of event becomes extremely helpful for further discussion and follow-up.

Recently, I enjoyed the panel discussions during the PI DX Spotlight session: Digital Twin-Driven Design. The PI DX Spotlight sessions are a collection of deep dives in various themes – have a look for the upcoming schedule here.

In the Digital Twin-Driven Design session, I enjoyed the session: What does a Digital Twin mean to your Business and Defining Requirements?

The discussion was moderated by Peter Bilello, with three interesting panellists from different industrial backgrounds (click on the image for the details). I have to re-watch some of the Spotlight sessions (the beauty of a virtual event) to see how they fit into the planned Digital Twin post.


The Cenit/Keonys Innovation day

On March 23rd (this Tuesday), Cenit & Keonys launch their virtual Innovation Day, another event that, before COVID-19, would have been a real people event. I am mentioning this event in particular, as I was allowed to interview fifteen of their customers about their day-to-day work, PLM-related plans, and activities.

All these interviews have been recorded and processed in such a manner that within 5 to 8 minutes, you get an understanding of what people are doing.

To prepare for these interviews, I spoke with each of the participants beforehand. I wanted to understand their passion for their work and where our interests overlap.

I will not mention the individual interviews in this post, as I do not want to spoil the event. I talked with various startups (do they need PLM?) and established companies that started a PLM journey. I spoke with simulation experts (the future) and dimensional management experts (listen to these interviews to understand what it means). And ultimately, I interviewed a traditional family porcelain brand using 3D printing and 3D design, and at the other end, the German CIO of the Year 2020.

(if you Google a little, you will easily find the companies involved here)

The most common topics discussed were:

  • What was the business value of your PLM-related activity?
  • Did COVID-19 impact your business?
  • What about a cloud-based solution, and how do people align?
  • If relevant, what are your experiences with a Model-Based Definition?
  • What about sustainability?

I hope you will take the opportunity to register and watch these interviews as, for me, they were an excellent opportunity to be in touch with the reality in the field. As always, we keep on learning.

The Modular Way

Talking about learning: this week, I finished the book The Modular Way, written by Björn Eriksson & Daniel Strandhammar. During the lockdown last year, Björn & Daniel, founders of Brick Strategy, decided to write down their experiences with mainly Scandinavian companies into a coherent framework to achieve modularization.

Modularity is a popular topic in many board meetings. How often have you heard: “We want to move from Engineering To Order to more Configure To Order”? Or another related incentive: “We need to be cleverer with our product offering and reduce the number of different parts”.

Next, the company buys a product that supports modularity, and management believes the work has been done. Of course not. Modularity requires a thoughtful strategy.

Illustration from the book: The Modular Way

The book can be a catalyst for companies that want to invest in modularity but do not know where and how to start. The book is not written academically. It is more a story taking you along the steps needed to define, implement, and maintain modularity. Every step is illustrated by actual cases, with their business motivation and, where possible, the achieved benefits. I plan to come back to Björn and Daniel in a dedicated post related to PLM and Modularity.
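
As a teaser for that post, here is a minimal sketch (Python, with invented module names) of the core idea: define modules against stable interfaces first, and a customer-specific variant then becomes a combination of modules rather than a new engineering project.

```python
# Modules are defined per slot against a stable interface; Configure To Order
# then means picking one module per slot, with the interfaces guaranteeing fit.
modules = {
    "motor_S":  {"slot": "drive",   "interface": "IF-DRIVE-1"},
    "motor_L":  {"slot": "drive",   "interface": "IF-DRIVE-1"},
    "panel_EU": {"slot": "control", "interface": "IF-CTRL-2"},
    "panel_US": {"slot": "control", "interface": "IF-CTRL-2"},
}

def configure(choices):
    """Build a customer-specific variant from one module per slot."""
    config = {}
    for name in choices:
        slot = modules[name]["slot"]
        assert slot not in config, "only one module per slot"
        config[slot] = name
    return config

print(configure(["motor_L", "panel_EU"]))  # {'drive': 'motor_L', 'control': 'panel_EU'}
```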

Conclusion

Virtual events are probably part of our new future. A significant advantage is the global reach of such events: everyone can join from anywhere in the connected world. Besides the larger events, I look forward to discovering more small and targeted discussion events like the PI DX Spotlights. The main challenge for all: keep it interactive and social.

Let us know your favourite virtual event!
