In March 2018, I started a series of blog posts related to model-based approaches. The first post was: Model-Based – an introduction. The reactions to this series of posts can be summarized in two bullets:
- Readers believed that the term model-based focused on the 3D CAD model. A logical association, as PLM is often associated with 3D CAD-model data management (actually PDM), and in many companies, the 3D CAD model is not (yet) a major information carrier.
- Readers were telling me that a model-based approach is too far from their day-to-day life. I have to agree here. I was active in some advanced projects where the product's behavior depends on a combination of hardware and software. However, most companies still work in a document-driven manner, with siloed disciplines merging all deliverables in a BOM.
More than three years later, I feel that model-based approaches have become more and more visible to companies. One of the primary reasons is that companies have started to collaborate in the cloud and realize the differences between a coordinated and a connected manner of working.
Initiatives such as Industry 4.0 and concepts like the Digital Twin demand a model-based approach. This post is a follow-up to my recent post, The Future of PLM.
History has shown that it is difficult for companies to change engineering concepts. So let’s first look back at how concepts slowly changed.
The age of paper drawings
In the sixties of the previous century, the drawing board was the primary "tool" to specify a mechanical product. The drawing on its own was often a masterpiece drawn on special paper, with perspectives, details, and cross-sections.
All these details were needed to transfer the part or assembly information to manufacturing. The drawing set had to contain all information, as there were no computers.
Making a prototype was not always easy, depending on the complexity of the product, the interpretation of the drawings, and the manufacturability of the design. After a first release, further modifications to the product definition were often marked on the manufacturing drawings using a red pencil. Terms like blueprint and redlining come from the age of paper drawings.
There are still people talking nostalgically about these days as creating and interpreting drawings was an important skill. However, the inefficiencies with this approach were significant.
- First, updating drawings to reflect the redlining done in manufacturing was often skipped; it was too much work.
- Second, drawing reuse was almost impossible; you had to start from scratch.
- Third, and most importantly, you needed to be very skilled in interpreting a drawing set, in particular when dealing with suppliers that might not have the same skillset or know which drawing version was the current one.
However, paper was and still is the cheapest neutral format to distribute designs. The last time I saw companies still working with paper drawings was at the end of the previous century.
Curious to learn if they are now extinct?
The age of electronic drawings (CAD)
With the introduction of AutoCAD and personal computers around 1982, more companies started to look into drafting with the computer. There was already the IBM drafting system in 1965, but it was Autodesk that pushed the 2D drafting business with their slogan:
“80 percent of the functionality for 20 percent of the price (Autodesk 1982)”
A little later, I started to work for an Autodesk distributor/reseller. People would come to the showroom to see how a computer drawing could be plotted in the finest quality. But, of course, the original draftsman did not like the computer, as the screen was too small.
However, the enormous value came from the ease of making changes, sharing drawings, and reusing them. The picture on the left is me in 1989, demonstrating AutoCAD with a custom-defined tablet and a PS/2 computer.
The introduction of electronic drawings was not a disruption, but rather an optimization of the previous ways of working.
The exchange with suppliers and manufacturing could still be based on plotted drawings – the most neutral format. And thanks to the filename, there was better control of versions between all stakeholders.
Aren’t we all happy?
The introduction of mainstream 3D CAD
In 1995, 3D CAD became available for the mid-market, thanks to SolidWorks, Solid Edge and, a little later, Inventor. Before that, working with 3D CAD was only possible for companies that could afford expensive graphic stations, provided by IBM, Silicon Graphics, DEC and SUN. Where are they nowadays? The PC is an example of disruptive innovation, purely based on technology. See Clayton Christensen's famous book: The Innovator's Dilemma.
The introduction of 3D CAD on PCs in the mid-market did not lead directly to new ways of working. Designing a product in 3D was much more efficient if you mastered the skills. 3D brought a better understanding of the product dimensions and shape, reducing the number of interpretation errors.
Still, (electronic) drawings were the contractual deliverable when interacting with suppliers and manufacturing. As students were more and more trained with the 3D CAD tools, the traditional art of the draftsman disappeared.
3D CAD introduced some new challenges to solve.
- First of all, a 3D CAD assembly in the system was a collection of separate files, subassemblies, parts, and drawings that related to each other with a specific version. How could you ensure the final assembly drawings were based on the correct part revisions? Companies solved this either by using intelligent filenames (with revisions) or by using a PDM system, where the PDM database managed all the relations and their status.
- The second point was that the 3D CAD assembly also introduced a new feature, the product structure, or the "Bill of Materials". This logical structure of the assembly build-up closely resembled the Bill of Materials of the product. You could even browse deeper levels, which was not the case with the traditional Bill of Materials on a drawing.
Note: The concept of EBOM and MBOM was not known in most companies. People were talking about the BOM as a one-level definition of parts or subassemblies in the assembly. See my Where is the MBOM? post from July 2008, when this topic was still under discussion.
- The third point, which would have a more significant impact later, is that parts and assemblies could be reused in other products. This introduced the complexity of configuration management. For example, a 3D CAD part or assembly file could contain several configurations, where only one configuration would be valid for the given product. Managing this in the 3D CAD system led to higher productivity for the designer; however, downstream, when it came to data management with PDM systems, it became a nightmare, as the sketch below illustrates.
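To make the problem concrete, here is a minimal sketch in Python, with hypothetical part names and fields rather than any actual PDM system's data model: one CAD file carries several configurations, and the product structure must reference exactly one configuration that actually exists in the referenced file revision. This is the kind of constraint the PDM database has to guard.

```python
# Minimal, hypothetical sketch of the CAD/PDM relation problem described above.
from dataclasses import dataclass, field

@dataclass
class CadFile:
    filename: str
    revision: str
    configurations: list[str]      # e.g., variants stored in a single CAD file

@dataclass
class BomLine:
    part_number: str
    cad_file: CadFile
    configuration: str             # the single configuration valid for this product
    children: list["BomLine"] = field(default_factory=list)

    def validate(self) -> None:
        # The PDM system guards what the file system cannot: that the
        # referenced configuration exists in that file revision.
        if self.configuration not in self.cad_file.configurations:
            raise ValueError(
                f"{self.part_number}: configuration '{self.configuration}' "
                f"not found in {self.cad_file.filename} rev {self.cad_file.revision}")
        for child in self.children:
            child.validate()

bracket = CadFile("bracket.sldprt", "B", ["short", "long"])
assembly = BomLine("ASM-001", CadFile("frame.sldasm", "A", ["default"]), "default",
                   children=[BomLine("PRT-100", bracket, "long")])
assembly.validate()   # passes; change "long" to "wide" and it raises an error
```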
I experienced these issues a lot when discussing with companies and implementers, mainly around the implementation of SmarTeam combined with SolidWorks and Inventor. Where should the configuration constraints be managed: in the PDM system or inside the 3D CAD system?
These environments were not friends (image above), and even when they came from the same vendor, it felt like negotiating between tribes.
The third point also covered another topic. So far, CAD had been the first step for the detailed design of a product. However, companies now had an existing Bill of Material in the system thanks to the PDM systems. It could be a Bill of Material of a sub-assembly that is used in many other products.
Configuring a product no longer started from CAD; it started from a product or Bill of Materials structure. Sales and engineers identified the changes needed on the BOM, keeping as much released information untouched as possible. This led to a new best practice.
The item-centric approach
Around 2005, five years after the introduction of the term Product Lifecycle Management, a new approach slowly became the standard. Product Lifecycle Management was initially introduced to connect engineering and manufacturing, driven by the automotive and aerospace industries.
It was with PLM that concepts such as EBOM and MBOM became visible.
In particular, the EBOM was closely linked to engineering practices, i.e., modularity and reuse. The EBOM and its related information represented the product as it was specified. It is essential to realize that the parts in the EBOM could be generically specified purchase parts, to be resolved when producing the product, or make-parts specified by drawings.
At that time, the EBOM was often used as the foundation for the ERP system – see the image above. The BOM was restructured and organized according to the manufacturing process, specifying the materials and resources needed in the ERP system. Therefore, although it was an item-like structure, this BOM (the MBOM) always had a close relation to the Bill of Process.
For companies with a single manufacturing site, the distinction between EBOM and MBOM was not that significant, as the ERP system would be the source of the MBOM. However, the complexity came when companies had several manufacturing sites. That was when a generic MBOM in the PLM system made more sense, centralizing all product information in a single system. The sketch below illustrates the restructuring principle.
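As an illustration, here is a minimal sketch, assuming a simple one-level pump structure with hypothetical part numbers: the same engineering definition (EBOM) is restructured into site-specific manufacturing views (MBOMs), including process items such as packaging that never appear in the EBOM.

```python
# Minimal, hypothetical sketch of deriving site-specific MBOMs from one EBOM.
ebom = {
    "PUMP-001": ["HOUSING-010", "ROTOR-020", "SEAL-KIT-030"],   # as designed
}

def derive_mbom(ebom: dict, site: str) -> dict:
    """Restructure the EBOM into a site-specific MBOM (illustrative only)."""
    mbom = {}
    for parent, children in ebom.items():
        if site == "A":
            # Site A buys the housing and seal kit as one pre-assembled item.
            mbom[parent] = ["SUBASM-HOUSING-SEAL", "ROTOR-020", "PACKAGING-900"]
            mbom["SUBASM-HOUSING-SEAL"] = ["HOUSING-010", "SEAL-KIT-030"]
        else:
            # Site B builds everything in-house from the same EBOM parts.
            mbom[parent] = children + ["PACKAGING-900"]
    return mbom

print(derive_mbom(ebom, "A"))   # same product, two manufacturing views
print(derive_mbom(ebom, "B"))
```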
The EBOM-MBOM approach has become more and more standard practice since 2010. As a result, even small and medium-sized enterprises realized a need to manage both the EBOM and the MBOM.
This EBOM-MBOM approach introduced two disadvantages.
- First, the EBOM and the MBOM as information structures require a lot of administrative maintenance if the information always needs to be correct (and that is the Configuration Management target). Some try to simplify this by keeping the EBOM part identical to the MBOM part, meaning the EBOM specification already targets a single supplier or manufacturer.
- Second, making every item in the BOM behave like a part creates inefficiencies in modern environments. Products are a mix of hardware (parts) and software (models/behavior). This BOM-centric view does not provide the proper infrastructure for a data-driven approach, as part specifications are still captured in drawings. To specify a product that contains both hardware and software, we need 3D annotated models related to all kinds of other behavior and physical models.
A new paradigm is needed to manage this mix efficiently and to provide the enabling foundation for Industry 4.0 and efficient Digital Twins: a model-based approach based on connected data elements.
More next week.
Conclusion
| Approach | Status |
| --- | --- |
| The age of paper drawings | 1960 – now dead |
| The age of electronic drawings | 1982 – potentially dead in 2030 |
| Mainstream 3D CAD | 1995 – evolving through MBD and MBSE into the future – not dead anytime soon |
| The item-centric approach | 2005 – evolving into a connected model-based approach – not dead anytime soon |
Last Friday, several members of the PLM Global Green Alliance and I discussed the book "How to Avoid a Climate Disaster" by Bill Gates. I was happy to moderate the session between Klaus Brettschneider, Rich McFall, Lionel Grealou, Ilan Madjar and Patrick Hillberg. From their LinkedIn profiles, you can see we are all active in the domain of PLM. All of them had read the book before the discussion.
I think the book addresses climate change in a tangible manner. Bill Gates brings structure to addressing climate change and encourages you to become active, both as an individual and as a citizen. My only comment on the book would be that, as a typical nerd, Bill Gates does not pay much attention to human behavior: people's emotions can lead to non-logical behavior.
When you browse through the book's reviews, for example on Goodreads, you see the extremes, with ratings from 1 to 5. Some people believe that Bill Gates, due to his wealth and way of living, is not entitled to write this book. Others like the transparent and pragmatic approach to discussing the related themes in the book.
Our perspective
Klaus, Rich, Lio, Ilan and Patrick did not have extreme points of view – so don’t watch the recording if you are looking for anxiety. They reviewed How to Avoid a Climate Disaster from their perspective and how it could be relevant for PLM practitioners. It became a well-balanced dialogue. You can watch or listen to the recording following this link:
Book discussion: How to avoid a climate disaster written by Bill Gates
Note: we will consolidate all content on our PLM Green Alliance website to ensure nothing is lost – feel free to comment/discuss further.
More on sustainability
If you want to learn more about all sorts of disruption, not only disruption caused by climate change, have a look at the upcoming conference this week: DISRUPTION—the PLM Professionals’ Exploration of Emerging Technologies that Will Reshape the PLM Value Equation.
My contribution will be on day 2, where I combine disruptive technology with the need to become really sustainable in our businesses.
It will be a call for action from our PLM community. In the coming nine years, we have to change our business, become sustainable and use the relevant new technologies. This requires systems thinking. Will mankind be able to deal with so many different parameters?
Conclusion
Start the dialogue with us, the PLM Global Green Alliance, by watching and reading content from the website. Or become an active member, participating in discussion sessions related to any relevant topic for our alliance. More to come at the end of May. Will you join us?
Last summer, I wrote a series of blog posts grouped by the theme “Learning from the past to understand the future”. These posts took you through the early days of drawings and numbering practices towards what we currently consider the best practice: PLM BOM-centric backbone for product lifecycle information.
You can find an overview and links to these posts on the page Learning from the past.
If you have read these posts, or if you have gone yourself through this journey, you will realize that all steps were more or less done evolutionarily. There were no disruptions. Affordable 3D CAD systems, new internet paradigms (interactive internet), global connectivity and mobile devices all introduced new capabilities for the mainstream. As described in these posts, the new capabilities sometimes created friction with old practices. Probably the most popular topics are the whole Form-Fit-Function interpretation and the discussion related to meaningful part numbers.
What is changing?
In the last five to ten years, a lot of new technology has come into our lives. The majority of these technologies are related to dealing with data. Digital transformation in the PLM domain means moving from a file-based/document-centric approach to a data-driven approach.
A Bill of Material on the drawing has become an Excel-like table in a PLM system. However, an Excel file is still used to represent a Bill of Material in companies that have not implemented PLM.
Another example, the specification document has become a collection of individual requirements in a system. Each requirement is a data object with its own status and content. The specification becomes a report combining all valid requirement objects.
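As a simple illustration, here is a minimal sketch with a hypothetical schema, not any particular PLM system's data model: each requirement is a data object carrying its own status, and "the specification" is no longer a document but a report generated from all currently valid requirement objects.

```python
# Minimal, hypothetical sketch of requirements as individual data objects.
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    DRAFT = "draft"
    APPROVED = "approved"
    OBSOLETE = "obsolete"

@dataclass
class Requirement:
    req_id: str
    text: str
    status: Status

requirements = [
    Requirement("REQ-001", "The pump shall deliver 30 l/min at 2 bar.", Status.APPROVED),
    Requirement("REQ-002", "The pump shall weigh less than 4 kg.", Status.DRAFT),
    Requirement("REQ-003", "The housing shall be cast iron.", Status.OBSOLETE),
]

def specification_report(reqs: list[Requirement]) -> str:
    """Combine all valid (approved) requirement objects into a report."""
    return "\n".join(f"{r.req_id}: {r.text}" for r in reqs
                     if r.status is Status.APPROVED)

print(specification_report(requirements))   # only REQ-001 appears
```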
Related to CAD, the 2D drawing is no longer the deliverable as a document; the 3D CAD model with its annotated views becomes the information carrier for engineering and manufacturing.
And most important of all, traditional PLM methodologies have been based on a mechanical design and release process. Meanwhile, modern products are systems where the majority of capabilities are defined by software. Software has an entirely different configuration and lifecycle approach conflicting with a mechanical approach, which is too rigid for software.
The last two aspects, from 2D drawings to 3D Models and Mechanical products towards Systems (hardware and software), require new data management methods. In this environment, we need to learn to manage simulation models, behavior models, physics models and 3D models as connected as possible.
I wrote about these changes three years ago: Model-Based – an introduction, which led to a lot of misunderstanding (too advanced – too hypothetical).
I plan to revisit these topics in the upcoming months again to see what has changed over the past three years.
What will I discuss in the upcoming weeks?
My first focus is on participating in and contributing to the upcoming PLM Roadmap & PDT Spring 2021 conference. Here, speakers will discuss the need for reshaping the PLM Value Equation due to new emerging technologies, a topic that fits perfectly into the Future of PLM series.
My contribution will focus on the fact that technology alone cannot disrupt the PLM domain. We also have to deal with legacy data and legacy ways of working.
Next, I will discuss with Jennifer Herron from Action Engineering the progress made in Model-Based Definition, which fits best practices for today – a better connection between engineering and manufacturing. We will also discuss why Model-Based Definition is a significant building block required for realizing the concepts of a digital enterprise, Industry 4.0 and digital twins.
Another post will focus on the difference between the digital thread and the digital thread. Yes, it looks like I am writing the same words twice. However, you will see that, depending on its interpretation, one definition hangs on to the past while the other targets the future. Here again, the differentiation is crucial if a maintainable Digital Twin is required.
Model-Based Systems Engineering (MBSE) in all its aspects needs to be discussed too. MBSE, seen as a discipline to design products, is crucial for defining complex products. Understanding data management related to MBSE will be the foundation for understanding data management in a Model-Based Enterprise. For example, how do we deal with configuration management in the future?
Writing Learning from the past was an easy job, as explaining with hindsight is so much easier when you have lived through it. I am curious and excited about the outcome of "The Future of PLM". Writing about the future means you have digested the information coming to you, knowing that nobody has a clear blueprint for the future of PLM.
There are people and organizations working on this topic more academically; for example, read this post from Lionel Grealou related to the place of PLM in the digital future. The challenge is that an academic future might be disrupted by unpredictable events, like COVID, or by disruptive technologies combined with an opportunity to succeed. Therefore, I believe it will be a learning journey for all of us, where we need to learn to give technology a business purpose. Business first, then technology.
No Conclusion
Normally I close my posts with a conclusion. At this moment, there is no conclusion, as the journey has just started. I look forward to debating and learning with practitioners in the field and working together on methodologies and concepts that work in a digital enterprise. Join me on this journey. I will start sharing my thoughts in the upcoming months.
For those living in the Northern Hemisphere: This week, we had the shortest day, or if you like the dark, the longest night. This period has always been a moment of reflection. What have we done this year?
Rob Ferrone (Quick Release), the Santa on the left (the leftist), and Jos Voskuil (TacIT), the Santa on the right (the rightist), share in a dialogue their highlights from 2020.
Wishing you all a great moment of reflection and a smooth path into a Corona-proof future.
It will be different; let’s make it better.
I am still digesting all the content of the latest PLM Roadmap / PDT Fall 2020 conference and the new reality that starts to appear due to COVID-19. There is one common theme:
The importance of a resilient and digital supply chain.
Most PLM implementations focus on aligning disciplines internally; the supply chain’s involvement has always been the next step. Perhaps now it is time to make it the first step? Let’s analyze.
No Time to Market improvement due to disconnected supply chains?
During the virtual fireside chat at the PLM Roadmap/PDT conference, Marc Halpern brought in an interesting point, just as a small bonus. You can read the full story here; the quote:
Marc mentioned a survey Gartner has done with companies in fast-moving industries related to the benefits of PLM. Companies reported improvements in the accuracy of product data and in product development. They did not see much of a reduced time to market or reduced product development costs. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here, lead times did not change, nor did the number of changes.
Of course, he spoke about fast-moving industries where the interaction was done in a disconnected manner. Gartner believes that the cloud will, for sure, start creating these benefits of a reduced time to market and cost of change once the supply chain is connected.
Therefore, I want to point again to an older McKinsey article, The Case for Digital Reinvention, published in February 2017. Here, the authors looked at the various areas of investment in digital technologies and their ROI. See the image on the left for the areas investigated and the percentage of companies that had invested in these areas at that time.
In the article, you will see the ROI analysis for these areas. For example, the marketing and distribution investments did not necessarily have a positive ROI when disconnected from other improvement areas. Digital supply chains were mentioned as the area with potentially the highest ROI. However, another important message in the article, valid for all these areas, is: you need a complete digitization strategy. This is a point I fail to see in many companies. Often one area gets all the attention; however, as it remains disconnected from the rest, the real efficiencies are not there. The McKinsey article ends with the conclusion that the digital winners at that time were the ones with bold strategies:
we found a mismatch between today’s digital investments and the dimensions in which digitization is most significantly affecting revenue and profit growth. We also confirmed that winners invest more and more broadly and boldly than other companies do
The “connected” supply chain
Of course, the traditional industries that invented PLM have invested in a kind of connected supply chain. However, is it really a connected supply chain? Aerospace and Defense companies had their supplier portals.
A supplier had to download their information or upload their designs combined with additional metadata.
These portals were completely bespoke and required "backbreaking" manual work on both sides to create, deliver, and validate the required exchange packages. The OEMs were driving the exchange process, and by this custom approach they more or less made it difficult for suppliers to have their own PLM environment. The downside of this approach was that a supplier had a separate environment for each OEM.
In 2006, I worked with SmarTeam on the concept of the "Supply Chain Express," an offering that allowed a supplier to have their own environment, using SmarTeam as a PDM/PLM system and the Supply Chain Express package to create intelligent import and export packages. The content was all based on files and configurable metadata based on the OEM-supplier relation.
Some other PLM-vendors or implementers have built similar exchange solutions to connect the world of the OEM and the supplier.
The main characteristic was that the exchange was file-based with custom metadata, often in an XML format or otherwise using Excel as the metadata carrier. The sketch below gives an impression of such a package.
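To make this tangible, here is a minimal sketch with hypothetical element names and fields, not from any actual Supply Chain Express schema: the deliverable files travel together with custom metadata serialized to XML, which the receiving side imports to recreate the relations in its own PDM/PLM environment.

```python
# Minimal, hypothetical sketch of a file-based exchange package with XML metadata.
import xml.etree.ElementTree as ET

package = {
    "oem": "OEM-A",
    "supplier": "Supplier-X",
    "files": [
        {"name": "bracket.stp", "part": "PRT-100", "revision": "B"},
        {"name": "bracket.pdf", "part": "PRT-100", "revision": "B"},
    ],
}

root = ET.Element("ExchangePackage", oem=package["oem"], supplier=package["supplier"])
for f in package["files"]:
    ET.SubElement(root, "File", name=f["name"], part=f["part"], revision=f["revision"])

# The XML travels with the files; the receiver imports it to recreate
# the metadata and relations on their side.
print(ET.tostring(root, encoding="unicode"))
```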
In my terminology of Coordinated – Connected, this would be Coordinated and “old school.”
The “better connected” supply chain
As I mentioned in my previous post about the PLM Roadmap/PDT Fall conference, Katheryn Bell (Pratt & Whitney Canada) presented the progress of the A&D Global Collaboration workgroup. As part of the activities, they classified the collaboration between the OEM and the supplier in 3 levels, as you can see from the image:
This post mainly focuses on the L1 collaboration as this is probably the most used scenario.
In the Aerospace and Automotive industries, the data exchange between OEMs and suppliers has improved in two ways through Technical Data Packages whose content is supported by Model-Based Definition.
The first advantage of Model-Based Definition is mainly a consistent information package in which the model is leading. The manufacturing views are explicitly defined on the 3D model. Therefore, there is a reduced chance of a mismatch between the "drawings" and the 3D model.
Model-Based Definition still does not solve working with the latest (approved) version of the information. This remains a "human-based" process, and Katheryn Bell confirmed this was the biggest problem to solve.
The second advantage of using one of the interoperability standards for Model-Based Definition is the decoupling of application-specific data between the OEM side and the supplier side.
A significant advantage of Model-Based Definition is that there are a few interoperability standards, i.e., ISO 10303 (STEP), ISO 14306 (JT), and ISO 32000/ISO 14739 (PRC for 3D PDF). In the end, the ideal would be that these standards merge into one standard, completely vendor-independent, with a clearly defined scope and purpose.
These standards also increase the longevity of product data, as the information is stored in an application-independent format. As long as the standard does not change (fast), storing data in these neutral formats, even internally, can save upgrade or maintenance costs.
However, I think you all know the joke below.
The connected supply chain
The ultimate goal in the long term will be the connected supply chain, where information shared between an OEM and a supplier does not require human-based interfaces to ensure everyone works with the correct data.
The easiest way, and this is what some of the larger OEMs have done, is to consider suppliers as part of your PLM-infrastructure and give them access to all relevant data in the context of the system, the product, or the part they are responsible for. For the OEM, the challenge will be to connect suppliers – to motivate and train them to work in this environment.
For the supplier, the challenge is their IP management. If they work 100 percent in the OEM environment, everything is exposed. If they want to work in their own environment, there is probably double work and a disconnect.
Of course, everything depends on the complexity of your interaction with the supplier.
With its Fusion Cloud Product Lifecycle Management (PLM), Oracle was one of the first to shift the attention to the connected supply chain.
If you search for PLM on the Oracle website, you will find it under Fusion Supply Chain and Manufacturing. It is a logical step, as traditional ERP vendors have never provided a full, rich portfolio for product design. CAD integrations do not get a focus, and the future path to model-based approaches (MBSE/MBD/MBE) is not visible at all.
This is almost similar to what the Siemens-SAP alliance is showing. SAP more or less confirms that you should not rely on SAP PLM for more advanced PLM scenarios but on Siemens's offering.
For less complex but fast-moving products, for example in the apparel industry, the promise of connecting all suppliers in one environment lies in time to market and traceability. This industry does not suffer from products with a long lifecycle of upgrades and services.
So far, the best collaboration platform in the cloud I have seen is ShareAspace from Eurostep. Its foundation, based on the PLCS standard, allows an OEM and a supplier to connect through their "shared space"; you can look at their supply chain offering here.
In the various PDT conferences, we have seen how even two OEMs could work in a joint environment (Renault-Nissan-Daimler) and how BAE Systems used the ShareAspace environment to collaborate and consolidate all the data coming from the various system suppliers into one standards-based environment.
In 2021, I plan to write a series of blog posts related to possible add-on services for PLM. Supplier collaboration platforms, configuration management, end-to-end configurators and Product Information Management are some of the themes I am currently exploring.
Conclusion
COVID-19 has illustrated the volatility of supply chains. Changing suppliers and working with suppliers in the traditional ways still hinder reducing time to market. However, the promise of a truly connected supply chain is enormous. As Boeing demonstrated in my previous post and as explained in this post, standards are needed to become future-proof.
Will 2021 have more focus on the connected supply chain?
Last week I shared my first review of the PLM Roadmap / PDT Fall 2020 conference, organized by CIMdata and Eurostep. Having digested now most of the content in detail, I can state this was the best conference of 2020. In my first post, the topics I shared were mainly the consultant’s view of digital thread and digital twin concepts.
This time, I want to focus on the content presented by the various Aerospace & Defense working groups, who shared their findings and lessons learned (so far) on topics like the multi-view BOM, supply chain collaboration and MBSE data interoperability.
These sessions were nicely wrapped with presentations from Alberto Ferrari (Raytheon), discussing the digital thread between PLM and Simulation Lifecycle Management and Jeff Plant (Boeing) sharing their Model-Based Engineering strategy.
I believe these insights are crucial, although there might be people in the field who question whether this research is essential. Isn't there an easier way to achieve the same results?
Nicely formulated by Ilan Madjar as a comment to my first post:
Ilan makes a good point about simplifying the ideas for the masses to make them work. The majority of companies probably do not have the bandwidth to invest in and understand the future benefits of a digital thread or digital twins.
This does not mean that these topics should not be studied. If your business operates in a small, simple ecosystem and wants to work in a connected mode, you can choose a vendor and a few custom interfaces.
However, suppose you work in a global industry with an extensive network of partners, suppliers, and customers.
In that case, you cannot rely on ad-hoc interfaces or a single vendor. You need to invest in standards; you need to study common best practices to drive methodology, standards, and vendors to align.
This process of standardization is crucial if you want to have a sustainable, connected enterprise. In the end, the push from these companies will lead to standards, allowing the smaller companies to adhere or connect to them.
The future is about Connected through Standards, as discussed in part 1 and further in this post. Let’s go!
Global Collaboration – Defining a baseline for data exchange processes and standards
Katheryn Bell (Pratt & Whitney Canada) presented the progress of the A&D Global Collaboration workgroup. As you can see from the project timeline, they have reached the phase to look towards the future.
Katheryn mentioned the need to standardize terminology as the first point of attention. I am fully aligned with that point; without a standardized terminology framework, people will keep misunderstanding each other.
This happens even more in smaller businesses that sometimes just pick up (buzz)words without a full understanding.
Several years ago, I talked with a PLM implementer who told me that their implementation focus was on systems engineering. After some more explanation, it appeared they were actually attempting configuration management. The confusion was massive. Still, a standard, common terminology is crucial in our domain, even if it seems academic.
The group has been analyzing interoperability standards and standards for long-term archival and retrieval (LOTAR), but has also been studying the ISO 44001 standard related to collaborative business relationship management systems.
In the Q&A session, Katheryn explained that the biggest problem to solve with collaboration was the risk of working with the wrong version of data between disciplines and suppliers.
Of course, such errors can lead to huge costs if they are discovered late (or too late). As some of the big OEMs work with thousands of suppliers, you can imagine it is not an issue easily discovered in a more ad-hoc environment.
The move to a standardized Technical Data Package based on a Model-Based Definition is one of these initiatives in this domain to reduce these types of errors.
You can find the proceedings from the Global Collaboration working group here.
Connect, Trace, and Manage Lifecycle of Models, Simulation and Linked Data: Is That Easy?
I loved Alberto Ferrari's (Raytheon) presentation and how he described the value of a model-based digital thread, positioning it in a targeted enterprise.
Click on the image and discover how business objectives, processes and models go together supported by a federated infrastructure.
Alberto’s presentation was a kind of mind map from how I imagine the future, and it is a pity if you have not had the chance to see his session.
Alberto also focused on the importance of various simulation capabilities combined with simulation lifecycle management. For Alberto, they are essential to implementing digital twins. Besides focusing on standards, Alberto pleads for semantic integration and an open service architecture, stressing the importance of DevSecOps.
Enough food for thought; as Alberto mentioned, he presented the corporate vision, not the current state.
More A&D Action Groups
There were two more interesting specialized sessions where teams from the A&D action groups provided a status update.
Brandon Sapp (Boeing) and Ian Parent (Pratt & Whitney) shared the activities and progress on Minimum Model-Based Definition (MBD) for Type Design Certification.
As Brandon mentioned, MBD is already a widely used capability; however, MBD is still maturing and evolving. I believe that is also one of the reasons why MBD is not yet accepted in mainstream PLM. Smaller organizations will wait; however, can your company afford to wait?
More information about their progress can be found here.
Mark Williams (Boeing) reported the A&D Model-Based Systems Engineering action group's first findings related to MBSE data interoperability, focusing on an Architecture Model Exchange Solution. A topic interesting to follow, as the promise of MBSE is that it is about connected information shared in models. As Mark explained, data exchange standards for requirements and behavior models are mature, readily available in the tools, and easily adopted. Exchanging architecture models has proven to be very difficult. I will not dive into more details, respecting the audience of this blog.
For those interested in their progress, more information can be found here.
Model-Based Engineering @ Boeing
In this conference, the participation of Boeing was significant through the various action groups. As the cherry on the cake, there was Jeff Plant‘s session, giving an overview of what is happening at Boeing. Jeff is Boeing’s director of engineering practices, processes, and tools.
In his introduction, Jeff mentioned that Boeing has more than 160,000 employees in over 65 countries. They are working with more than 12,000 suppliers globally. These suppliers can be manufacturing, service or technology partnerships. Therefore, as you can imagine, and as discussed by others during the conference, streamlined collaboration and traceability are crucial.
The now-famous MBE Diamond symbol illustrates the model-based information flows in the virtual world and the physical world based on the systems engineering approach. Like Katheryn Bell did in her session related to Global Collaboration, Jeff started explaining the importance of a common language and taxonomy needed if you want to standardize processes.
Zoom in on the Boeing MBE taxonomy, and you will discover the clarity it brings to the company.
I was not aware of the ISO 23247 standard concerning the Digital Twin framework for manufacturing, aiming to apply industry standards to the model-based definition of products and process planning. Certainly a standard to follow, as it brings standardization on top of existing standards.
As Jeff noted: A practical standard for implementation in a company of any size. In my opinion, mandatory for a sustainable, connected infrastructure.
Jeff presented the slide below, showing their standardization internally around federated platforms.
This slide closely resembles the future platform vision I have been sharing since 2017 when discussing PLM's future at PLM conferences and explaining the differences between Coordinated and Connected; see also my presentation here on SlideShare.
You can zoom in on the picture to see the similarities. For me, the differences were interesting to observe. In Jeff’s diagram, the product lifecycle at the top indicates the platform of (central) interest during each lifecycle stage, suggesting a linear process again.
In reality, the flow of information through feedback loops will be there too.
The second exciting detail is that these federated architectures should be based on strong interoperability standards. Jeff is urging other companies, academics and vendors to invest and arrive at industry standards for Model-Based Systems Engineering practices. The time to act on this domain is now.
It reminded me again of Marc Halpern's message mentioned in my previous post (part 1): we should be worried about vendor alliances offering an integrated end-to-end data flow based on their solutions. This would lead to an immense vendor lock-in if these interfaces are not based on strong industry standards.
Therefore, don’t watch from the sideline; it is the voice (and effort) of the companies that can drive standards.
Finally, during the Q&A part, Jeff made an interesting point explaining Boeing is making a serious investment, as you can see from their participation in all the action groups. They have made the long-term business case.
The team is confident that the business case for such an investment is firm and stable; however, with such a long-term investment without direct results, these projects might come under pressure when the business is under pressure.
The virtual fireside chat
The conference ended with a virtual fireside chat, from which I picked up an interesting point that Marc Halpern brought in. Marc mentioned a survey Gartner has done with companies in fast-moving industries related to the benefits of PLM. Companies reported improvements in the accuracy of product data and in product development. They did not see much of a reduced time to market or cost reduction. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here, lead times did not change, nor did the number of changes.
Marc believes that this topic will really start showing benefits in the future with the cloud and connected suppliers. This reminded me of an article published by McKinsey called The Case for Digital Reinvention. In this article, the authors indicated that only 2% of the companies interviewed were investing in a digital supply chain. At the same time, the expected benefits in this area would have the most significant ROI.
The good news: there is consistency, and we know where to focus for early results.
Conclusion
It was a great conference, as here we could see digital transformation in action (groups). Where vendor solutions often provide a sneak preview of the future, we saw people working on creating the right foundations based on standards. My appreciation goes to all the active members of the CIMdata A&D action groups, as they provide the groundwork for all of us – sooner or later.
About a year ago, we started the PLM Global Green Alliance, further abbreviated as the PGGA. Rich McFall, the main driver behind the PGGA, started the website, The PLM Green Alliance, to have a persistent place to share information.
We also launched the PLM Global Green Alliance LinkedIn group to share our intentions and create a community of people who would like to share knowledge through information or discussion.
Our mission statement is:
The mission of the new PLM Green Alliance is to create global connection, communication, and community between professionals who use, develop, market, or support Product Lifecycle Management (PLM) related technologies and software solutions that have value in addressing the causes and consequences of climate change due to human-generated greenhouse gas emissions. We are motivated by the technological challenge to help create a more sustainable and green future for our economies, industries, communities, and all life forms on our planet that depend on healthy ecosystems.
My motivation
My personal motivation to support and join the PGGA was driven by the wish to combine my PLM world with the ambition to create a more sustainable society for everyone around the world. It is a challenging combination. For example, PLM was born in the Aerospace and Defense industries, probably not the most sustainable industries.
Having worked with some companies in the Apparel and Retail industry, I have seen that these industries care more about their carbon footprint. Perhaps because they are "volume industries" closely connected to their consumers, these industries actively build practices to reduce their carbon footprint and impact on societies. The sense or nonsense of recycling is one such topic to discuss and analyze.
At that time, I got inspired by a session during the PLM Roadmap / PDT 2019 conference.
Graham Aid's session from the Ragn-Sells group was a call to action. Sustainability and a wealthy economy can go together; however, we have to change our habits and thinking patterns. You can read my review of this session in this blog post: The weekend after PLM Roadmap / PDT 2019 – Day 1.
Many readers of this post have probably never heard of the Ragn-Sells group or followed up on a call for action. I have the same challenge: staying motivated beyond my day-to-day business (the old ways of working) and giving priority to exploring and learning more about applying sustainability in my PLM practices.
And then came COVID-19.
I think most of you have seen the image on the left, which started as a joke. However, looking back, we all have seen that COVID-19 has led to a tremendous push for using digital technologies to modernize existing businesses.
Personally, I was used to traveling to a customer every two to three weeks; now I have left my home office only twice for business. Meanwhile, I invested in better communication equipment and a place to work. And hey, it remains possible to work and communicate with people.
Onboarding and getting to know new people, however, takes more social interaction than a camera can bring.
In the PGGA LinkedIn community, we had people joining from all over the world. With some active members, we started to organize video meetings to discuss their expectations of and interest in this group.
We learned several things from these calls.
First of all, finding a single timeslot that everyone worldwide could participate in is a challenge. A late Friday afternoon is almost midnight in Asia and morning in the US. And whether Friday is the best day, we do not know yet.
Secondly, we realized that posts published in our LinkedIn group did not appear in everyone’s LinkedIn feed due to LinkedIn’s algorithms. For professionals, LinkedIn becomes less and less attractive as the algorithms seem to prefer frequency/spam above content.
For that reason, we are probably moving to the PLM Green Alliance website, combining this environment with a space for discussion outside the LinkedIn scope. More to come on the PGGA website.
Finally, we will organize video discussion sessions, asking the participants to prepare themselves for the discussion. Any member of the PGGA can bring in discussion topics.
It might be a topic you want to clarify or better understand.
What’s next
For December 4th, we have planned a discussion meeting related to the Exponential Roadmap 2019 report, in which 36 solutions to halve carbon emissions by 2030 are discussed. In our video discussion, we want to focus on the chapter: Digital Industries.
We believe that this topic comes closest to our PLM domain and hope that participants will share their thinking and potential activities within their companies.
You can download the Exponential Roadmap here or by clicking on the image. You will find more details about the PLM Global Green Alliance in the LinkedIn group. If you want to participate, let us know.
The PGGA website will be the place where more and more information will be collected per theme, helping you understand what is happening worldwide, and the place where you can contribute and let us know what is happening on your side.
Conclusion
The PLM Global Green Alliance now exists for a year, with 192 members. With approximately five percent active members, we have the motivation to grow our efforts and value. We learned from COVID-19 that there is a need to become proactive, as the costs of prevention are always lower than the costs of (trying to) fix afterward.
And each of us has the challenge to behave a little differently than before.
Will you be one of them?
In the last two weeks, three events were leading to this post.
First, I read John Stark's recent book Products2019. A must-read for anyone who wants to understand the full reach of product lifecycle related activities. See my recent post: Products2019, a must-read if you are new to PLM.
Afterwards, I talked with John, discussing the lack of knowledge and teaching of PLM, not to be confused with PLM capabilities and features.
Second, I participated in an exciting PI DX USA 2020 event. Some of the sessions and most of the roundtables provided insights to me and, hopefully, many other participants. You can get an impression in the post: The Weekend after PI DX 2020 USA.
A small disappointment in that event was the closing session with six vendors, as I wrote. I know it is evident that when you put a group of vendors in the arena, it will be about scoring points instead of finding alignment. Still, having criticism does not mean blaming, and I am always open to having a dialogue. For that reason, I am grateful for their sponsorship and contribution.
Oleg Shilovitsky cleverly mentioned that this statement is a contradiction.
“How can you accuse PLM vendors of having a limited view on PLM and thanking them for their contribution?”
I hope the above explanation says it all, combined with the fact that I grew up in a Dutch culture of not hiding friction, meanwhile being respectful to others.
We cannot simplify PLM to just a better tool or technology, or to 3D for everybody. So many more people and processes related to product lifecycle management are involved in this domain; they would belong in a real conference, but many of them will not sponsor events.
It is well illustrated in John Stark’s book. Many disciplines are involved in the product lifecycle. Therefore, if you only focus on what you can do with your tool, it will lead to an incomplete understanding.
If your tool is a hammer, you hope to see nails everywhere around you to demonstrate your value.
The third event was a LinkedIn post from John Stark – 16 groups needing Product Lifecycle Knowledge, which for me was a logical follow-up on the previous two events. I promised John to go through these 16 groups and provide my thoughts.
Please read his post first as I will not rewrite what has been said by John already.
CEOs and CTOs
John suggested that they should read his book, which might take more than eight hours. CEOs and CTOs, most of the time, do not read this type of detailed book, so it is probably mission impossible.
They want to keep up with the significant trends and need to think about future business (models).
New digital and technical capabilities allow companies to move from a linear, coordinated business towards a resilient, connected business. This requires exploring future business models and working methods by experimenting in real life, not in Proofs of Concept. Creating a learning culture and allowing experiments to fail is crucial, as you only learn by failing.
CDOs, CIOs and Digital Transformation Executives
They are the crucial people to help the business imagine what digital technologies can do. They should educate the board and the business teams about the power of having reliable, real-time data available for everyone connected. Instead of standardizing on systems and optimizing the silos, they should assist and lead in building a new infrastructure for connected services and end-to-end flows delivered on connected platforms.
These concepts won’t be realized soon. However, doing nothing is a big risk, as the traditional business will decline in a competitive environment. Time to act.
Departmental Managers
These are the people that should worry about their job in the long term. Their current mission might be to optimize their department within its own Profit & Loss budget. The future is about optimizing the information flow for the whole value chain, including suppliers and customers.
I wrote about it in "The Middle Management Dilemma." Departmental managers should become team leaders, inspiring and supporting their team members instead of controlling the numbers.
Product Managers
This is a crucial role for the future, assuming a product manager is not only responsible for the marketing or development side of the product but also gets responsibility for understanding what happens with the product during production and sales performance. Understanding the full lifecycle performance and cost should be their mission, supported by a digital infrastructure.
Product Developers
They should read the book Products2019 to be aware there is so much related to their work. From this understanding, a product developer should ask the question:
“What can I do better to serve my internal and external customers ?”
This question will not arise in a hierarchical organization where people are controlled by managers who have a mission to optimize their silo. Product developers should be trained and coached to operate in a broader context, which should be part of your company's mission. Too many people complain about usability in their authoring and data management systems without having a holistic understanding of why you need change processes and configuration management.
Product Lifecycle Management (PLM) deployers
Here I have a slight challenge, as this might be read as PLM-system users. However, it should be clear that we mean here people using product data at any moment along the product lifecycle, not necessarily in a single system.
This is again related to your company’s management culture. In the ideal world, people work with a purpose and get informed on how their contribution fits the company’s strategy and execution.
Unfortunately, in most hierarchical organizations, the strategy and total overview get lost, and people become measured resources.
New Hires and others
John continues with five other groups within the organization. I will not comment on them, as the answers are similar to the ones above – it is about organization and culture.
Educators and Students
This topic is very close to my heart, and one of the reasons I continue blogging about PLM practices. There is not enough attention to product development methodology or processes. Engineers can get many years of education in specific domains, like product design principles, available tools and technologies, performing physical and logical simulations.
Not so much time is spent on educating students in current best practices and business models for product lifecycle management.
Check in your country how many vendor-independent, methodology-oriented training courses you can find. Perhaps the only consistent organization I know is CIMdata, where the challenge is that they deliver training to companies after students have graduated. It would be great if education institutes would embed serious time for product lifecycle management topics in their curriculum. The challenge is, of course, the time and budget needed to create materials and, next, prioritizing this topic on the overall agenda.
I am happy to participate in a Specialized Master's education program aiming at Products and Buildings Digital Engineering Managers (INGENUM). This program, organized by Arts et Métiers in France, helps create the overview needed for understanding PLM and BIM, in the French language; before COVID-19, this was an on-site training course in Paris.
Hopefully, there are more institutes offering PLM education – feel free to add them in the comments of this post.
Consultants, Integrators and Software Company Employees
Of course, it would be nice if everyone in these groups understood the total flow and processes within an organization and how they relate to each other. Too often, I have seen experts in a specific domain, for example a 3D CAD system, having no clue about revisioning, the relation of CAD to the BOM, or the fundamentals of configuration management.
Consultants, integrators and software company employees have their own challenges, as their business model often relies on specialized skills they can sell to their clients, while broader, general knowledge comes from experience on the job.
And if you work full-time for three years on a single project, or perhaps on three projects, your broader knowledge does not grow fast. You might become the hammer that sees nails everywhere.
For that reason, I recommend everyone in my ecosystem to invest their personal time in reading related topics of interest. Read LinkedIn posts from others and learn to differentiate between marketing messages and people willing to share experiences. Don't waste your time on the marketing messages; react and participate in the other discussions. A "Like" is not enough. Ask questions or add your insights.
In the context of my personal learning, I mentioned that I participated in the Digital Twin conference in the Netherlands this week; unfortunately, due to the partial lockdown, it was mainly a virtual event.
I got several new insights that I will share with you soon. The event illustrated that Digital Twin as a buzzword might be hype; however, several participants showed examples of where they have applied or plan to apply Digital Twin concepts. A great touch of reality.
Another conference, the PLM Roadmap 2020 – PDT conference, will start next week. The theme, Digital Thread—the PLM Professionals' Path to Delivering Innovation, Efficiency, and Quality, is not a marketing theme, as you can learn from the agenda. Step by step, we are learning here from each other.
Conclusion
John Stark started with the question of who needs Product Lifecycle Knowledge. In general, knowledge is power, and it does not come for free, whether through consultancy, reading or training. Related to Product Lifecycle Management, everyone must understand the bigger picture: executives because they will need to steer the company in the right direction, and everyone else to streamline the company and enjoy working in a profitable environment where you contribute and can even inspire others.
An organization is like a human body; you cannot have individual cells or organs that optimize themselves only – we have a name for that disease. Want to learn more? Read this poem: Who should be the boss?
After the series about "Learning from the past," it is time to start looking toward the future. I learned from several discussions that I probably work most of the time with advanced companies. I believe this should motivate companies that lag behind to look into the future even more.
If you look into the future for your company, you need new or better business outcomes. That should be the driver for your company. A company does not need PLM or a Digital Twin. A company might want to reduce its time to market and improve collaboration between all stakeholders. These objectives can be realized by different ways of working and an IT infrastructure to allow these processes to become digital and connected.
That is the "game." Coming back to the future of PLM: we do not need a discussion about definitions; I leave this to the academics and vendors. We will see that the same applies to the concept of a Digital Twin.
My statement: the digital twin is not new. Everybody can have their own digital twin, depending on how they interpret the definition. Does this sound like the PLM definition?
The definition
I like to follow the Gartner definition:
A digital twin is a digital representation of a real-world entity or system. The implementation of a digital twin is an encapsulated software object or model that mirrors a unique physical object, process, organization, person, or other abstraction. Data from multiple digital twins can be aggregated for a composite view across a number of real-world entities, such as a power plant or a city, and their related processes.
As you see, not a narrow definition. Now we will look at the different types of interpretations.
Single-purpose siloed Digital Twins
- Simple – data only
One of the most straightforward applications of a digital twin is my Garmin Connect environment. My device registers performance parameters (speed, cadence, power, heartbeat, location) when cycling. Then, after every trip, I can analyze my performance. I can see changes in my overall performance and compare it with others in my category (weight, age, sex).
Based on that, I can decide if I want to improve my performance. My personal business goal is to maintain and improve my overall performance, knowing I cannot stop aging by upgrading my body.
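To make this “data only” flavor tangible, here is a minimal sketch in Python of such a twin: nothing more than recorded trip data plus a simple analysis. The ride values and the trend calculation are invented for illustration; Garmin’s actual platform is, of course, far richer.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Ride:
    """One recorded trip: the raw data of a data-only digital twin."""
    distance_km: float
    avg_speed_kmh: float
    avg_power_w: float
    avg_heart_rate: int

# Invented history of rides, as a cycling device would record them
history = [
    Ride(42.0, 27.5, 165.0, 142),
    Ride(55.3, 28.1, 172.0, 145),
    Ride(60.1, 29.0, 178.0, 141),
]

def power_trend(rides):
    """Compare the latest ride's average power against the earlier rides."""
    baseline = mean(r.avg_power_w for r in rides[:-1])
    return rides[-1].avg_power_w - baseline

print(f"Average power vs. baseline: {power_trend(history):+.1f} W")
```

The twin here is purely the accumulated data; there is no model of my body or my bike behind it, which is exactly what makes this type so simple.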
On November 4th, 2020, I am participating in the (almost virtual) Digital Twin conference organized by Bits&Chips in the Netherlands. In the context of human performance, I look forward to Natal van Riel’s presentation: Towards the metabolic digital twin – for sure, this direction is not simple. Natal is a full professor at the Technical University in Eindhoven, the “smart city” in the Netherlands.
- Medium – data and operating models
Many connected devices in the world use the same principle. An airplane engine, an industrial robot, a wind turbine, a medical device, a train carriage: all track their performance through this connection between physical and virtual, based on some sort of digital connectivity.
The business case here is also monitoring performance, predicting maintenance, and upgrading the product when needed.
This is the domain of Asset Lifecycle Management, a practice that has existed for decades. Based on financial and performance models, the optimal balance between maintaining and overhauling has to be found. Repairs are disruptive and can be extremely costly. A manufacturing site that cannot produce can cost millions per day. Connecting data between the physical and the virtual model allows us to have real-time insights and be proactive. It becomes a digital twin.
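As an illustration of what “data and operating models” means in its simplest form, consider the sketch below: a live sensor reading is evaluated against a basic degradation model to decide when maintenance is due. The wear model and all thresholds are invented; real asset lifecycle management solutions are far more sophisticated.

```python
# A "data + operating model" twin in its simplest form: a sensor reading is
# combined with an (invented) wear model to predict remaining operating hours.

WEAR_LIMIT = 100.0             # assumed wear budget before overhaul
WEAR_PER_VIBRATION_UNIT = 0.8  # assumed degradation per vibration unit, per hour

def remaining_hours(accumulated_wear, vibration_level):
    """Predict remaining operating hours at the current vibration level."""
    wear_rate = WEAR_PER_VIBRATION_UNIT * vibration_level
    return max(0.0, (WEAR_LIMIT - accumulated_wear) / wear_rate)

# Invented telemetry snapshot from the physical asset
wear_so_far, vibration = 62.0, 1.4

hours_left = remaining_hours(wear_so_far, vibration)
if hours_left < 48:
    print(f"Schedule maintenance: roughly {hours_left:.0f} hours left")
else:
    print(f"Asset healthy: roughly {hours_left:.0f} hours until overhaul")
```

The difference with the data-only twin is the operating model: the virtual side no longer just records what happened but predicts what will happen, enabling proactive maintenance decisions.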
- Advanced – data and connected 3D model
The digital twin we see the most in marketing videos is a virtual twin, using a 3D representation for understanding and navigation. The 3D representation provides a Virtual Reality (VR) environment with connected data. When pointing at the virtual components, information might appear, or some animation might take place.
Building such a virtual representation is a significant effort; therefore, there needs to be a serious business case.
The simplest business case is to use the virtual twin for training purposes. A flight simulator provides a virtual environment and behavior as if you were flying a physical airplane; the behavior model behind the simulator should match the real behavior as closely as possible. However, as it is a model, it will never be 100% reality and requires updates when new findings or product changes appear.
A virtual model of a platform or plant can be used for training on Standard Operating Procedures (SOPs). In the physical world, there is no place or time to conduct such training. Here the complexity might be lower. There is a 3D Model; however, serious updates can only be expected after a major maintenance or overhaul activity.
These practices are not new either and are used in places where physical training cannot be done.
More challenging is the Augmented Reality (AR) use case. Here the virtual model, most of the time a lightweight 3D model, connects to real-time data coming from other sources. For example, AR can be used when an engineer has to service a machine. The AR environment might project actual data from the machine and indicate service points and service procedures.
The business case for such an opportunity is clear: service engineers always work with the right information in a real-time context. The main obstacles to implementing AR in reality are access to the data, the presentation of the data, and keeping the data in the AR environment in sync with reality.
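The sketch below illustrates that data-access obstacle in its simplest form: every component in the lightweight 3D model must resolve to a live telemetry source, and that mapping itself is fragile. All identifiers and values are invented for illustration.

```python
# Sketch of the AR data-access challenge: each component in the lightweight
# 3D model must resolve to live machine data. IDs and readings are invented.

# Mapping from 3D-model component IDs to telemetry tags. This link is often
# the hard part: created manually, per project, in a silo.
model_to_telemetry = {
    "pump_01": "PLC/line3/pump01/pressure",
    "motor_07": "PLC/line3/motor07/temperature",
}

# Stand-in for a real-time data source (for example an OPC UA or MQTT feed)
live_values = {
    "PLC/line3/pump01/pressure": 4.2,     # bar
    "PLC/line3/motor07/temperature": 71,  # degrees C
}

def overlay_for(component_id):
    """Return the text an AR headset would project next to a component."""
    tag = model_to_telemetry.get(component_id)
    if tag is None or tag not in live_values:
        return f"{component_id}: no live data (mapping broken?)"
    return f"{component_id}: {live_values[tag]}"

print(overlay_for("pump_01"))   # pump_01: 4.2
print(overlay_for("valve_02"))  # valve_02: no live data (mapping broken?)
```

As the second call shows, the moment the model and the data sources drift apart, the AR experience breaks down; keeping that mapping alive is the real implementation effort.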
And although there are 3D models in use, they are, to my knowledge, always created in silos, not yet connected to their design sources. Have a look at the Digital Twin conference from Bits&Chips, as mentioned before.
Several of the cases mentioned above will be discussed there. The conference’s target is to share real cases, concluded by Q&A sessions, which are crucial for a virtual event.
Connected Virtual Twins along the product lifecycle
So far, we have been discussing the virtual twin concept, where we connect a product/system/person in the physical world to a virtual model. Now let us zoom in on the virtual twins relevant for the early parts of the product lifecycle: the manufacturing twin and the development twin. This image from Siemens illustrates the concept:
On slides, a completely integrated framework is imagined; this is the future vision. Let us first zoom in on the individual connected twins.
The digital production twin
This is the area of virtual manufacturing: creating a virtual model of the manufacturing plant. Virtual manufacturing planning is not a new topic. DELMIA (Dassault Systèmes) and Tecnomatix (Siemens) have been offering virtual manufacturing planning solutions for a long time.
At that time, the business case was that defining a manufacturing plant and its processes virtually allows you to optimize the plant before investing in physical assets.
This saves money, as there is no costly prototype phase to optimize production. In a virtual world, you can perform many trade-off studies without extra costs. That was the past (and, for many companies, still the current situation).
Manufacturing now needs to be more flexible, addressing individual customer orders without increasing the overhead of delivering these customer-specific solutions. This requires a configurable plant that can produce these individual products (batch size 1).
This is where the virtual plant model comes into the picture again. Instead of having a virtual model to define the ultimate physical plant, now the virtual model remains an active model to propose and configure the production process for each of these individual products in the physical plant.
This is partly what Industry 4.0 is about: using a model-based approach to configure the plant and its assets in a connected manner. The digital production twin drives the execution of the physical plant. The factory has to change from a static factory into a dynamic “smart” factory.
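A toy sketch of what “the digital production twin drives the execution” could mean in practice: the virtual plant model proposes a routing for each individual order instead of a fixed routing for all products. Stations, operations and configuration rules are invented for illustration.

```python
# Sketch of a digital production twin proposing a routing per individual
# order (batch size 1). The plant model and rules are invented.

# The plant model: which station can perform which operation
plant_model = {
    "frame_assembly": "station_A",
    "paint": "station_B",
    "engraving": "station_C",   # only needed for personalized products
    "packaging": "station_D",
}

def routing_for(order):
    """Configure the production process for one specific order."""
    operations = ["frame_assembly", "paint"]
    if order.get("personal_text"):      # customer-specific option
        operations.append("engraving")
    operations.append("packaging")
    return [(op, plant_model[op]) for op in operations]

order = {"product": "bike", "color": "red", "personal_text": "Jos"}
for operation, station in routing_for(order):
    print(f"{operation} -> {station}")
```

The point is that the virtual model stays active during production: each order is configured against the same plant model, rather than the model being thrown away once the physical plant is built.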
In the domain of Industry 4.0, companies are reporting progress. However, in my experience, the main challenge is still that the product source data is not yet built in a model-based, configurable manner and therefore requires manual rework. This is the area of Model-Based Definition, and I have written about this aspect several times. Latest post: Model-Based: Connecting Engineering and Manufacturing
The business case for this type of digital twin, of course, is the ability to deliver customer-specific products with extremely competitive speed and at reduced cost compared to standard products. It could be your company’s survival strategy. As COVID-19 shows, it is hard to predict the future, yet it remains crucial to anticipate the future instead of waiting.
The digital development twin
Before a product gets manufactured, there is a product development process. In the past, this was purely mechanical with some electronic components. Nowadays, many companies are actually manufacturing systems, as the software controlling the product plays a significant role. In this context, model-based systems engineering is the upcoming approach to defining and testing a system virtually before committing to the physical world.
Model-Based Systems Engineering can define a single complex product and perform all kinds of analyses on the system even before there is a physical system in place. I will explain more about model-based systems engineering in future posts. In this context, I want to stress that a model-based systems engineering environment combined with modularity (do not confuse modularity with model-based) is a solid foundation for dealing with unique custom products. Solutions can be configured and validated against their requirements already during the engineering phase.
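To illustrate the principle of configuring a solution from modules and validating it against requirements during engineering, here is a toy sketch. All modules, numbers and requirements are invented; real model-based systems engineering environments work with far richer models and simulations.

```python
# Toy illustration: configure a system from modules and check it against
# requirements before anything physical exists. All values are invented.

modules = {
    "battery_small": {"mass_kg": 8, "range_km": 60},
    "battery_large": {"mass_kg": 14, "range_km": 120},
    "motor_std": {"mass_kg": 5, "power_kw": 3},
}

requirements = {"max_mass_kg": 20, "min_range_km": 100}

def validate(configuration):
    """Return the list of violated requirements for a chosen configuration."""
    total_mass = sum(modules[m]["mass_kg"] for m in configuration)
    total_range = sum(modules[m].get("range_km", 0) for m in configuration)
    violations = []
    if total_mass > requirements["max_mass_kg"]:
        violations.append(f"mass {total_mass} kg exceeds {requirements['max_mass_kg']} kg")
    if total_range < requirements["min_range_km"]:
        violations.append(f"range {total_range} km below {requirements['min_range_km']} km")
    return violations

print(validate(["battery_large", "motor_std"]) or "all requirements met")
```

Trivial as it is, the sketch shows the essence: the configured system is checked against its requirements while it exists only virtually, long before committing to hardware.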
The business case for the digital development twin is easy to make. Shorter time to market, improved and validated quality, and reduced engineering hours and costs compared to traditional ways of working. To achieve these results, for sure, you need to change your ways of working and the tools you are using. So it won’t be that easy!
For those interested in Industry 4.0 and the Model-Based Systems Engineering approach, join me at the upcoming PLM Road Map 2020 and PDT 2020 conference on 17-18-19 November. As you can see from the agenda, there is a lot of attention to Digital Twin and Model-Based approaches.
Three digital half-days with hopefully a lot to learn while staying with our feet on the ground. In particular, I am looking forward to Marc Halpern’s keynote speech: Digital Thread: Be Careful What you Wish For, It Just Might Come True.
Conclusion
It has been very noisy on the internet regarding product features and technologies, probably due to COVID-19 and the resulting disrupted interactions between all of us: vendors, implementers and companies trying to adjust to their future. The Digital Twin is an excellent framing for a concept that everyone can relate to. Choose your business case and then look for the best-matching twin.