
PLM and Complementary domains/practices

After “The PLM Doctor is IN #2,” here is again a written post in the category of PLM and complementary practices/domains.

After PLM and Configuration Lifecycle Management (CLM) (January 2021) and PLM and Configuration Management (CM) (February 2021), now it is time to address the third interesting topic:
PLM and Supply Chain collaboration.

In this post, I am speaking with Magnus Färneland from Eurostep, a company well known in my PLM ecosystem, through their involvement in standards (STEP and PLCS), the PDT conferences, and their PLM collaboration hub, ShareAspace.

Supply Chain collaboration

The interaction between OEMs and their suppliers has been a topic of particular interest to me. As a warm-up, read my post after the CIMdata/PDT Roadmap 2020 conference: PLM and the Supply Chain. In this post, I briefly touched on the Eurostep approach – having a Supply Chain Collaboration Hub. Below is an image from that post – in this case, the Collaboration Hub is positioned between two OEMs.

Slide: PDT Europe 2016 RENAULT PLM Challenges

Recently, Eurostep shared a blog post in the same context, 3 Steps to remove data silos from your supply chain, addressing the dream of many companies: moving from disconnected information silos towards a logical flow of data. This topic is relevant for all companies going through a digital transformation together with their supply chain. So, let us hear it from Eurostep.

Eurostep – the company / the mission

First of all, can you give a short introduction to Eurostep as a company and the unique value you are offering to your clients?


Eurostep was founded in 1994 by several world-class experts on product data and information management. In the year 2000, we started developing ShareAspace. We took all the experience we had from working with collaboration in the extended enterprise, mixed it with our standards knowledge, and selected Microsoft as the technology for our software platform.

We now offer ShareAspace as a solution for product information collaboration in all three industry verticals where we are active: Manufacturing, Defense and AEC & Plant.

In the Manufacturing offering – the Supply Chain Collaboration Hub

ShareAspace is based on an information standard called PLCS (ISO 10303-239). This means we have a data model covering the complete lifecycle of a product, from requirements and conceptual design to the existing installed base. We have added the things needed on top, such as consolidation and security. Our partnership with Microsoft has also resulted in ShareAspace being available in Azure as a service (our Design to Manufacturing software).
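As a side note from my side, to make the idea of a data model that spans the complete lifecycle a little more concrete: below is a minimal, purely illustrative sketch in Python. It is not the actual PLCS schema (ISO 10303-239 is far richer); the class and attribute names are invented, only showing how requirements, design revisions, and the installed base can hang together around one product so that lifecycle questions can be answered from a single consolidated model.

```python
from dataclasses import dataclass, field
from typing import List

# Purely illustrative lifecycle model, loosely inspired by PLCS ideas.
# The real ISO 10303-239 schema is far richer (views, effectivities, tasks, ...).

@dataclass
class Requirement:
    rid: str
    text: str

@dataclass
class DesignRevision:
    revision: str
    documents: List[str] = field(default_factory=list)   # e.g. CAD / specification references

@dataclass
class InstalledItem:
    serial_number: str
    as_built_revision: str                                # which revision this physical item realizes

@dataclass
class Product:
    number: str
    name: str
    requirements: List[Requirement] = field(default_factory=list)
    revisions: List[DesignRevision] = field(default_factory=list)
    installed_base: List[InstalledItem] = field(default_factory=list)

# One product carries its history from requirement to installed base.
pump = Product("P-100", "Cooling pump")
pump.requirements.append(Requirement("REQ-1", "Flow rate >= 120 l/min"))
pump.revisions.append(DesignRevision("A", ["P-100-A.step"]))
pump.revisions.append(DesignRevision("B", ["P-100-B.step"]))
pump.installed_base.append(InstalledItem("SN-0001", as_built_revision="A"))

# A typical lifecycle question: which installed items do not realize the latest revision?
latest = pump.revisions[-1].revision
outdated = [i.serial_number for i in pump.installed_base if i.as_built_revision != latest]
print(f"Installed items not on revision {latest}: {outdated}")
```

The point is not the classes themselves, but that one consolidated model can answer questions across lifecycle stages, which is exactly what a standards-based collaboration hub aims for.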

 

Why a supply chain collaboration hub?

Currently, most suppliers work in a disconnected manner with their clients – sending files back and forth or having to work inside the OEM environment. What are the reasons to consider a supply chain collaboration hub or, as you call it, a product information collaboration solution?

The hub concept is not new per se. There are plenty of examples of file sharing hubs. Once you realize that sending files back and forth by email is a disaster for keeping control of your information being shared with suppliers, you would probably try out one of the available file-sharing alternatives.

However, after a while, you begin to realize that a file share can be quite time-consuming to keep up to date. Files are being changed. Files are being removed! Some files are enormous, and you realize that you only need a fraction of what is in the file. References from one file to another become broken because the other file is of a new version. Etc. Etc.

This is about the time when you realize that you need similar control of the data you share with suppliers as you have in your internal systems. If not better.

A hub allows all partners to continue to use their internal tools and processes. It is also a more secure way of collaboration as the suppliers and partners are not let into the internal systems of the OEM.

Another significant side effect of this is that you only expose the data in the hub intended for external sharing and avoid sharing too much or exposing internal sensitive data.

A hub also provides business flexibility, as partners are not hardwired to the OEM. Partners can change, and IT systems in the value chain can change, without impacting more than that single system's connection to the hub.

Should every company implement a supply chain collaboration hub?

Based on your experience, what types of companies should implement a supply chain collaboration hub and what are the expected benefits?

 

The large OEMs and first-tier suppliers certainly benefit from this since they can incorporate hundreds, if not thousands, of suppliers. Sharing technical data across the supply chain from a dedicated hub will remove confusion, improve control of the shared data, and build trust with their partners.

With our cloud-based offering, we now also make it possible for at least mid-sized companies (like 200+ employees) to use ShareAspace. They may not have a well-adopted PLM system or the issue of communicating complex specifications originating from several internal sources. Still, they need to be professional in dealing with suppliers.

The smallest client we have is a manufacturer of pool cleaners, a complex product with many suppliers. The company, Weda [www.weda.se], has fewer than 10 employees, and they use ShareAspace as SaaS. With ShareAspace, they have improved their collaboration process with suppliers, cut costs, and lowered inventory levels.

ShareAspace can really scale big. It serves as a collaboration solution for the two new Aircraft carriers in the UK, the QUEEN ELIZABETH class. The aircraft carriers were built by a consortium that was closed in early 2020.

ShareAspace is being used to hold the design data and other documentation from the consortium to be available to the multiple organizations (both inside and outside of the Ministry of Defence) that need controlled access.

 

What is the dependency on standards?

I always associate Eurostep with the PLCS (ISO 10303-239) standard, providing an information model for “hardware” products along the lifecycle. How important is this standard for you in the context of your ShareAspace offering?
Should everyone adapt to this standard?

We have used PLCS to define the internal data schema in ShareAspace. This is an excellent starting point for capturing information from different systems and domains and still getting it to fit together. Why invent something new?

However, we can import data in most formats, and it does not have to be according to a standard. When connecting to Teamcenter, Windchill, Enovia, SAP, Oracle, Maximo etc., it is more often in a proprietary format than according to any standards.

Capital Facilities Information HandOver Specification (CFIHOS) exchange

On the other hand, in some industries like Defense, standards-based data exchange is required and put into contracts. Sometimes it prescribes PLCS.  For the plant industry, it could be CFIHOS or ISO15926.

Supply Chain Collaboration and digital transformation

As stated at the beginning of this post, digital transformation is about connecting the information siloes through a digital thread. How important is this related to the supply chain?

Many companies have come a long way in improving their internal management of product data. But still, the exchange and sharing of data with the external world has considerable potential for improvement. Just look at the chaos everyone has experienced with email, still used a lot, when trying to find the latest Word document or PowerPoint file. Imagine if you collaborate on a ship, a truck, a power plant, or a piece of complex infrastructure. FTP is not the answer, and for product data, Dropbox is not doing the trick.

A Digital Thread must support versions and changes in all directions, as changes are natural with reasonably advanced products. Much of the information created about or around a product is generated within the supply chain, like production parameters, test and inspection protocols, certifications, and more. Without an intelligent way of capturing this data, companies will continue to spend a fortune on administration trying to manage this manually.

As the Digital Thread extends across the value chain, a useful sharing tool is needed to allow for configuration management across the complete chain – ShareAspace is designed for this. The great thing about PLCS is that it gives a standard model for the Digital Thread covering several Digital Twins. PLCS adds the lifecycle component, which is essential, and there is no alternative. Therefore, ShareAspace and PLCS are a welcome addition to snapshot standards like IFC that are outside the STEP series of standards.

Learning more

We discussed that a supply chain collaboration hub can have specific value to a company. Where can readers learn more?

There is a lot of information available. Of course, on our Eurostep website, you will find information under the tab Resources or on the ShareAspace website under the tab News.
Other sources are:

CIMdata A Controlled and Protected Partner and Supplier Collaboration Environment
Boston Consulting Group Share to Gain: Unlocking Data Value in Manufacturing
Eurostep Data sharing and collaboration across global value chains worth 100 Billion USD is waiting for you!
McKinsey Digital supply chains: Do you have the skills to run them?

 

What I have learned

  • I am surprised to see that the type of Supplier Collaboration Platform delivered by Eurostep is not a booming market. Even though time to market is significantly impacted by how companies work with their suppliers, most companies still rely on the exchange of data packages.
  • The most advanced exchanges are using a model-based definition if relevant. Traditional PLM Vendors will not develop such platforms as the platform needs to be agnostic in both directions.
  • Having a recommended data model based on PLCS, or a custom data model in the case of a large OEM, can provide such a collaboration hub. It is relatively easy to implement (as you do not change your own PLM) and relatively easy to scale (adding a new supplier is easy). For me, the supplier collaboration platform is a must in a modern, digitally connected enterprise.

Conclusion

A lot of marketing money is spent on “Digital Thread” or “Digital Continuity”.  If you are looking at the full value chain of product development and operational support, there are still many manual hand-over processes with suppliers. A supplier collaboration hub might be the missing piece of the puzzle to realize a real digital thread or continuity.

I am still digesting all the content of the latest PLM Roadmap / PDT Fall 2020 conference and the new reality that starts to appear due to COVID-19. There is one common theme:

The importance of a resilient and digital supply chain.

Most PLM implementations focus on aligning disciplines internally; the supply chain’s involvement has always been the next step. Perhaps now it is time to make it the first step? Let’s analyze.

No Time to Market improvement due to disconnected supply chains?

During the virtual fireside chat at the PLM Roadmap/PDT conference, Marc Halpern brought in an interesting point, just as a small bonus. You can read the full story here – the quote:

Marc mentioned a survey Gartner has done with companies in fast-moving industries related to the benefits of PLM. Companies reported improvements in the accuracy of product data and in product development. They did not see as much reduction in time to market or product development costs. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here lead times did not change, nor did the number of changes.

Of course, he spoke about fast-moving industries where the interaction was done in a disconnected manner. Gartner believes that the cloud would, for sure, start creating these benefits of a reduced time to market and cost of change when the supply chain is connected.

Therefore I want to point again to an old McKinsey article named The case for Digital Reinvention, published in February 2017. Here the authors looked at the various areas of investment in digital technologies and their ROI.  See the image on the left for the areas investigated and the percentage of companies that invested in these areas at that time.

In the article, you will see the ROI analysis for these areas. For example, the marketing and distribution investments did not necessarily have a positive ROI when disconnected from other improvement areas. Digital supply chains were mentioned as the area with potentially the highest ROI. However, another important message in the article for all these areas is: you need to have a complete digitization strategy. This is a point I fail to see in many companies. Often one area gets all the attention; however, as it remains disconnected from the rest, the real efficiencies are not there. The McKinsey article ends with the conclusion that the digital winners at that time were the ones with bold strategies:

we found a mismatch between today’s digital investments and the dimensions in which digitization is most significantly affecting revenue and profit growth. We also confirmed that winners invest more and more broadly and boldly than other companies do

The “connected” supply chain

Image: A&D Action Group – Global Collaboration

Of course, the traditional industries that invented PLM have invested in a kind of connected supply chain. However, is it really a connected supply chain? Aerospace and Defense companies had their supplier portals.

A supplier had to download their information or upload their designs combined with additional metadata.

These portals were completely bespoke and required “backbreaking” manual work on both sides to create, deliver, and validate the required exchange packages. The OEMs were driving the exchange process. More or less, by this custom approach, they made it difficult for suppliers to have their own PLM environment. The downside of this approach was that the supplier had a separate environment for each OEM.

In 2006, I worked with SmarTeam on the concept of the “Supply Chain Express,” an offering that allowed a supplier to have their own environment, using SmarTeam as a PDM/PLM system and the Supply Chain Express package to create intelligent import and export packages. The content was all based on files and configurable metadata based on the OEM-supplier relation.

Some other PLM-vendors or implementers have built similar exchange solutions to connect the world of the OEM and the supplier.

The main characteristic was that it was file-based with custom metadata, often in an XML format or otherwise using Excel as the metadata carrier.

In my terminology of Coordinated – Connected, this would be Coordinated and “old school.”
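To make this “old school” coordinated exchange tangible, here is a small, hypothetical sketch of such a file-based package: the files travel in a zip, and the metadata travels next to them in an XML manifest. All element and attribute names are invented for illustration; every real portal had its own bespoke format.

```python
import xml.etree.ElementTree as ET
import zipfile
from pathlib import Path
from typing import Dict, List

# Hypothetical file-based exchange package: files in a zip, metadata in an XML manifest.
# All element and attribute names are invented for illustration.

def build_manifest(package_id: str, supplier: str, items: List[Dict[str, str]]) -> bytes:
    root = ET.Element("ExchangePackage", id=package_id, supplier=supplier)
    for item in items:
        ET.SubElement(root, "Item",
                      partNumber=item["part_number"],
                      revision=item["revision"],
                      file=item["file"])
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

def build_package(target: Path, manifest: bytes, files: List[Path]) -> None:
    with zipfile.ZipFile(target, "w") as zf:
        zf.writestr("manifest.xml", manifest)
        for f in files:
            zf.write(f, arcname=f.name)

if __name__ == "__main__":
    items = [{"part_number": "P-100", "revision": "B", "file": "P-100-B.step"}]
    manifest = build_manifest("PKG-2021-001", "Supplier X", items)
    print(manifest.decode())
    # build_package(Path("exchange.zip"), manifest, [Path("P-100-B.step")])
```

The weaknesses described above follow directly from this pattern: the manifest is only correct at the moment of export, and nothing keeps it aligned with later versions on either side.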

 

The “better connected” supply chain

As I mentioned in my previous post about the PLM Roadmap/PDT Fall conference,  Katheryn Bell (Pratt & Whitney Canada) presented the progress of the A&D Global Collaboration workgroup. As part of the activities, they classified the collaboration between the OEM and the supplier in 3 levels, as you can see from the image:

This post mainly focuses on the L1 collaboration as this is probably the most used scenario.

In the Aerospace and Automotive industries, the data exchange between OEMs and suppliers has improved in two ways by using Technical Data Packages where the content is supported by Model-Based Definition.

The first advantage of Model-Based Definition is mainly related to a consistent information package where the model is leading. The manufacturing views are explicitly defined on the 3D Model. Therefore, there is a reduced chance of a mismatch between the “drawings” and the 3D Model.

Model-Based Definition still does not solve working with the latest (approved) version of the information. This remains a “human-based” process, and Katheryn Bell confirmed this was the biggest problem to solve.

The second advantage of using one of the interoperability standards for Model-Based Definition is the decoupling of application-specific data between the OEM side and the supplier side.

A significant advantage of Model-Based Definition is that there are a few interoperability standards, i.e., ISO 10303 (STEP), ISO 14306 (JT), and ISO 32000/ISO 14739 (PRC for 3D PDF). In the end, the ideal would be that these standards merge into one standard, completely vendor-independent, with a clearly defined scope and purpose.

The benefit of these standards is also they increase the longevity of product data as the information is stored in an application-independent format. As long as the standard does not change (fast), storing data even internally in these neutral formats can save upgrade or maintenance costs.

However, I think you all know the joke below.

 

The connected supply chain

The ultimate goal in the long term will be the connected supply chain, where information shared between an OEM and a supplier does not require human-based interfaces to ensure everyone works with the correct data.

The easiest way, and this is what some of the larger OEMs have done, is to consider suppliers as part of your PLM-infrastructure and give them access to all relevant data in the context of the system, the product, or the part they are responsible for. For the OEM, the challenge will be to connect suppliers – to motivate and train them to work in this environment.

For the supplier, the challenge is their IP management. If they work 100 percent in the OEM environment, everything is exposed. If they want to work in their own environment, there is probably double work and a disconnect.

Of course, everything depends on the complexity of your interaction with the supplier.

With its Fusion Cloud Product Lifecycle Management (PLM), Oracle was one of the first to shift the attention to the connected supply chain.

If you search for PLM on the Oracle website, you will find it under Fusion Supply Chain and Manufacturing. It is a logical step, as traditional ERP vendors have never provided a full, rich portfolio for product design. CAD integrations do not get a focus, and the future path to Model-Based approaches (MBSE / MBD / MBE) is not visible at all.

This is almost similar to what the Siemens-SAP alliance is showing. SAP more or less confirms that you should not rely on SAP PLM for more advanced PLM scenarios but on Siemens's offering.

For less complex but fast-moving products, for example in the apparel industry, you see that the promise of connecting all suppliers in one environment lies in time to market and traceability. This industry does not suffer from products with a long lifecycle with upgrades and services.

So far, the best collaboration platform in the cloud I have seen is ShareAspace from Eurostep. Its foundation, based on the PLCS standard, allows an OEM and a supplier to connect through their “shared space” – you can look at their supply chain offering here.

Slide: PDT Europe 2016 RENAULT PLM Challenges

In the various PDT conferences, we have seen how even two OEMs could work in a joint environment (Renault-Nissan-Daimler) or how BAE Systems used the ShareAspace environment to collaborate and consolidate all the data coming from the various system suppliers into one standards-based environment.

In 2021, I plan to write a series of blog posts related to possible add-on services for PLM. Supplier collaboration platforms, Configuration Management, End-to-end configurators, Product Information Management, are some of the themes I am currently exploring.

Conclusion

COVID-19 has illustrated the volatility of supply chains. Changing suppliers and working with suppliers in the traditional ways still hinder reducing time to market. However, the promise of a real connected supply chain is enormous. As Boeing demonstrated in my previous post and explained in this post, standards are needed to become future-proof.

Will 2021 have more focus on the connected supply chain?

 

Last week I shared my first review of the PLM Roadmap / PDT Fall 2020 conference, organized by CIMdata and Eurostep. Having digested now most of the content in detail, I can state this was the best conference of 2020. In my first post, the topics I shared were mainly the consultant’s view of digital thread and digital twin concepts.

This time, I want to focus on the content presented by the various Aerospace & Defense working groups, who shared their findings and lessons learned (so far) on topics like the multi-view BOM, supply chain collaboration, and MBSE data interoperability.

These sessions were nicely wrapped with presentations from Alberto Ferrari (Raytheon), discussing the digital thread between PLM and Simulation Lifecycle Management and Jeff Plant (Boeing) sharing their Model-Based Engineering strategy.

I believe these insights are crucial, although there might be people in the field who will question whether this research is essential. Isn't there an easier way to achieve the same results?

Nicely formulated by Ilan Madjar as a comment to my first post:

Ilan makes a good point about simplifying the ideas for the masses to make them work. The majority of companies probably do not have the bandwidth to invest in and understand the future benefits of a digital thread or digital twins.

This does not mean that these topics should not be studied. If your business is in a small, simple eco-system and wants to work in a connected mode, you can choose a vendor and a few custom interfaces.

However, suppose you work in a global industry with an extensive network of partners, suppliers, and customers.

In that case, you cannot rely on ad-hoc interfaces or a single vendor. You need to invest in standards; you need to study common best practices to drive methodology, standards, and vendors to align.

This process of standardization is so crucial if you want to have a sustainable, connected enterprise. In the end, the push from these companies will lead to standards, allowing the smaller companies to adhere or connect to them.

The future is about Connected through Standards, as discussed in part 1 and further in this post. Let’s go!

Global Collaboration – Defining a baseline for data exchange processes and standards

Katheryn Bell (Pratt & Whitney Canada) presented the progress of the A&D Global Collaboration workgroup. As you can see from the project timeline, they have reached the phase to look towards the future.

Katheryn mentioned the need to standardize terminology as the first point of attention. I am fully aligned with that point; without a standardized terminology framework, people will misunderstand each other in communication.

This happens even more in smaller businesses that sometimes just pick up (buzz) terms without fully understanding them.

Several years ago, I talked with a PLM implementer who told me that their implementation focus was on systems engineering. After some more explanation, it appeared they were, in reality, attempting configuration management. Here the confusion was massive. Still, a standard, common terminology is crucial in our domain, even if it seems academic.

The group has been analyzing interoperability standards and standards for long-term archival and retrieval (LOTAR), but has also been studying the ISO 44001 standard related to collaborative business relationship management systems.

In the Q&A session, Katheryn explained that the biggest problem to solve with collaboration was the risk of working with the wrong version of data between disciplines and suppliers.

Of course, such errors can lead to huge costs if they are discovered late (or too late). As some of the big OEMs work with thousands of suppliers, you can imagine it is not an issue easily discovered in a more ad-hoc environment.

The move to a standardized Technical Data Package based on a Model-Based Definition is one of these initiatives in this domain to reduce these types of errors.

You can find the proceedings from the Global Collaboration working group here.

 

Connect, Trace, and Manage Lifecycle of Models, Simulation and Linked Data: Is That Easy?

I loved Alberto Ferrari's (Raytheon) presentation and how he described the value of a model-based digital thread, positioning it in a targeted enterprise.

Click on the image and discover how business objectives, processes and models go together supported by a federated infrastructure.

Alberto’s presentation was a kind of mind map from how I imagine the future, and it is a pity if you have not had the chance to see his session.

Alberto also focused on the importance of various simulation capabilities combined with simulation lifecycle management. For Alberto, they are essential to implement digital twins. Besides focusing on standards, Alberto pleads for semantic integration and an open service architecture, stressing the importance of DevSecOps.

Enough food for thought; as Alberto mentioned, he presented the corporate vision, not the current state.

More A&D Action Groups

There were two more interesting specialized sessions where teams from the A&D action groups provided a status update.

Brandon Sapp (Boeing) and Ian Parent (Pratt & Whitney) shared the activities and progress on Minimum Model-Based Definition (MBD) for Type Design Certification.

As Brandon mentioned, MBD is already a widely used capability; however, MBD is still maturing and evolving.  I believe that is also one of the reasons why MBD is not yet accepted in mainstream PLM. Smaller organizations will wait; however, can your company afford to wait?

More information about their progress can be found here.

Mark Williams (Boeing) reported from the A&D Model-Based Systems Engineering action group their first findings related to MBSE Data Interoperability, focusing on an Architecture Model Exchange Solution.  A topic interesting to follow as the promise of MBSE is that it is about connected information shared in models. As Mark explained, data exchange standards for requirements and behavior models are mature, readily available in the tools, and easily adopted. Exchanging architecture models has proven to be very difficult. I will not dive into more details, respecting the audience of this blog.

For those interested in their progress, more information can be found here.

Model-Based Engineering @ Boeing

In this conference, the participation of Boeing was significant through the various action groups. As the cherry on the cake, there was Jeff Plant‘s session, giving an overview of what is happening at Boeing. Jeff is Boeing’s director of engineering practices, processes, and tools.

In his introduction, Jeff mentioned that Boeing has more than 160,000 employees in over 65 countries. They are working with more than 12,000 suppliers globally. These suppliers can be manufacturing, service, or technology partnerships. Therefore, you can imagine – and as discussed by others during the conference – streamlined collaboration and traceability are crucial.

The now-famous MBE Diamond symbol illustrates the model-based information flows in the virtual world and the physical world based on the systems engineering approach. Like Katheryn Bell did in her session related to Global Collaboration, Jeff started explaining the importance of a common language and taxonomy needed if you want to standardize processes.

Zoom in on the Boeing MBE Taxonomy, and you will discover the clarity it brings to the company.

I was not aware of the ISO 23247 standard concerning the Digital Twin framework for manufacturing, aiming to apply industry standards to the model-based definition of products and process planning. A standard certainly to follow as it brings standardization on top of existing standards.

As Jeff noted: A practical standard for implementation in a company of any size. In my opinion, mandatory for a sustainable, connected infrastructure.

Jeff presented the slide below, showing their standardization internally around federated platforms.

This slide closely resembles the future platform vision I have been sharing since 2017 when discussing PLM's future at PLM conferences and explaining the differences between Coordinated and Connected – see also my presentation here on SlideShare.

You can zoom in on the picture to see the similarities. For me, the differences were interesting to observe. In Jeff’s diagram, the product lifecycle at the top indicates the platform of (central) interest during each lifecycle stage, suggesting a linear process again.

In reality, the flow of information through feedback loops will be there too.

The second exciting detail is that these federated architectures should be based on strong interoperability standards. Jeff is urging other companies, academics, and vendors to invest and come to industry standards for Model-Based Systems Engineering practices. The time to act in this domain is now.

It reminded me again of Marc Halpern's message mentioned in my previous post (part 1) that we should be worried about vendor alliances offering an integrated end-to-end data flow based on their solutions. This would lead to an immense vendor lock-in if these interfaces are not based on strong industry standards.

Therefore, don’t watch from the sideline; it is the voice (and effort) of the companies that can drive standards.

Finally, during the Q&A part, Jeff made an interesting point explaining Boeing is making a serious investment, as you can see from their participation in all the action groups. They have made the long-term business case.

The team is confident that the business case for such an investment is firm and stable; however, with such a long-term investment without direct results, these projects might come under pressure when the business is under pressure.

The virtual fireside chat

The conference ended with a virtual fireside chat from which I picked up an interesting point that Marc Halpern brought in. Marc mentioned a survey Gartner has done with companies in fast-moving industries related to the benefits of PLM. Companies reported improvements in the accuracy of product data and in product development. They did not see as much reduction in time to market or costs. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here lead times did not change, nor did the number of changes.

Marc believes that this topic will really show benefits in the future with cloud and connected suppliers. This reminded me of an article published by McKinsey called The case for digital reinvention. In this article, the authors indicated that only 2% of the companies interviewed were investing in a digital supply chain. At the same time, the expected benefits in this area would have the most significant ROI.

The good news: there is consistency, and we know where to focus for early results.

Conclusion

It was a great conference as here we could see digital transformation in action (groups). Where vendor solutions often provide a sneaky preview of the future, we saw people working on creating the right foundations based on standards. My appreciation goes to all the active members in the CIMdata A&D action groups as they provide the groundwork for all of us – sooner or later.

Meanwhile, two weeks of a partial lockdown have passed here in the Netherlands, and we have at least another 3 weeks to go according to the Dutch government. The good thing in our country is that decisions and measures are based on the advice of experts, as we cannot rely on politicians being experts.

I realize that despite the discomfort for me, for many other people in other countries, it is a tragedy. My mental support to all of you, wherever you are.

So what has happened since Time to Think (and act differently)?

All Hands On Deck

In the past two weeks, it has become clear that a global pandemic like this one requires an “All Hands On Deck” mentality to support the need for medical supplies, in particular respiration devices, so-called ventilators – devices needed to save the lives of severely affected people. I have great respect for the “hands” that are doing the work in infectious environments.

Due to time pressure, innovative thinking is required to reach quick results in many countries. Companies and governmental organizations have created consortia to address the urgent need for ventilators. You will not see so much PR from these consortia as they are too busy doing the real work.

Still, you see marketing messages from many of the commercial participants explaining why and how they contribute to these activities.

One of the most promoted capabilities is PLM collaboration on the cloud, as there is a need for real-time collaboration between people who are under lockdown. They have no time to set up environments and learn new tools for collaboration.

For me, these are grand experiments: can a group of almost untrained people cooperate quickly in a new environment?

For sure, offering free cloud software, PLM, online CAD, or 3D printing seems like a positive and compassionate gesture from these vendors. However, this is precisely the wrong perception in our PLM world – the difficulty with PLM does not necessarily lie in the tools.

 

It is about learning to collaborate outside your silo.

Instead of “wait till I am done” it should become “this is what I have so far – use it for your progress”. This is a behavior change.

Do we have time for behavioral changes at this moment? Time will tell if the myth will become a reality so fast.

A lot of thinking

The past two weeks were weeks of thinking and talking a lot with PLM-interested persons around the globe using virtual meetings.

As long as the lockdowns are there, I keep on offering free-of-charge PLM coaching for individuals who want to understand the future of PLM.

Through all these calls, I really became THE VirtualDutchman in many of these meetings (thanks Jagan for the awareness).

I realized that there is a lot of value in virtual meetings, in particular with the video option on. I believe video works best when you have met before; most of my current meetings were with people I have met face-to-face before, so you already know each other's facial expressions.

I am a big fan of face-to-face meetings as I learned in the past 20 years that despite all the technology and methodology issues, the human factor is essential. We are not rational people; we live and decide by emotions.

Still, I conclude that in the future, I could do with less travel, as I see the benefits from current virtual meetings.

Fewer face-to-face meetings will help me to work on a more sustainable future, as I am aware of the impact flying has on the environment. Also, talking with other people, there is the notion that after the lockdowns, virtual conferencing might become more and more of a best practice. Good for the climate, the environment, and time savings – bad for traditional industries like airlines, taxis, and hotels. I will not say goodbye 100 percent, but I will reduce.

A Virtual PLM conference!

I am extremely excited to participate in the upcoming PLM Innovation Forum (PLMIF), starting on April 28th, organized by TECHNIA. I have visited the event a few times in the past in Stockholm. It was a great place to meet many of the people from my network.

This time I am even more excited as the upcoming PLMIF will be a VIRTUAL conference with all the aspects of a real conference – read more about the conference here.

There will be an auditorium where lectures will be given, there are virtual booths, and it will be a place to network virtually. In my next post, I hope to zoom in on the conference.

Sustainability, a circular economy, and modern PLM should go together. Since 2014, these topics have been on the agenda of the joint CIMdata Roadmap/PDT conferences. Speakers like Amir Rashid KTH Sweden, Ken Webster Ellen MacArthur Foundation, and many others have been talking about the circular economy.

The Scandinavian mindset of an inclusive society for people and the environment has surely influenced the agenda. The links above lead to a better understanding of what is meant by a circular economy and a sustainable future, as does the short YouTube movie below:

The circular economy is crucial for a sustainable future. Therefore, I am looking forward to participating in the upcoming PLM Innovation Forum on April 28th, where it will be all about digitalization for sustainable product development and manufacturing. Hopefully, with the right balance towards the WHY-side of our brain, not so much about WHAT.

You are welcome to register for free here: the virtual PLM Innovation Forum – we might meet there (virtually).

The PLM Green Alliance

The PLM Green Alliance had been announced some months ago, started by Rich McFall and supported by  Bjorn Fidjeland,  Oleg Shilovitsky, and me.

It was the first step to proactively bring people together to discuss topics like reducing our carbon footprint, and to share and brainstorm about innovations that will lead to a sustainable future for ourselves, our children, and our great-grandchildren. The idea behind the PLM Green Alliance is that a proactive approach is much cheaper in the long term, as we can still evaluate and discuss options.

This brings me back to the All hands On Deck approach we currently use for fighting the COVID-19 virus.

In a crisis mode, the damage to the people and the economy is severe. Besides, in a crisis mode, a lot of errors will be made, but don’t blame or joke about these people that are trying. Without failure, there is no learning.

We are in a potential time of disruption, as the image below shows, but we do not have the complete answers for the future.

Think about how you could proactively work on a sustainable future for all of us. This will be my personal target, combined with explaining and coaching companies on topics of modern PLM, during the current lockdown and hopefully long after. The PLM Green Alliance is eager to learn from you and your companies where you are contributing to a more sustainable and greener future.

Do not feel your contribution is not needed – see the research done by the Carr Center's Erica Chenoweth: The ‘3.5% rule’: How a small minority can change the world. It could be an encouragement to act instead of watching who will determine your future.

Conclusion

While learning to live in a virtual world, we might realize that the current crisis is an opportunity to switch faster to a more sustainable and inclusive society. For PLM, moving to data-driven, cloud-based environments, using a model-based approach along the whole lifecycle, is a path to reduce friction when delivering innovations. From years to weeks? Something we wish we had already today. Stay safe!

This post is based on a mix of interactions I had in the last two weeks in my network, mainly on LinkedIn. First, I enjoyed the discussion that started around Yoann Maingon's post: Thoughts about PLM Business models. Yoann is quite seasoned in PLM, as you can see from his LinkedIn profile. We have had interesting discussions in the past, and recently about a new PLM system he is developing, Ganister PLM, based on a flexible graph database.

Perhaps in that context, Yoann was exploring the various business models. Do you pay for the software (and maintenance), do you pay through subscription, what about a modular approach or a full license for all the functionality? All these questions made me think about the various business models that I have encountered and how hard it is for a customer to choose the optimal solution. And is there space for a new type of PLM? Is there space for free PLM? Some of my thoughts here:

PLM vendors need to be profitable

One of the most essential points to consider is that whatever PLM solution you are aiming to buy, make sure that your PLM vendor has a profitable business model. Once you have started with a PLM solution, it is your company's IP that will be stored in this environment, and you do not want to change your PLM system every few years. Switching PLM systems would only be affordable if the PLM system stored its data in a standard format – I will share a more in-depth link under PLM and standards.

For the moment, you cannot state that PLM vendors endorse standards. None of the real PLM vendors have a standardized data model; perhaps closest to standards is Eurostep, who has based the ShareAspace solution on top of the PLCS (ISO 10303-239) standard. However, ShareAspace is more positioned as a type of middleware, connecting OEMs/Owners/Operators and their suppliers to benefit from standardized connectivity.

Coming back to the statement: PLM vendors need to be profitable to provide a guarantee for the future of your company's data – that is the first step. The major PLM vendors are now profitable, as during a consolidation phase starting 15 years ago, a lot of non-profitable PLM vendors disappeared. MatrixOne, Agile, and Eigner & Partner are the best-known companies that were bought for either their technology or their market share. In that context, you might also look at Onshape.

Would they be profitable as a separate company, or would investors give up? To survive, you need to be profitable, so giving software away for free is not a good sign (see the PLM for free paragraph below), as a company needs continuity.

PLM startups

In the past 10 years, I have seen and evaluated several new PLM companies. None of them really changed the PLM paradigm; most of them were still focusing on being engineering collaboration tools. Several of these companies have in their visionary statement that they are going to be the “Excel killer.” We all know Excel has the best user interface and capabilities to manipulate a collection of metadata.

Very popular is the BOM in Excel, extracted from the CAD-system (no need for an “expensive” PDM or PLM) or BOM used to share with suppliers and stakeholders (ERP is too rigid, purchasing does not work with PDM).

The challenge I see here is that these startups do not bring real new value. The cost of manipulating Excels is a hidden cost, and companies relying on Excel communication are the type of companies that do not have a strategic point of view. This is typical for Small and Medium businesses where execution (“let’s do it”) gets all the attention.

PLM startups often collect investors' money because they promise to kill Excel, but is Excel the real problem? Modern PLM is about data sharing, which is an attitude change, not necessarily a technology change from Excel tables to (cloud) shared tables. However, will one of these “new Excel killer” PLMs be disruptive? I don't think so.

PLM disruption?

A week ago, I read an interview with Clayton Christensen (thanks Hakan Karden), which I shared on LinkedIn a week ago. Clayton Christensen is the father of the Disruptive Innovation theory, and I have cited him several times in my blogs. His theory is, in my opinion, fundamental to understand how traditional businesses can be disrupted. The interview took place shortly before he died at the age of 67. He died due to complications caused by leukemia.

A favorite part of this interview is where he restates what Disruptive Innovation really is, as we often talk about disruption without understanding the context, just echoing other people:

Christensen: Disruptive innovation describes a process by which a product or service powered by a technology enabler initially takes root in simple applications at the low end of a market — typically by being less expensive and more accessible — and then relentlessly moves upmarket, eventually displacing established competitors. Disruptive innovations are not breakthrough innovations or “ambitious upstarts” that dramatically alter how business is done but, rather, consist of products and services that are simple, accessible, and affordable. These products and services often appear modest at their outset but over time have the potential to transform an industry.

Many of the PLM startups dream and position themselves as the new disruptor. Will they succeed? I do not believe so if they only focus on replacing Excel; a different paradigm is needed. Voice control and analysis, perhaps (“Hey PLM, if I change Part XYZ, what will be affected?”)?

This would be disruptive and open new options. I think PLM startups should focus here if they want my investment money.

PLM for free?

There are some voices that PLM should be free, in an analogy to software management and collaboration tools. There are so many open-source software management tools, why not use them for PLM? I think there are two issues here:

  • PLM data is not like software data. A lot of PLM data is based on design models (3D CAD / simulation), which is different from software. Designs are often not as modular as software, for various reasons. Companies want their products to be modular, but do they have the time and resources to reinvent their existing products? For software, these costs are much lower as it is only a brain exercise. For hardware, the impact is significant – bringing me to the second point.
  • The cost of change for hardware is entirely different compared to software. Changing software does not have an impact on existing stock or suppliers and can therefore be implemented once tested for its purpose. A hardware change impacts the existing production process: do we first use up the old parts before introducing the change, or do we accept the cost of scrap? Is our supply chain, or are our production tools, ready to deliver continuity for the new version? Hardware changes are costly, and you want to avoid them. Software changes are cheap; therefore, design your products to be configurable based on software (for example, Tesla's software controlling which features are enabled) – see the illustrative cost sketch below.
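A purely illustrative back-of-the-envelope comparison of this asymmetry, with hypothetical numbers (the absolute values do not matter, the structure of the cost does):

```python
# Back-of-the-envelope sketch with hypothetical numbers: the point is the asymmetry
# between a hardware engineering change (scrap, retooling, requalification) and a
# software change (mostly engineering, test, and deployment effort).

def hardware_change_cost(scrap_units, unit_cost, retooling, requalification, engineering):
    return scrap_units * unit_cost + retooling + requalification + engineering

def software_change_cost(engineering, testing, deployment):
    return engineering + testing + deployment

hw = hardware_change_cost(scrap_units=500, unit_cost=40,
                          retooling=25_000, requalification=10_000, engineering=8_000)
sw = software_change_cost(engineering=8_000, testing=3_000, deployment=500)

print(f"Hypothetical hardware change: {hw:,} EUR")   # 63,000 EUR
print(f"Hypothetical software change: {sw:,} EUR")   # 11,500 EUR
```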

Now imagine, with enough funding, you could provide a PLM for free. Because of ease of deployment, this would very likely be a cloud offering, easy and scalable. However, all your IP is in that cloud too. Let's imagine that the cloud is safer than on-premise, so it does not matter in which country your data is hosted (does it?).

Next, the “free” PLM provider starts asking a small service fee after five years, as the promised ROI of the model has not delivered enough value and the shareholders become anxious. Of course, you do not like to pay the fee. However, where is your data, and what happens when you do not pay?

If the PLM provider switches you off, you are without your IP. If you ask the PLM provider to provide your data, what will you get? A blob of XML-files, anything you can use?

In general, this is a challenge for all cloud solutions.

  • What if you want to stop your subscription?
  • What is the allowed Exit-strategy?

Here I believe customers should ask for clarity, and perhaps these questions will lead to a renewed understanding that we need standards.

PLM and standards

We had a vivid discussion in the blogging community in September last year. You can read more related to this topic in my post PLM and the need for standards, which describes the aspects of lock-in and the need for openness.

Finally, a remark related to the PLM acronym. Another interesting discussion started around Joe Barkai's post: Why I do not do PLM. Read the comments and the various viewpoints on PLM here. It is clear that the word PLM unites us all; however, the interpretation is different.

If someone in the street asks me what my profession is, I never mention I do PLM. I say: “I assist mainly manufacturing companies in redesigning their business processes using best practices and modern digital technologies”. The focus is on the business value, not on the ultimate definition of PLM.

Conclusion

There are many business aspects related to PLM to consider. Yoann Maingon’s post started the thinking process, and we ended up with the PLM-definition. It all illustrates that being involved in PLM is never a boring journey. I am curious to learn about your journey and where we meet.

To avoid software geeks getting curious about the title: in this context, ALM means Asset Lifecycle Management. In 2008, I was active for SmarTeam promoting PLM concepts relevant for Asset Lifecycle Management. The focus was on PLM being complementary to asset operation management (EAM – Enterprise Asset Management – and MRO – Maintenance, Repair and Overhaul).

This topic has become relevant for me again in the past two months, having discussed and seen (at PDT) the concepts of a model-based approach for assets and constructions. PLM, ALM, and BIM converge conceptually. Every year I give a one-day update from the field for students doing a master's in PLM & BIM on top of their engineering/architectural background. Five years ago, there was no mention of BIM; now the ratio of BIM-oriented students has become significant. For me, it is always great to see young students willing to learn PLM or BIM on top of their own skillset. Read more about this particular master class in French when you click on the logo to the left.

In 2012, I started to explain PLM benefits to EPC companies (Engineering Procurement Construction), targeting a more profitable and efficient delivery of their constructions (oil platform, plant, building, infrastructure). The simplified reasoning behind using PLM was related to more efficient and higher-quality multidisciplinary collaboration, reducing costly fixes during construction, and smoothing the intensive process of data handover.

More and more in the process industry, standards like ISO 15926 (process industry) and ISO 19650 (BIM – mainly in the UK) became crucial. At that time, it was difficult to convince companies to focus on the horizontally integrated process instead of dedicated, disconnected tools. Meanwhile, this has changed, thanks to the Digital Twin hype. Let's have a look.

PLM and ALM

The initial value of using PLM concepts complementary to MRO systems came from the fact that MRO systems mainly focus on plant operations. You could compare these systems with ERP systems for manufacturing companies, focusing on execution and continuous operation. Scheduled maintenance and inspections are also driven by the MRO system. Typical MRO systems are Maximo and SAP PM. PLM could deliver configuration management, linking the design intent to the physical implementation, and therefore provide higher data quality, visibility, and traceability of the asset history.

The SmarTeam data model for Asset Lifecycle Management

In 2010, I shared these concepts in two posts: Asset Lifecycle Management using a PLM-system and PLM for Asset Lifecycle Management and Asset Development, based on lessons learned with some (nuclear) plant owner/operators. They started to discover the need for configuration management to ensure data quality for operations. In 2010-2014, the business case for using PLM complementary to MRO was data quality and therefore reduced downtime when executing large maintenance programs (dependencies between the individual projects were not visible without PLM).

In MRO-systems, like in ERP-systems, the data for execution is based on information coming from various engineering sources – specifications, PFDs, P&IDs.  Questions owner/operators ask themselves are:

  • What are the designed operational settings?
  • Are the asset parameters currently running as designed?
  • What is the optimized maintenance period?
  • Can we stretch maintenance intervals?
  • Can we reduce inspections?
  • Can we reduce downtime for maintenance and overhaul?
  • What about predictive maintenance?

Most of these questions are answered by experts who use their tacit knowledge and experience to give the best answers so far. And when the answers were wrong, they were accepted as new learning points: next time we won't make this mistake, and the experts become even more knowledgeable.

Now, these questions could be answered if you can model your asset in a virtual environment. In the virtual world, you would use simulation models, logical models, and 3D models to describe the asset. This is where Model-Based Systems Engineering practices are used. However, these models need to be calibrated based on reality. And that is where IoT and asset operation monitoring come in, connecting physical behavior with virtually predicted behavior. You can read more about this relationship in my post: Will MBSE be the new PLM instead of IoT?
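A minimal sketch of that calibration idea, with hypothetical parameter names, values, and tolerances: compare the designed operational settings coming from the virtual asset definition with measured values coming from asset monitoring, and flag where the physical asset drifts away from the design intent.

```python
from typing import Dict, List

# Hypothetical designed operational settings, as they could come from the virtual asset definition
DESIGNED_SETTINGS = {
    "pump_pressure_bar": 6.0,
    "bearing_temp_C": 65.0,
    "flow_rate_l_min": 120.0,
}

def check_against_design(measured: Dict[str, float], tolerance: float = 0.10) -> List[str]:
    """Return the parameters deviating more than `tolerance` (relative) from their designed value."""
    deviations = []
    for name, designed in DESIGNED_SETTINGS.items():
        value = measured.get(name)
        if value is not None and abs(value - designed) / designed > tolerance:
            deviations.append(f"{name}: measured {value}, designed {designed}")
    return deviations

# Example reading as it could arrive from an IoT / asset-monitoring feed
reading = {"pump_pressure_bar": 6.9, "bearing_temp_C": 66.0, "flow_rate_l_min": 101.0}
for issue in check_against_design(reading):
    print("Deviation:", issue)
```

In reality, the comparison runs against simulation results rather than static set points, but the principle is the same: the virtual and the physical asset are continuously confronted with each other.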

PLM and BIM

In 2014, I started to discuss PLM concepts with EPC companies (Engineering, Procurement, and Construction), mainly in the Oil & Gas industry. Here, excellent asset development tools (AVEVA, Intergraph, Bentley) are the standard, as the purpose of an EPC company is to deliver a plant or platform. Each software tool has its purpose, and there is no lifecycle strategy. The value PLM could bring was providing a program overview (complementary to Primavera), standardization, multidisciplinary coordination, and visibility across projects to capture knowledge.

Most of the time, the EPC companies did not see the value of optimizing themselves, as this was accepted in the process – even while their productivity losses and costs due to poor quality (fixing during construction/commissioning) were absurd (10-20% of the project budget). Cultural change – think first instead of fix later – was hard to explain. In the end, the EPC was not responsible for operations, so why bother that much?

My blog posts: PLM for all Industries and 2014 – the year that the construction industry did not discover PLM illustrate the challenge at that time. None of the EPCs and construction companies had the conviction that improving collaboration based on information continuity (not data-driven yet) could bring significant benefits, despite their relatively low profit margins (1-3% is considered excellent). Breaking the silos was too hard.

Two recent trends, however, changed the status quo that existed.

First of all, more and more, the owner/operator does not want to be responsible for the maintenance and operations of the asset. The typical EPC companies have now become DBO companies (Design, Build and Operate). This requires lifecycle thinking, as most of the costs of an asset occur during its maintenance and operation phase.

Advanced Thinking (read: (Model-Based) Systems Engineering) can help these companies to shift their focus on a more sustainable design of the asset for the future and get rewarded for that. In the old EPC-model, the target was “just” to deliver as specified.

A second significant trend is the availability of cloud infrastructure for the construction world. A cloud infrastructure does not require considerable investment from the stakeholders in a construction project. By introducing BIM in a common data environment (CDE), an infrastructure comparable to PLM is created, and likely the Maintenance-and-Operate stakeholder is eager to have the full virtual definition there for the future.

Read more about BIM and CDE for example, here: CDE – strategic BIM process tool.

Of course, technology and standards are there to collaborate. Now it is up to the stakeholders involved to develop new skills for collaboration (learn or hire) and implement them through new ways of working. A learning process can never be pushed by a big-bang, so make sure your company operates in two modes while learning.

As I mentioned, the Maintenance-and-Operate stakeholders, or in traditional cases the Owner/Operators, are incredibly interested in a well-defined virtual model of the asset. This allows them to analyze and simulate the implementation of fixes and enhancements for the future with an optimum result. Again, we are talking about a digital twin of the asset here.

Conclusion

Even though the digital twin is at the top of the Gartner hype cycle, it has already become a vital principle to implement, in particular for substantial, critical assets. For these precious assets, minor inefficiencies in data continuity can still be afforded in order to learn. From the moment companies have established digital continuity between their virtual and physical assets, the Digital Twin concept can also become profitable (and required) for other industries, in particular when these companies want to deliver their products as a service.

 

Note: I have been talking a lot this year about the challenges of digital transformation applied to PLM in particular. During PI PLMx London 2020 on February 3 and 4, I will lead a Think Tank session related to the challenge of connecting your PLM transformation to your executives' vision (and budget). See you there?

Potential digital transformation is everywhere. This time I want to share a personal story based on my IoT cycling device from Garmin. Several years ago, I became an enthusiastic cyclist, mainly because cycling clears your mind and keeps you in good shape after enjoying customer visits with great dinners and excellent breakfasts. As the Dutch lack real mountains, we challenge ourselves by riding through open fields with strong winds, so we suffer a little too.

 

Four years ago, I started tracking my cycling performance with a Garmin Edge 810. The story of my Garmin is a real IoT story. GPS trackers, in the beginning, did not communicate with the outside world. Now, this device connects to sensors registering my speed, my location, my heart rate, my pedal cadence and my produced power at any time, finally uploading it all to the Garmin Connect platform.

The IoT platform

The Garmin Connect platform gives me insights into my performance, activities, and segments. The segments demonstrate the social part of the platform. Here you can see how you rank against others who have cycled the same track segment over time. And you can register your own preferred segment too, where you challenge yourself and others in your area. So the number of segments is growing continuously. Imagine all these cyclists around the world virtually sharing and riding the same tracks. I am curious to learn from Garmin how many people are connected to the platform.
I could not find these numbers. You?

The fun of segments

Digital Twin

Through the platform, Garmin collects huge amounts of data from connected users. Each connected user’s data set could be considered a simple digital twin. The Connect platform provides me with insights into my overall performance over the years through various reports. Garmin could offer a (paid) service delivering insights into my performance compared to other users and proposing predictive enhancements, similar to the GE Predix platform. The difference, of course, is that a 1 % performance improvement for me in cycling does not bring the same value as a 1 % performance improvement of a GE product (turbine, jet engine, train, …). However, the concept is the same, and GE is promoting themselves as the next Digital Industrial Company, leading in digital transformation. Read more here.

Digital Twin performance
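To make the “simple digital twin” idea tangible, below is a minimal sketch of how a rider could summarize exported ride data and compare recent performance against a longer-term baseline. The CSV file name and its columns (date, avg_power_w) are assumptions for illustration, not Garmin’s actual export format.

```python
# A toy illustration of treating a rider's data set as a simple "digital twin":
# summarizing exported ride data and comparing recent performance against the
# longer-term baseline. The CSV columns are an assumption, not Garmin's export format.
import csv
from statistics import mean

def ride_summaries(path: str):
    """Yield (date, average power in watts) per ride from a hypothetical export."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield row["date"], float(row["avg_power_w"])

def performance_trend(path: str, recent_rides: int = 10) -> float:
    """Return the % change of the last N rides versus the overall average power."""
    powers = [power for _, power in ride_summaries(path)]
    baseline = mean(powers)
    recent = mean(powers[-recent_rides:])
    return (recent - baseline) / baseline * 100

if __name__ == "__main__":
    print(f"Recent power vs baseline: {performance_trend('rides.csv'):+.1f}%")
```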

Connecting to the customer

The move from a document-driven approach towards a data-driven approach to collecting and storing information is not the main concept behind a digital transformation. The data-driven approach is an enabler to connect directly to the customer and to change the current business model from delivering products into a business model delivering services, or even more advanced, delivering experiences. Services and experiences create a closer relationship with the customer and more loyalty, but also the challenge that you need to connect to the customer in such a way that the customer sees value. Otherwise, the customer will switch to another service or experience. The Apple, Nespresso and Uber experiences are all known for their new ways of connecting to the customer, differentiating them from traditional product sales. Garmin could also be on that list. However, I discovered they are not there yet, despite an IoT platform and connected devices. What is missing?

Why Garmin is not a digital enterprise

Two years ago, my Garmin Edge started crashing in the middle of a ride. The system rebooted after some minutes, and the recordings were lost or at least unreadable. When I contacted Garmin support, their standard response was: “Please reset the device and update to the latest software.” Two years ago, the software was still receiving bug fixes. After two years, you would expect a stable experience.

However, a year ago the problems started to become more frequent. I started to send log files illustrating where the error occurred. Still, the Garmin response was the same: “Please reset the device and update to the latest software.” However, as there were no new software updates, there had to be another reason why the device failed more and more.

After pushing for a resolution, the service department concluded I needed a new device; there might be an issue with the hardware. A little skeptical, I agreed to a hardware swap again, and as expected, this did not solve the crashes. My guess is that due to the increasing number of segments in some places, the software gets confused about where exactly the rider is located and in which direction the rider is going. These are the moments when the crash happens, and this is probably a software issue.

Still, the Garmin help desk believes there is a hardware problem (preferably swap the device), while I kept providing crash data as evidence to support Garmin in their error discovery. Until now, there is no resolution. The good news is that Garmin support mentioned they are investigating further.

For me, the interaction with Garmin illustrates that the company has not yet been digitally transformed internally. The service desk probably has KPIs (Key Performance Indicators) related to their response time and problem resolution time. Although I can debate the response time, it is clear that the problem resolution approach – update to the latest software, and if this does not work, swap to a new device – does not increase Garmin’s knowledge as a company of what their customers are experiencing.

Apparently, their software management is disconnected from the service department and customers. Only clear bugs during the first launch are fixed. Next, it is a disconnected world again.

A must for a digital enterprise is to dive into customer issues and to connect them back to R&D, both for the hardware part and software part. Something a modern product manager would do. If a company is not able to understand the multidisciplinary dependencies and solve issues from the field (with some effort), they will keep on making the same mistakes again with new product launches and lose customers who are looking for a better experience.

My conclusion

PLM should be part of the digital enterprise too as this is the only way to deliver consistent customer value and positive experience. It requires companies to break down silos and create multidisciplinary teams that are capable of supporting the full customer journey. A digital device and a digital customer platform are just facades to the outside world – the inside needs to change too.

What do you think?
Does your company understand the challenges to transform across all disciplines?
Are you managing PLM, ALM, and IoT in context of the product and across the whole lifecycle?
I am curious!

Two terms pass me every day: Digital Transformation, which appears in every business discussion, and IP Security, a topic also discussed in all parts of society. We realize it is easy to steal electronic data without being detected (immediately).

What is Digital Transformation?

Digital Transformation is reshaping business processes to enable new business models, create a closer relationship with the market, and react faster while reducing the inefficiencies of collecting, converting and processing analog or disconnected information.

Digital Transformation became possible thanks to the lower costs of technology and global connectivity, allowing companies, devices, and customers to interact in almost real-time when they are connected to the internet.

IoT (Internet of Things) and IIoT (Industrial Internet of Things) are terms closely related to Digital Transformation. Their focus is on creating connectivity with products (systems) in the field, providing a tighter relationship with the customer and enabling new (upgrade) services to gain better performance. Every manufacturing company should be exploring IoT and IIoT possibilities now.

Digital Transformation is also happening in the back office of companies. The target is to create a digital data flow inside the company and with the outside stakeholders, e.g., customers, suppliers, authorities. The benefits are mainly improved efficiency, faster response and higher quality interaction with the outside world.

The part of Digital Transformation that concerns me the most is the domain of PLM. As I have stated in earlier posts (Best Practices or Next Practices? / What is Digital PLM?), the need is to replace the classical document-driven product-to-market approach with a modern data-driven interaction of products and services.

I am continually surprised that companies with an excellent Digital Transformation profile on their websites have no clue about Digital Transformation in their product innovation domain. Marketing is faster than reality.

I am happy to discuss this topic with many of my peers in the product innovation world at PI Berlin 2017, three weeks from now. I am eager to learn how and why companies do not embrace Digital Transformation sooner and faster. The theme of the conference, “Digital Transformation: From Hype to Value”, says it all. You can find the program here, and I will report about this conference the weekend after.

IP Security

The topic of IP protection has always been high on the agenda of manufacturing companies. Digital Transformation brings new challenges. Digital information will be stored somewhere on a server and probably through firewalls connected to the internet. Some industries have high-security policies, with separate networks for their operational environments. Still, many large enterprises are currently struggling with IP security policies as sharing data while protecting IP between various systems creates a lot of administration per system.

Cloud solutions for sharing data are still a huge security risk. Where is the data stored, and who else has access to it? Dropbox came into the news recently as “deleted” data came back after five years, “due to a bug.” Cloud data sharing cannot be trusted for really sensitive information.

Cloud providers always claim that their solutions are safer, thanks to their strict security procedures, compared to the careless behavior of employees. And this is true. For example, a company I worked with had implemented Digital Rights Management (DRM) for internal sharing of their IP, making sure that users could only read information on the screen and not store it locally. When they had an issue with the server: “No problem”, one of the employees said, “I have a copy of the documents here on my USB drive.”

Cloud-based PLM systems are supposed to be safer. However, it still matters where the data is stored; security and hacking policies vary per country. Assume your company’s IP is safe from hacking. Then the next question is: “How about ownership of your data?”

Vendor lock-in and ownership of data are topics that always come back at the PDT conferences (see my post on PDT2016). When a PLM cloud provider stores your product data in a proprietary data format, you will always be forced into a costly data migration project when you decide to move away from that provider.

Why not use standards for data storage? Hakan Kårdén triggered me on this topic again with his recent post: Data Is The New Oil So Make Sure You Ask For The Right Quality.
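As an illustration of that point, here is a minimal sketch of the neutral-storage idea: keeping a product structure in a plain, self-describing format that can be read back without the originating system. In practice a standard such as STEP or PLCS would be the proper choice; the JSON layout below is only an assumption for illustration, not any standard’s schema.

```python
# A toy illustration of the "neutral format" idea: writing a product structure
# to a plain, self-describing JSON file so it can be read back without the
# originating system. This JSON layout is an illustration only, not a standard's schema.
import json

product_structure = {
    "part": {"id": "P-1000", "name": "Pump assembly", "revision": "B"},
    "children": [
        {"id": "P-1001", "name": "Housing", "revision": "A", "quantity": 1},
        {"id": "P-1002", "name": "Impeller", "revision": "C", "quantity": 1},
    ],
}

# Export: human-readable, documented fields, no proprietary binary container
with open("pump_assembly.json", "w") as f:
    json.dump(product_structure, f, indent=2)

# Re-import elsewhere without the original vendor's software
with open("pump_assembly.json") as f:
    restored = json.load(f)
print(restored["part"]["name"], "rev", restored["part"]["revision"])
```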

 

Conclusion:

Digital Transformation is happening everywhere, but not always at the same pace and with the same focus. New PLM practices still need to be implemented on a larger scale to become best practices. Digital information in the context of Intellectual Property creates extra challenges to be solved. Cloud providers do not yet offer solutions that are safe and avoid vendor lock-in.

Be aware. To be continued…

Many thanks (again) to Dick Bourke for his editing suggestions

 

The past year I have written about PLM in the context of digital transformation, relevant for companies that deliver products to the market. Some years ago, I advocated the value of a PLM infrastructure for EPC companies and Owners/Operators of a plant.

EPC stands for Engineering, Procurement and Construction, a typical set-up for often large, capital-intensive projects executed by a consortium of companies. Together they create buildings, platforms, plants, infrastructure and other one-off deliveries, which come under the control of the Owner/Operator after going live.

Some references:

2014 EPC related: The year the construction industry did not discover PLM

2013 Owner/Operators related: PLM for all industries?

As you can see from the dates, these are not my most recent posts. Meanwhile, EPC-based businesses are discovering the value of a PLM infrastructure. The main component for them is BIM (Building Information Model or Building Information Management), and they use cloud-based collaboration environments to be more cost-efficient. Slowly these companies are moving to a single repository of data supporting multidisciplinary collaboration around a BIM model, to guarantee continuity of data and better execution.

I am positive about EPC companies discovering the value of PLM. It might be slightly different from classical product-selling companies, mainly because data ownership is different. In an EPC environment, many companies are responsible for parts of the data, and each of them keeps the real knowledge as IP (Intellectual Property) for themselves. They only “publish” deliverables. For companies that deliver products to the market, the OEM keeps responsibility for all relevant product information and has a different strategy.

 

I worked in the past with one of my peers, Bjorn Fidjeland (www.plmpartner.com), on PLM for EPCs and Owner/Operators. We share the same passion for bringing PLM outside traditional industries. As Bjorn is now more active than I am in this domain, I recommend reading Bjorn’s posts on this topic. For example:

EPC related 2016: Handover to logistics and supply chain in capital projects

Owner/Operators 2015: Plant Information Management – Information Structures

Bjorn provides a lot of details, which are important as implementing PLM for EPCs or Owner/Operators requires different data structures. I wrote about these concepts in 2014 in two posts – PLM and/or SLM? post 1 and post 2 – at that time not realizing the virtual twin was becoming popular.

PLM complementary to EAM

Over the last year, I have explored these concepts together with (potential) Owner/Operators of a plant, where PLM would be complementary to their EAM system. In the world of Owner/Operators, Enterprise Asset Management (EAM) software is the main software these companies use. You can find some of the major EAM players here.

You will discover that all these software suites are good for plant operations, but they all struggle to support data consistency and quality, in particular when dealing with plant changes and efficient, high-quality plant information management. Versioning and status management, typical PLM capabilities, are often not there.

Owner/Operators have challenges with their EAM environments because:

  • EAM systems are designed to support an as-operated environment, assuming all data is correct. Support for Maintenance, Repair or Overhaul projects is often rudimentary and depends on document-driven processes. The primary business process of these companies is producing continuously, for example electricity or chemicals. Therefore, typical engineering projects to change or enhance the main production process do not get the same financial focus.
  • A document-driven approach is the de facto standard in these industries, most of the time because the plant has been established through an EPC approach, which was 100 % document-driven due to the disconnected disciplines/tools working in the EPC project at that time. As the asset information is stored and delivered in documents, most owners/operators keep the document-driven approach for future change projects.

Owners/operators can benefit significantly from a data-driven PLM system as a complementary infrastructure to their EAM system. The PLM system will be the source of accurate asset information, manage the changes and approvals for the assets, and ultimately push the newly released information to the EAM system. The PLM system will offer the full history and traceability of decisions made, which is important for regulatory bodies or insurance companies.

A data-driven approach for asset information allows owners/operators to benefit from efficient processes, strongly reducing the number of people required to process data (documents) and reducing the time people working in maintenance and operations spend searching for data. I found a nice slide from IBM explaining the concept of PLM and EAM collaboration – see below:

Slide: IBM – PLM and EAM collaboration

The same benefits that modern digital enterprises get from a data-driven approach will become available to owners/operators. Operational management is supported by the EAM system, combined with the real-time capabilities provided by a modern PLM system to analyze, design and deliver changes to the plant without a costly data conversion process (e.g., compiling new documents) and without disconnected processes.
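To picture this hand-over in its simplest form, here is a minimal sketch of pushing released asset information from a PLM system to an EAM system. All URLs, endpoints and field names are hypothetical assumptions for illustration; they do not represent any specific vendor’s API.

```python
# Minimal sketch of a PLM-to-EAM hand-over, assuming both systems expose
# simple (hypothetical) REST endpoints. Only asset data that reached the
# "Released" status in PLM is pushed to the EAM system.
import requests

PLM_URL = "https://plm.example.com/api/assets"   # hypothetical PLM endpoint
EAM_URL = "https://eam.example.com/api/assets"   # hypothetical EAM endpoint

def push_released_assets(change_order_id: str) -> None:
    # 1. Collect the assets released under a specific engineering change order
    released = requests.get(
        PLM_URL, params={"changeOrder": change_order_id, "status": "Released"}
    ).json()

    # 2. Push each released asset definition to the EAM system,
    #    keeping the PLM revision as the traceability link
    for asset in released:
        payload = {
            "tag": asset["functionalLocation"],
            "description": asset["description"],
            "plmRevision": asset["revision"],   # traceability back to PLM
        }
        response = requests.put(f"{EAM_URL}/{asset['functionalLocation']}", json=payload)
        response.raise_for_status()

if __name__ == "__main__":
    push_released_assets("ECO-2017-042")
```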

Moving to a virtual twin

Interestingly enough, digital transformation is bringing the concepts of connecting engineering, manufacturing and operations together into an infrastructure of interacting digital platforms. Where owners/operators historically did not focus on optimizing the engineering processes to build and maintain their assets, companies in the “classical” industries were not really focusing on how their products behaved in the field after they were delivered. With digital continuity (the digital thread) and IoT, these “classical” companies can now connect to their products in the field. Their products become assets of information, and if these companies change their business offering into leasing products and services, these assets become managed assets, like the assets owners/operators are managing.

The concept of a virtual twin (or digital twin – image courtesy of GE), where a virtual model-based environment is linked to one or more real instances in operation, is the dream of all industries. Preparing, simulating and verifying changes in a virtual world is so much more efficient and cheaper that it allows for higher product quality, and in the case of plant operators, higher safety will be the number one topic.

Conclusion

What I have learned so far from plant owners/operators is that they are struggling to grasp the modern digital enterprise concept, as their current environment is not model-based but document-driven. Starting with PLM to complement their EAM system could be a first step to understand the value and business benefits of digital continuity. It requires a new way of thinking, which is not a commodity at this time. It will happen in the next 5 to 10 years, driven by the realization of virtual twins in the industry and further BIM maturity. The future is model-based!

p.s. I am happy to announce that WordPress has added a new feature to my blog. In the side panel, you can now choose your language (based on Google Translate) if you have difficulties with English. Enjoy!

As a genuine Dutchman, I was able to spend time last month in the Netherlands, and I attended two interesting events: BIMopen 2015, where I was invited to speak about what BIM could learn from PLM (see the Dutch review here), and a second event, Where Engineering meets Supply Chain, organized by two startup companies located in Yes!Delft, an incubator working closely with the Technical University of Delft (Dutch announcement here).

Two different worlds, and I realized later that they potentially have the same future. So let’s see what happened.

BIMopen 2015

BIMopen 2015 had the theme “From Design to Operations”, and the idea of the conference was to bring together construction companies (the builders) and facility managers (the operators) to discuss the business value they see from BIM.

First, I have to mention that BIM is a confusing TLA, like PLM, with many interpretations of what it means. For me, when I talk about BIM, I mean Building Information Management. In a narrower meaning, BIM is often considered a Building Information Model – a model that contains all multidisciplinary information. The latter definition does not deal with typical lifecycle operations, like change management, planning, and execution.

The BIMopen conference started with Ellen Joyce Dijkema from BDO consultants, who addressed the cost of failure and the concepts of lean thinking. The high cost of failure is known and accepted in the construction industry, where at the end of the year profitability can be 1 % of turnover (with a margin of +/- 3 %, so being profitable is hard).

Lean thinking requires a cultural change, which according to Ellen Joyce is an enormous challenge: according to a study by Prof. Dr. A. Cozijnsen, there is only a 19 % chance this will be successful, compared to a 40 % chance of success for new technology and a 30 % chance for new work processes.


It is clear that changing culture is difficult, and in the construction industry it might be even harder. I had the feeling that a large part of the audience did not grasp the opportunity or could not find a way to apply it to their own world.

My presentation about what BIM could learn from PLM had a similar message. Construction companies have to spend more time on upfront thinking instead of fixing things later (costly). In addition, thinking about the whole lifecycle of a construction, including operations, can bring substantial revenue for the owner or operator of a construction. Where traditional manufacturing companies take the entire lifecycle into account, this is still not understood in the construction industry.

This point was illustrated by the fact that there was only one person in the audience whose primary interest was to learn what BIM could contribute to his job as a facility manager, and halfway through the conference he was still not convinced BIM had any value for him.


A significant challenge for the construction industry is that there is no end-to-end ownership of data; a single company responsible for all the relevant and needed data does not exist. Ownership of data can result in legal responsibility at the end (if you know what to ask for), and in a risk-shifting business like the construction industry, companies try to avoid responsibility for anything that is not directly related to their primary activities.

During the conference, some larger companies, like Ballast Nedam and HFB, talked about the need for a centralized database to collect all the data related to a construction (project). They were building these systems themselves, probably because they were not aware of PLM systems or could not see past the initial complexity of a PLM system, therefore deciding a standard system would not be enough.

I believe this is short-term thinking. With a custom system you can get quick results and user acceptance (it works the way the user asks for it); however, custom systems have always become a blockage for the future after 10-15 years, as they are developed with the mindset of that time.

If you want to learn more about my thoughts, have a look at 2014 – the year the construction industry did not discover PLM. I will write a new post at the end of the year with some positive trends. Construction companies are starting to realize the benefits of a centralized, data-driven environment instead of shifting documents and risks.

The cloud might be an option they are looking for. Which brings me to the second event.

Engineering meets Supply Chain

This was more of an interactive workshop/conference, where two startups, KE-works and TradeCloud, illustrated the individual value of their solutions and how they could work in an integrated way. I had been in touch with KE-works before, because they are an example of the future trend: platform thinking. Instead of having one (or two) large enterprise system(s), the future is about connecting data-centric services, most of which can run in the cloud for scalability and performance.

KE-works provides a real-time workflow for engineering teams based on knowledge rules. Their solution runs in the cloud but connects to the systems used by their customers. One of their clients, Fokker Elmo, explained how they want to speed up their delivery process by investing in a knowledge library using KE-works knowledge rules (an approach the construction industry could apply too).


In general, if you look at what KE-works does, it is complementary to what PLM systems or platforms do. They add the rules for the flow of data, where PLM systems are more static and depend on predefined processes.

TradeCloud provides a real-time platform for the supply chain, connecting purchasing and vendors through a data-driven approach instead of exchanging files and emails. TradeCloud is another example of a collection of dedicated services, targeting, in this case, the bottom of the market. TradeCloud connects to the purchaser’s ERP and can also connect to the vendor’s system through web services.

The CADAC group, a large Dutch Autodesk solution provider, also showed their web-services-based solution connecting Autodesk Vault with TradeCloud to make sure the right drawings are available. The name of their solution, the “Cadac Organice Vault TradeCloud Adapter”, is more complicated than the solution itself.

What I saw that afternoon was three solution providers connected through the cloud and web services to support a part of a company’s business flow. I could imagine that adding services from other companies, like OnShape (CAD in the cloud), Kimonex (BOM management for product design in the cloud) and probably 20 more candidates, could already deliver a simplified business flow in an organization without having a single, large enterprise system in place that connects it all.
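As a thought experiment of how such connected services could talk to each other, below is a minimal sketch of one service forwarding a “drawing released” event to a supply-chain platform over plain web services. All endpoints, URLs and payload fields are hypothetical assumptions; this is not the actual KE-works, TradeCloud or Cadac API.

```python
# Minimal sketch of the "connected services" idea: when an engineering service
# reports that a drawing has been released (via a webhook), the relevant
# reference is forwarded to a supply-chain service over REST.
# All endpoints and payload fields are hypothetical, for illustration only.
from flask import Flask, request
import requests

app = Flask(__name__)
SUPPLY_CHAIN_URL = "https://supplychain.example.com/api/orders"  # hypothetical

@app.route("/drawing-released", methods=["POST"])
def drawing_released():
    event = request.get_json()
    # Forward only the data the supplier needs: part number, revision, file link
    payload = {
        "partNumber": event["partNumber"],
        "revision": event["revision"],
        "drawingUrl": event["drawingUrl"],
    }
    requests.post(f"{SUPPLY_CHAIN_URL}/{event['orderId']}/attachments", json=payload)
    return {"status": "forwarded"}, 200

if __name__ == "__main__":
    app.run(port=5000)
```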

The Future

I believe this is the future and potentially a breakthrough for the construction industry. As the connections between the stakeholders can vary per project, having a configurable combination of business services supported by a cloud infrastructure enables an efficient flow of data.

As a PLM expert, you might think all these startups with their solutions are not good enough for the real world of PLM. And currently, they are not – I agree. However, disruption always comes unnoticed. I wrote about it in 2012 (The Innovators Dilemma and PLM).

Conclusion

Innovation happens when you meet people, observe and associate in areas outside your day-to-day business. For me, these two events connected some of the dots for the future. What do you think? Will a business process based on connected services become the future?

Sometimes we have to study carefully to see patterns. Have a look here at what is possible according to some scientists (click on the picture for the article).

 
