You are currently browsing the category archive for the ‘PLM’ category.
The past half-year I have been intensively discussing potential PLM roadmaps with companies of different sizes and different levels of PLM maturity. Some companies are starting their PLM journey after many years of discussion, still trying to identify the need and scope; others have an old PLM implementation (actually, most of the time it is cPDM) and discover that business paradigms from the previous century are no longer sufficient for the future.
The main changing paradigms are:
- From a linear, product-driven delivery process towards an individual, customer-focused offering based on products and effective services, quickly adapting to market needs.
- From document-driven processes and systems based on exchanging electronic files towards data-driven platforms that let information flow in almost real-time through the whole enterprise.
Both changes are related and a result of digitization. New practices are under development as organizations learn how to prepare and aim for the future. These new practices currently dominate the agenda of all strategic consultancy firms, as you cannot neglect the trend towards a digital enterprise. And these companies need next practices.
And what about my company?
It is interesting to see that most PLM implementers and vendors promote best practices, based on their many years of experience working with customers who contributed to the functionality in their portfolio.
And it is very tempting to make your customer feel comfortable by stating:
“We will implement our (industry) best practices and avoid customization – we have done that before!”
I am sure you have heard this statement before. But what if these best practices address the old paradigms from the past?
Do you want to implement the past to support the future?
Starting with PLM? Use Best Practices!
If the company is implementing PLM for the first time and the implementation is bottom-up, you should apply the old PLM approach. My main argument: this company is probably not capable of, or ready for, working in an integrated way. It is not in the company´s DNA yet. Sharing data and working in a controlled environment is a big step to take. PLM implementations have often failed at this point because the cultural resistance was too big.
When starting with classical PLM, avoid customization and keep the scope limited. Horizontal implementations (processes across all departments) have more success than starting at engineering and trying to expand from there. An important decision to make at this stage is whether 2D is leading (old) or the 3D model is leading (modern). Some future thoughts: How Model-based definition can fix your CAD models. By keeping the scope limited, you can always evolve to the next practices in 5-10 years (if your company is still in business).
Note 1: the remark between parentheses is a little cynical and perhaps incorrect for the timeframe. Still, a company working bottom-up will struggle to stay competitive in a modern global environment.
Note 2: When writing this post I was notified about an eBook with the title Putting PLM within reach, written by Jim Brown. The focus is on cloud-based PLM solutions that require less effort/investment on the IT side and, as a side effect, discourage customization (my opinion) – therefore a good start.
Evolving in PLM – Next Practices
Enterprises that have already had a PDM/PLM system in place for several years should not implement the best practices. They have reached the level where the inhibitors of a monolithic, document-based environment are becoming clear.
They (must) have discovered that changing their product offering or their innovation strategy, now with partners, adds complexity that cannot be supported easily. The good news: when you change your business model and product offering, there is C-level attention. These kinds of changes do not happen bottom-up.
Unfortunately, business changes are often discussed at the execution level of the organization without the understanding that the source of all product or offering data needs to be reorganized too. PLM should be a part of that strategic plan – and do not confuse the old PLM with the PLM for the future.
The PLM for the future has to be built upon next practices. These next practices do not exist out of the box. They have to be matured and experienced by leading companies. There is a price you pay for being a leader. Still, being a leader brings market share and profit your company cannot achieve as a follower.
The Bi-modal approach
As the management of a company, you do not want a disruptive switch from the existing environment to a new one. Too much risk and too disruptive – people will resist – stress and bad performance everywhere. And as the new data-driven approach is still under development (we are learning), the end target is still moving.
Evolving from the old PLM system towards the new PLM approach is not recommended. This would be too expensive, slow and cumbersome. PLM would get a bad reputation as all the complexity of the past and the future comes together. It is better to start the new PLM on a new business platform, with customer-oriented processes for a limited offering, and connect it to your legacy PLM.
Over the years the new PLM will become clearer and grow, while the old PLM will become less and less relevant. Depending on the dynamics of your industry, this might take from a few years to decades.
It must and will be a business-driven learning path for new best practices
Best Practices and Next Practices are needed in parallel. Depending on the maturity of your company and how little information it shares today, you can choose. Consider the bi-modal approach to choose a realistic time path.
What do you think? Could this simplified way of thinking help your company?
Coming back from holiday (a life without PLM), it is time to pick up blogging again. And like every start, the first step is to take stock of where we are now (with PLM) and where PLM is heading. Of course, it remains an opinion based on dialogues I had this year.
First, and perhaps this is biased, there is hype in the LinkedIn groups and blogs that I follow, a kind of enthusiasm coming from OnShape and Oleg Shilovitsky´s new company OpenBOM: the hype of cloud services for CAD/data management and BOM management.
Two years ago I discussed at some PLM conferences that PLM should not necessarily be linked to a single PLM system. The functionality of PLM might be delivered by a collection of services, most likely cloud-based, together providing support for the product lifecycle. In 2014 I worked with Kimonex, an Israeli startup that developed a first online BOM solution targeting early design collaboration. Their challenge was to find customers willing to start this unknown journey. Cloud-based meant real-time collaboration, and this is also what Oleg wrote about last week: Real-time collaborative edit is coming to CAD & PLM
Real-time collaboration is one of the characteristics of a digital enterprise: thanks to the fact that information is stored as data, it can flow rapidly through an organization. Data can be combined and used by anyone in the organization in a certain context. This approach removes the barriers between PLM and ERP. In my opinion, there is no barrier between PLM and ERP. The barrier that companies create exists because people believe PLM is a system and ERP is a system. This way of (system) thinking comes from the previous century.
So is the future about cloud-based, data-driven services for PLM?
In my opinion, systems are still the biggest inhibitor for modern PLM. Without any statistical analysis, based on my impressions and gut feeling, this is what I see:
- The majority of companies that say they DO PLM actually do PDM. They believe it is PLM because their vendor is a PLM company and they have bought a PLM system. However, in reality, the PLM system is still used as an advanced PDM system by engineering to push information (sometimes still manually) at the end into the well-established ERP system. Check in your company which departments are working in the PLM system – anyone beyond engineering?
- There is a group of companies that have implemented PLM beyond their engineering department, connecting to their suppliers in the sourcing and manufacturing phase. Most of the time the OEM forces its suppliers to deliver data in an old-fashioned way or, sometimes more advanced, integrated in the OEM environment. In this case, the supplier has to work in two systems: their own PDM/PLM environment and the OEM environment. Not efficient, but still the way traditional PLM vendors promote partner/supply chain integration.
This is an area where you might think that a services-based environment like OnShape or OpenBOM might help to connect the supply chain. I think so too. Still, before we reach this stage there are some hurdles to overcome:
Persistence of data
The current generation of management in companies older than 20 years grew up with the idea that “owning data” is the only way to stay relevant in business. Even open innovation is a sensitive topic. What happens with data your company does not own because it sits in the cloud, in an environment you do not own (but share)? As long as companies insist on owning the data, a service-based PLM environment will not work.
A nice compromise at this time is ShareAspace from EuroStep. I wrote about ShareAspace last year when I attended sessions from Volvo Car (The weekend after PI Munich 2016). ShareAspace was used as middleware to map and connect two PLM/PDM environments. In this way, persistence of data remains. The ShareAspace data model is based on PLCS, which is a standard in the core industries. And standards are, in my opinion, the second hurdle for a services-based approach.
A standard is often considered overhead, and the reason is that a few vendors often dominate the market in a certain domain and therefore become THE standard. Similar to persistence of data: what is the value of data that you own but can only access through a vendor´s particular format?
Good for the short term, but what about the long term? (Most of the time we do not think about the long term and consider interoperability problems a given.) A services-based PLM environment also requires support for standards to avoid expensive interfaces and a lack of long-term availability. Check in your company how important standards are when selecting and implementing PLM.
There is a nice hype around real-time collaboration through cloud solutions. For many established companies it is not good enough yet, as there is a lot of history and the mood to own data. Young companies that discover the need for a modern services-based solution might be tempted to build such an environment. For these companies, long-term availability might be a topic to study.
Note: I just realized that if you are interested in persistency and standards, you should attend PDT 2016 on 9 & 10 November in Paris. Another interesting post just published by Lionel Grealou: Single Enterprise BOM: Utopia vs Dystopia also touches this subject.
Summer holidays are upcoming. Time to look back and reflect on what happened so far. As a strong believer that a more data-driven PLM is required to support modern customer-focused business models, I have tried to explain this message to many individuals around Europe with mixed success.
Compared to a year ago, the notion of a new PLM approach, digital and data-driven, has been resonating more and more. Two years ago I presented at the Product Innovation conference in Berlin a session with the title: Did you notice PLM is changing? The feedback at that time was that it was a beautiful story, probably happening in the far future. Last year in Düsseldorf (my review here), the digital trend(s) became clearer. And this year in Munich (my review here), people mentioned that the upcoming changes were unavoidable, in particular in relation to IoT and how it could drastically change existing business models.
For me, the enjoyable thing about the PI conferences is that they give a snapshot of what people care most about in the context of their product development and in particular PLM. When you are busy in day-to-day business, everything seems to move slowly forward. However, looking back, I must admit the pace of change has increased dramatically – it is not the same pace as five or ten years ago.
Something is happening, and it happens fast!
And here I want to encourage my readers to step back for a moment from day-to-day business and look around at what is happening, in business and in the world. It is all related!
Jobs are disappearing in the middle class due to automation and direct connectivity with customers creates new types of businesses. Old jobs will never come back, not even when you close your border. And this is what worries many societies. This global, connected world has created a new way of doing business, challenging old and traditional businesses (and people) as their models become obsolete.
The primary reaction is to try to lock the discomfort outside. Let´s act as if it never happened and just switch back to the good life of the previous century or centuries.
To be honest, it is all about the discomfort this new world brings us. This new world requires new skills, in particular personal skills: to develop continuously, learn and adapt for the future. Closing your mind to the future by hanging on to the past only brings you further away from the future and creates more discomfort.
Are you talking PLM?
Yes, the previous section was very generic, however also valid for PLM. Modern enterprises are changing the way they are going to do business and PLM is a crucial part of that total picture. Jeff Immelt, CEO of GE, explains in a discussion with Microsoft´s CEO Satya Nadella what it takes for an organization to be ready for the future. He does not talk about PLM, he talks about the need for people to be different in attitude and responsibilities – it is a business transformation – people first. Have a look here:
And although Jeff does not mention PLM, the changing digital business paradigm will affect all classical systems: PLM, ERP, CRM. And your PLM vision and plans should anticipate such a business transformation. Implementing PLM now in the same way it has been done for the past ten years, with the processes from the past in mind, might make your company even more rigid than before. See my recent blog post: The value of PLM at the C-level.
Take this thought into consideration during your holidays. Can you be comfortable in this world by keeping on hanging on to the past, or should you consider an uncomfortable but crucial change so that your company remains flexible in future business?
My holiday this year will be in my ultimate comfort zone at the beach. Reading books, no internet, discussing with friends what moves us. Two weeks to charge the batteries for this exciting, rapidly changing world of business (and PLM). I look forward to coming back with some of my findings in my upcoming blogs.
Getting in and out of your comfort zone happens everywhere. Read this HBR article with a lot of similarities: If You’re Not Outside Your Comfort Zone, You Won’t Learn Anything
See you soon in the PLM (dis)comfort zone
If you have followed my blog the recent years, you might have discovered my passion for a modern, data-driven approach for PLM. (Read about it here: The difference between files and data-driven – a tutorial (3 posts)).
The data-driven approach will be the foundation for product development and innovation in a digital enterprise. Digital enterprises can outperform traditional businesses in such a way that within five to ten years, non-digital businesses will be considered dinosaurs (if they still exist).
In particular, a digital enterprise operates in an agile, iterative way with the customer continuously in focus, whereas traditional enterprises often work in a more linear way, pushing their products to the market (if the market is still waiting for these commodities).
Read more about this topic here: From a linear world to fast and circular?
When and how to become a digital enterprise?
It is (almost) inevitable that your company will transform into a digital enterprise too at a particular time. Either driven by a vision to remain ahead of the competition or as a final effort to stay in business, as competing against agile digital competitors is killing your market share.
One characteristic of a digital enterprise is that all benefits rely on accurate data flowing through the organization and its ecosystem. And it does not matter whether the data resides in a single system/platform, as the major vendors are promoting, or whether the data is federated and consumed by the right person in the right role. I am a believer in the latter concept, still seeing current startups trying to create the momentum to achieve such an infrastructure. Have a look at my blog buddy's company OpenBOM and Oleg's recent article: The challenges of distributed BOM handover in data-driven manufacturing
No matter what you believe at this stage, the future is about accurate data. I recently bumped into some issues related to accurate data again. Some examples:
A change in objectives is needed!
One of the companies I am working with was only focusing on individual outputs, either in their drawings (yes, the 3D model was not leading yet) and/or in their Excels (sounds familiar?). When we started implementing a PLM backbone, it became apparent during the discovery phase that we could not use any advanced search tools to gain quick wins by aggregating data for a better understanding of the information we discovered. Drawings and models did not contain any (file) properties. Therefore, the only way to understand information was by knowing its (file) name and potentially its directory. Of course, the same file could be in multiple directories, and as there were almost no properties, how to know what belongs to what item?
When discussing the future of PLM with such companies, you always hear people (mainly engineers) say:
“we are not administrators, we need to get our job done.”
This shortsighted statement is often supported by management, and then you get stuck in the past.
It is time for management and engineers to realize their future is also based on a data-driven approach. Therefore, adding data to a drawing or CAD model or, in the case of PLM, part/process characteristics becomes the job of an engineer. We have to redefine roles: in a digital enterprise there is nobody to fix data downstream. People fixing data issues are too expensive.
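The discovery problem described above can be sketched in a few lines: walk a directory tree and flag every file that carries none of the properties you would need to connect it to an item. This is a hypothetical illustration – the required property names and the `read_properties` stub are my assumptions, standing in for whatever API your CAD or PDM tool actually offers.

```python
import os

# Properties we would expect every drawing/model to carry (illustrative names).
REQUIRED_PROPERTIES = {"part_number", "description", "revision"}

def read_properties(path):
    """Stub: return the embedded file properties as a dict.
    In practice this would call your CAD or PDM vendor's API."""
    return {}  # assume nothing is filled in, as in the situation described above

def audit_directory(root):
    """Report files whose properties are missing, i.e. findable only by name."""
    findings = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            missing = REQUIRED_PROPERTIES - set(read_properties(path))
            if missing:
                findings.append((path, sorted(missing)))
    return findings
```

Such an audit gives a quick measure of how far a company is from being able to aggregate its legacy information.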
I do not want to go digital
Most companies at this time are not ready for a digital enterprise yet. The changing paradigm is relatively new. Switching now to a modern approach cannot be done, either because their culture is still based on the previous century or because they are just in the middle of a standard PLM process, just learning to share files within their (global) organization. These companies might develop an attitude:
“I do not want to go digital”
I believe this is ostrich behavior, like saying:
“I want all information printed on paper on my desk so I can work in comfort (and keep my job).”
History shows that hanging on to the past kills companies. Those companies that did not invest in the first electronic wave are probably out of business (unless they never had competition). The same goes for digital. Potentially ten years from now, it will not be affordable to work in a traditional way anymore, as labor cost and the speed of information flowing through an organization are going to be crucial KPIs to stay in business.
As Dutch, we are always seeking compromises. It helped our country become a leading trading nation, and thanks to the compromises, we struggle less with strikes than our neighboring countries. Therefore, my proposal for those who do not like digital at this stage: add just a little digital workload to your day-to-day business, preferably stimulated and motivated by your management and promoted as a company initiative. By adding as many relevant properties and as much context as possible to your work, you will be working on the digital future of your company. When the time is there to become digital, it will be much easier to connect your old legacy information to the new digital platform, speeding up the business transformation.
And of course there will be tools
If you are observing what is happening in the PLM domain, you will see more and more tools for data discovery and data cleansing appear on the market. Dick Bourke wrote an introductory article about this topic at the end of last year at Engineering.com: Is-Suspect-Product-Data-the-Elephant-in-the-Search-and-Discover-Room? Have a read to get interested.
And there are rewards
Once you have more accurate data, you can:
- Find it (saving search time)
- Create reports through automation (saving processing time)
- Apply rules (saving validation work & time or processing time)
- Create analytics (predict the future – priceless 🙂)
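As a minimal sketch of the "apply rules" and "create reports" rewards above: once part data is accurate and structured, validation becomes a set of small, automatable checks instead of manual inspection. The field names and rules below are illustrative assumptions, not taken from any particular PLM system.

```python
# Hypothetical part records, as they might be exported from a PLM backbone.
PARTS = [
    {"part_number": "P-1001", "description": "Hinge", "revision": "B", "mass_kg": 0.12},
    {"part_number": "P-1002", "description": "", "revision": "A", "mass_kg": None},
]

# Each rule is a named predicate over one part record.
RULES = {
    "has description": lambda p: bool(p["description"]),
    "mass recorded": lambda p: p["mass_kg"] is not None,
}

def validate(parts, rules):
    """Return {part_number: [failed rule names]} for parts violating any rule."""
    report = {}
    for part in parts:
        failed = [name for name, check in rules.items() if not check(part)]
        if failed:
            report[part["part_number"]] = failed
    return report

print(validate(PARTS, RULES))  # → {'P-1002': ['has description', 'mass recorded']}
```

The same report, run periodically, is also a simple analytic: tracking how the number of rule violations trends over time shows whether data quality is improving.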
We are in a transition phase in the way PLM is implemented. What is clear: no matter which stage you are in, accurate data is going to be crucial for the future. Use this awareness for your company to stay in business.
Finally, I have time to share my PLM experiences with you in this blog. The past months have been very busy as I moved to a new house, and I wanted to do and control a lot of the activities myself. Restructuring your house in an agile way is not easy. Luckily, there was a vision of how the house should look. Otherwise, the “agile” approach would have become an approach of too many fixes. Costly, and probably typical of many old construction projects.
Finally, I realized the beauty of IKEA´s modular design and experienced the variety of high-quality products from BLUM (an impressive company in Austria I worked with).
In parallel, I have been involved in some PLM discussions where, in all cases, the connection with the real C-level was an issue. And believe it or not, my blog buddy Oleg Shilovitsky just published a post: Hard to sell PLM? Because nobody gives a SH*T about PLM software. Oleg really starts from the basics, explaining that you do not sell PLM; you sell a business outcome. And in larger enterprises, I believe you currently sell the ability to do a business transformation, as business becomes digital with the customer in the center. This is the challenge I want to discuss in this post.
The value of PLM at the C-level
Believe it or not, it is easier to implement PLM (in general) than to explain to a CEO why a company needs modern PLM. A nice one-liner to close this post; however, let me explain what I mean by this statement and perhaps show the reasons why PLM does not seem so attractive at the C-level. I do not want to offend any particular PLM company, consultancy firm or implementer; therefore, allow me to stay at a neutral level.
The C-level time challenge
First, let´s imagine the situation at C-level. Recently I heard an excellent anecdote about people at C-level. When they were kids, they were probably the brightest, able to process and digest a lot of information, making their (school) careers a success. When later arriving in a business environment, they were probably the ones that could make a difference in their job and for that reason climbed the career ladder fast to reach a C-level position. Then, arriving at that level, they become too busy to dive really deep into the details.
Everyone around them communicates in “elevator speeches,” and information to read must be extremely condensed and easy to understand. As if people at C-level have no brains and should be informed like small kids.
I have seen groups of people working for weeks on preparing the messages for the CEO. Every word is twisted a hundred times – would he or she understand it? I believe the best people at C-level have brains, and they would understand the importance of PLM when someone explains it. However, it requires time if it does not come from within your comfort zone.
Who explains the strategic value of PLM?
There are a lot of strategic advisory companies that have access to the board room, and we can divide them into two groups: the ones that focus on strategy independent of any particular solution and the ones that concentrate on a strategy while guaranteeing their implementation teams are ready to deploy the solution. Let´s analyze both options and their advice:
Independent of a particular solution
When a company is looking for help from a strategic consultancy firm, you know part of the answer upfront, as every consultancy firm has a preferred sweet spot based on its principal consultant(s). As a PLM consultant, I would probably imagine the best PLM approach for your company, not being an expert in financials or demographic trends. If the advisory company has a background in accountancy, they will focus their advice on financials. If the company has a background in IT, they will focus their advice on an infrastructure concept that saves so much money.
A modern digital enterprise is now the trend, where digital allows the company to connect and interact with the customer and therefore react faster to market needs or opportunities. IoT is one of the big buzzwords here. Some companies grasp the concept of being customer-centric (the future) and adapt their delivery model to it, not realizing that the entire organization, including the product definition process, should change too. You cannot push products to the market in the old linear way while meanwhile expecting modern agile work processes.
Most of the independent strategic consultants will not push for a broader scope as it is out of their comfort zone. Think for a moment. Who are the best strategic advisors that can talk about the product definition process, the delivery process and products in operation and service? I would be happy if you give me their names in the comments with proof points.
Related to a particular solution
When you connect with a strategic advisory company with an extensive practice in XXX or YYY, you can be sure the result will be strategic advice containing XXX or YYY. The best approach with ZZZ will not come to the table, as consultancy firms have no intention to investigate in that direction for your company. They will tell you: “With XXX we have successfully transformed (many) other companies like yours, so choose this path with the lowest risk.”
And this is the part that concerns me the most at this time. Business is changing rapidly, and therefore PLM should be changing too. If not, that would be a strange situation. Read about the PLM identity crisis here and here.
The solution is at C-level (conclusion)
I believe that in the end the future of your company will depend on your DNA, your CEO and the C-level supporting the CEO. Consultancy firms can only share their opinion from their point of view and with their understanding in mind.
If you have a risk-averse management, you might be at risk.
Doing nothing or following the majority will not bring more competitive advantage.
The awareness that business is global and changing rapidly should be on every company’s agenda.
Change is always an opportunity to get better; still, no outsider can tell you what is best. Take control and leadership. For me, it is clear that the product development and delivery process should be a part of this strategy. Call it PLM or something different; I do not care. But do not focus on efficiency and ROI – focus on being able to be different from the majority. Apple makes mobile phones; Nespresso makes coffee, etc.
Think, and use extremely high elevators to talk with your C-level!
In 1999, I started my company TacIT in order to focus on knowledge management. The name TacIT came from the term tacit knowledge: the knowledge an expert has, combining knowledge from different domains and making the right decision based on his or her experience/intuition. Tacit knowledge is the opposite of explicit knowledge, which you can define in rules. In particular, large companies are always looking for ways to capture and share knowledge to raise the tacit knowledge of their employees.
When I analyzed knowledge management in 1999, many businesses thought it was just about installing an intranet. At that time, it became fashionable to have an internal website where people published their knowledge. Wikipedia had not yet launched. Some people got excited about the intranet capabilities; however, a lot of information remained locked or hidden. What was clear to me at that time was that knowledge management as a bottom-up approach would not work in an organization, for the following reasons:
- In 1999 knowledge was power, so voluntarily sharing your knowledge was considered more or less reducing your job security. Others might become as skilled as you. A friend of mine was trying to capture knowledge from experts in his domain, and only people close to retirement were willing to speak with him. Has this attitude changed meanwhile?
- It takes time to share your knowledge, and in particular for busy experts this is a burden. They want (or need) to go on to the next job and not spend “useless” time describing what they have learned.
My focus on knowledge management disappeared in 2000 as I got dragged into PLM with the excuse in mind that PLM should be a kind of knowledge management too.
No knowledge management in PLM
In theory, the picture representing PLM is a circle, where through iterations organizations learn to improve their products and better understand the way their products are perceived and performing in the market. However, the reality was that PLM was used as an infrastructure to transfer and share information mainly within engineering disciplines. Each department had its own tools and demands. Most companies have silos for PDM, ERP and services, and people have no clue about which information exists within the organization. Most of the time, they only know their own system and, even worse, they are the only ones who know where their data is stored (or hidden, if you ask their colleagues).
When PLM became more and more accepted as the backbone for product information in a company, there was more attention for a structured manner of knowledge management in the context of lessons learned. Quality systems like ISO 900x provide guidance for processes of quality improvement. Various industries have their own quality methodologies – APQP, 8D, CAPA – all to ensure quality gets improved in a learning organization. 8D and CAPA are examples of issue management, which is a must-do for every PLM implementation. It is the first step in sharing and discovering commonalities and trends related to your product, your processes and your customers. When issues are solved by email and phone calls, the content and lessons learned often remain hidden from the rest of the organization.
Still, storing all information in one PLM system is not what I would call knowledge management. After all, my garbage bin (I had a huge one) contains all my written notes and thoughts. Would anyone be able to work with my environment? No!
Knowledge Management is an attitude
When organizations really care about knowledge, it should be a top-down guided process. And knowledge is more than storing data in a static manner in a central place. Let´s have a look at how modern knowledge management could work:
In a PLM system you will find mainly structured information, i.e., Bills of Materials containing parts, documents/CAD models/drawings describing products, catalogs with standard parts, suppliers and, in a modern environment, perhaps even the issues related to these information objects and all the change processes that have been performed on parts, products or documents.
This information only becomes valuable if companies spend time upfront on planning and creating the context of the information. This means attributes are important, and so is maintaining relationships between different types of information. This is the value a PLM system can bring beyond a standard document management system or a parts database. Information in the right context brings much more value.
For example, a "where used" of a part not only in the context of a BOM, but also in the context of suppliers, all issues, all ECRs/ECOs, and the projects or customers where it is implemented. It could be any relation, starting from any relevant information object.
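As a sketch of what such a context-aware lookup could look like, here is a minimal example. The data model, IDs, and context names are invented for illustration; a real PLM system would resolve these relations through its own query API.

```python
# Minimal sketch of a cross-context "where used" lookup.
# All part IDs, contexts, and objects below are invented examples.
from collections import defaultdict

# Each tuple links a part to an information object in a given context
relations = [
    ("PART-1001", "BOM", "ASSY-2001"),
    ("PART-1001", "Supplier", "SUP-ACME"),
    ("PART-1001", "ECO", "ECO-0042"),
    ("PART-1002", "BOM", "ASSY-2001"),
]

def where_used(part_id, relations):
    """Group every object that references the part by its context."""
    result = defaultdict(list)
    for part, context, obj in relations:
        if part == part_id:
            result[context].append(obj)
    return dict(result)

print(where_used("PART-1001", relations))
# {'BOM': ['ASSY-2001'], 'Supplier': ['SUP-ACME'], 'ECO': ['ECO-0042']}
```

The point is not the code but the data model: once relations are stored explicitly, "where used" can start from any object and traverse into any context.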
“Do your job as fast as possible and do only what is necessary to deliver now” is often the message from a short-sighted manager who believes that spending time on the “NOW” is more important than spending time on the “FUTURE.”
Managing information to become valuable in the future is an investment that needs to be done in the world of structured data. Once done, a company will discover that this investment has improved the overall performance of the company, as time spent searching will reduce (from 20 % or more down to 5 % or less) and people are enabled to reuse instead of reinventing things or, worse, re-experiencing issues.
There is more structured information out there.
Of course, companies cannot wait a few years until structured information becomes usable. Most of the time there is already a lot of information in the various systems the company is using. Emails, the ERP system, the PDM system and file directories may already contain relevant information. Here modern search-based applications like Exalead or Conweaver (for sure there are more apps in the market; these are the two I am familiar with) will help to collect and connect information in context coming from various systems. This allows users to see information across disciplines and across the lifecycle of a product.
Still, these capabilities are not really knowledge management, as they do not increase the tacit knowledge of a company.
How to collect tacit knowledge?
Static information collection does not contribute to tacit knowledge; it provides some visibility into what exists and might help with explicit knowledge. Tacit knowledge can only be collected through an active process. People in an organization need to be motivated and stimulated to share their story, which is more than just sharing information. It is the reasoning why certain decisions were taken that helps others to learn. Innovation and learning come from associating information from different domains and creating the opportunity and excitement to share stories. This is what modern companies like Google and Apple do, and it is somehow the same way information is shared at the coffee machine. This is the primary challenge. Instead of an opportunistic approach to knowledge sharing, you want a reliable process of knowledge sharing. The process of capturing and sharing tacit knowledge could be improved by assigning knowledge agents in a company.
A knowledge agent has the responsibility to capture and translate lessons learned. For that reason, a knowledge agent should be somebody who can capitalize information and store and publish it in a manner in which it can be found back in various contexts. The advantage of such a process is that knowledge is obtained in a structured manner. In the modern world, a knowledge agent could be a community owner/moderator actively sharing and publishing information. Strangely, knowledge agents are often considered overhead, as their immediate value is not directly visible (as with many PLM activities), although the job of a knowledge agent does not need to be a full-time job.
I found a helpful link related to the knowledge management agent here: 7 knowledge management tips. The information is not in the context of product development. However, it is generic enough to consider.
Many companies talk about PLM and knowledge management as if they were equivalent. It should be clear that they are different, although partly overlapping in purpose. It is important to understand that both PLM knowledge and general knowledge management will only happen with a top-down strategy and motivation for the organization, either by assigning individual people to become knowledge agents or by having common processes for all to follow.
I am curious to learn:
- Is knowledge management on your company's agenda?
and if Yes
- How is knowledge management implemented?
Last week I attended the PI conference in Munich, which has become a tradition since 2011. Personally, I have been busy moving houses, so blogging has not been my priority recently. However, the PI Conference for me is still one of the major events happening in Europe: excellent for networking and good for understanding what is going on in the world of PLM. Approximately 200 delegates from various industries attended; therefore, the two days were good to find and meet the right people.
As the conference has many parallel sessions, I will give some of the highlights here. The beauty of this conference is that all sessions are recorded. I am looking forward to catching up on the other sessions in the upcoming weeks. Here are some highlights of the sessions that I attended.
Some of the highlights
The first keynote session was from Mark Gallagher with the title High-Performance Innovation in Formula One. Mark took us through the past and current innovations that have taken place in F1. I was involved some years ago in a PLM discussion with one of the F1 teams.
I believe F1 is a dream for engineers and innovators. Instead of a long time to market, in F1 it is all about bringing innovation to the team as fast as possible. And it is interesting to see that IoT, direct feedback from the car during the race, is already a "commodity" in F1 (see the picture). Now we need to industrialize it.
Peter Bilello (CIMdata) took us through The Future Sustainability of PLM. One of the big challenges for PLM implementations is to make them sustainable. Currently, we see many PLM implementations reaching a state of obsolescence, no longer able to support modern business for various reasons.
Change of owner, mergers, a different type of product, the importance of software. All of these reasons can become a significant challenge when your PLM implementation has been tuned to support the past.
How to be ready for the future? Peter concluded that companies need to proactively manage their systems, and PLM platforms might give an answer for the future. However, these platforms need to be open and rely on standards, to avoid locking data into the platform.
Final comment: To stay competitive in the future companies need to have an adequate strategy and vision.
Gary Knight, PLM Business Architecture Manager at Jaguar Land Rover, gave an impressive presentation about the complete approach JLR has executed. Yes, there is the technical solution. However, the required cultural change and business change to align the vision with execution on the floor are just as important: making people enthusiastic and having them take part in realizing the future.
The traditional productivity dip during a business transformation has been well supported by intensive change management support, allowing the company to keep the performance level equal without putting its employees under big pressure. Many companies I have seen could learn from that.
PLM and ERP
In the afternoon, I moderated a focus group related to PLM and ERP integration challenges. An old-fashioned topic, you might think. However, the room was full of people (too many), all hoping to find the answers they need. Some conclusions:
- Understanding the difference between owning data and sharing data. Where sharing still requires certain roles to be responsible for particular data sets.
- First define the desired process for how information should flow between roles in the organization, without thinking in tools. Once a common agreement exists, the technical realization will not be the bottleneck.
- PLM and ERP integrations vary per primary process (ETO, BTO, CTO, and MTS). In each of these processes the interaction between PLM and ERP will be different due to timing or the delivery model.
Irene Gustafson from Volvo Cars explained the integration concept with partners/suppliers based on Eurostep's ShareAspace. I wrote about this concept in my blog post The weekend after PDT2015. Meanwhile, the concept of a collaboration hub instead of a direct integration between an OEM and its suppliers has gained more traction.
Irene Gustafson made some interesting closing statements:
- Integration should not be built into the internal structure, it takes away flexibility
- A large portion of collaborative data is important here and now. Long term only a limited part of that data will need to be saved
Eurostep announced their new upcoming releases based on different collaboration scenarios: InReach, InControl and InLife. These packages allow a faster and more OOTB deployment of their collaboration hub (based on the PLCS standard).
Digital Transformation at Philips and GE
Anosh Thakkar, Chief Technology Officer at Philips, explained their digital business transformation from pushing products to the markets towards a HealthTech company, leaving the lighting division behind. Philips used three "transformers" to guide the business change:
- From Complex to Simple, aligning businesses to 4 simplified business models (instead of 75) and one process framework supported by core IT platforms, reducing customizations and the number of applications (from 8000 to 1000)
- From Analog to Digital, connecting customer devices through a robust cloud-based platform. A typical example of modern digital businesses
- From Products to Solutions, again with a focus on how the end-user could work in an ideal way, instead of just delivering a device (the Experience economy)
Ronan Stephan, chief scientist of GE, presented the digital business transformation GE is working on. Ronan took us through the transformation models of Amazon, Apple, and Google, explaining how their platforms and the insight coming from platform information have allowed these companies to be extremely successful. GE is aiming to be the leader in the digital industry, connecting their company with all their customers (aerospace, transportation, power & healthcare) on their Predix platform.
On the second day, I presented to a relatively small audience (5 parallel sessions, all interesting) a session with the title The PLM Identity Crisis. Luckily there were still people at the conference who have the feeling something is changing in PLM. My main message was that PLM, like everything else in the current world, suffers from rapidly changing business models (hardware products towards software-driven systems) and a lack of time to distinguish between facts and opinions. The world of one-liners. In my opinion, existing PLM concepts are no longer enough; however, the PLM market is still mainly based on classical linear thinking, as my generation (the baby boomers) is still leading the business. Have a look at the presentation here, or find a nice complementary post from my blog buddy Oleg Shilovitsky here.
As I am in the middle of moving houses, now in no man’s land, I do not have the time and comfortable environment to write a more extensive review this time. Perhaps I will come back with some other interesting thoughts from this conference after having seen more recordings.
My observation after the conference:
A year ago I wrote The Weekend After Product Innovation 2015 in Düsseldorf, where managing software in the context of PLM was the new topic. This year you could see the fast change, as now IoT platforms and M2M communication were the main theme. The digital revolution is coming…
Some weeks ago I wrote a post about non-intelligent part numbers (here) and this was (as expected) one of the topics that fired up other people to react. Thanks to Oleg Shilovitsky (here), Ed Lopategui (here) and David Taber (here) for your contributions to this debate. For me, the interesting conclusion was that nobody denies the advantages of non-intelligent part numbers anymore. Five to ten years ago this discussion would have been more a debate between defenders of the old "intelligent" methodology and non-intelligent numbers. Now it was more about how to deal with, wait for, or anticipate the future. Great progress!
Non-intelligent part number benefits
Again a short summary for those who have not read the posts referenced in the introduction. Non-intelligent part numbers provide the following advantages:
- Flexibility towards the future in case of mergers, new products, and technologies with number ranges not foreseen. Reduced risk of changes and maintenance for part numbers in the future.
- Reduced reliance on "brain-related connectivity" between systems (error-prone) and better support for automated connectivity (interfaces/digital scanning devices), minimizing mistakes and learning time.
So when a company decides to move forward towards non-intelligent part numbers, there are still some more actions to take. As the part number becomes irrelevant for human beings, there is the need for more human-readable properties provided as metadata on screens or attributes in a report.
CLASSIFICATION: The first obvious need is to apply a part classification to your parts. Intelligent part numbers often were a kind of classification based on the codes and positions of numbers and characters inside the intelligent ID. The intelligent part number might contain information about the type of part, perhaps the drawing format, the project, or the year it was issued the first time. You do not want to lose this information; therefore, make sure it is captured in attributes (e.g. part type/creation date) or in related information (e.g. drawing properties, model properties, customer, project). In a modern PLM system, all the intelligence of a part number needs to be stored at least as metadata and relations.
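A minimal sketch of this shift, assuming an invented part record: the intelligence that used to sit in positions of the number becomes explicit, queryable attributes. All field names and values below are illustrative, not any particular PLM system's schema.

```python
# Hypothetical part record: the ID is meaningless and unique,
# while the former "intelligence" lives in explicit metadata.
part = {
    "id": "100000123",          # meaningless, unique identifier
    "part_type": "bearing",     # was encoded in the number
    "standard": "33K3",         # was encoded in the number
    "created": "1995-03-14",    # was the "95" in the number
    "drawing_format": "A4",     # was the "A4" prefix
    "revision": "A",
}

def matches(part, **criteria):
    """Check a part against attribute criteria instead of decoding its ID."""
    return all(part.get(k) == v for k, v in criteria.items())

print(matches(part, part_type="bearing", drawing_format="A4"))  # True
print(matches(part, part_type="gear"))                          # False
```

Searching on attributes like this is what replaces the mental decoding of an intelligent number, and it works identically for legacy and new parts once the metadata is filled.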
Which classification to use is hard to tell. It depends on your industry and the product you are making. Each industry has its standards, which are probably the best target when you work in that industry. Classifications like UNSPSC might be too generic. In any case, when you classify, do not invent a new classification yourself. People have spent thousands of hours (millions perhaps) on building the best classification for your industry – don't try to be smarter unless you are a clever startup.
And next, do not rely on a single classification. Make sure your parts can adhere to multiple classifications as this is the best way to stay flexible for the future. Multiple classifications can offer support for a marketing view, a technology view (design and IP usage), a manufacturing view and so on.
Legacy parts can be classified by using analytic tools and custom data manipulations to complete the part metadata in the future environment. There are standard tools in the market to support data discovery and quality improvement. Part similarity discovery can be done by Exalead's OnePart; for more specific tools, read Dick Bourke's article on Engineering.com.
DOWNSTREAM USAGE: As Mathias Högberg commented on my post, the challenge of non-intelligent part numbers has its impact downstream on the shop floor. Production line scheduling for variants or production process steps for semi-finished products often depends on the intelligence of the part number. When moving to non-intelligent numbers, these capabilities have to be addressed too, either by additional attributes immediately identifying product families or by adding a more standardized description based on the initial attributes of the classification. David Taber in his post also talked about two identifiers: one meaningless and fixed, and a second one used for the outside world, which could be built by a concatenation of attributes and can change during the part lifecycle.
In the latter case, you might say, we remove intelligence from the part number and we bring intelligence back in the description. This is correct. Still, human beings are better at mapping a description in their mind than a number.
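The two-identifier idea can be sketched as follows: a fixed, meaningless ID for systems, and a human-readable description derived from classification attributes for people. The attribute names and separator are invented for illustration; a real implementation would follow the company's own description standard.

```python
# Sketch of a derived description: the fixed ID never changes,
# while the readable description is regenerated from attributes.
def derived_description(part):
    """Concatenate selected classification attributes into a description."""
    fields = ("part_type", "standard", "material")
    return " / ".join(str(part[f]) for f in fields if f in part)

part = {
    "id": "100000123",          # meaningless, fixed identifier
    "part_type": "Bearing",
    "standard": "33K3",
    "material": "Steel",
}

print(derived_description(part))  # Bearing / 33K3 / Steel
```

Because the description is derived, it can change when an attribute changes during the part lifecycle, while the underlying ID, and every link that depends on it, stays stable.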
Do you know Jos Voskuil (a.k.a. virtualdutchman) or
Do you know NL 13.012.789 / 56 ?
Quality of data
Moving from "intelligent" part numbers towards meaningless part numbers enriched with a classification and a standardized description allows companies to gain significant benefits from part reuse alone. This is what current enterprises are targeting. Discovering and eliminating similar parts already justifies this process. I consider this a tactical advantage. The real strategic advantage will come in the next ten years, when we move more and more towards a digital enterprise. In a digital enterprise, algorithms will play a significant role (see Gartner), reducing the amount of human interpretation and delays. However, algorithms only work on data with certain properties and a reliable quality.
Introducing non-intelligent part numbers has its benefits and ROI to stay flexible for the future. However, consider it also as a strategic step for the long-term future, when information needs to flow in an integrated way through the enterprise with a minimum of human handling.
Happy New Year to all of you; I am wishing you all an understandable and digital future. This year I hope to entertain you again with a mix of future trends related to PLM combined with old PLM basics. This time, I address one of the topics that pops up in almost every PLM implementation: numbering schemes. Do we use numbers with a meaning, so-called intelligent numbers, or can we work with insignificant numbers? And of course, what is the impact of changing from meaningful numbers towards unique meaningless numbers?
Why did we create “intelligent” numbers?
Intelligent part numbers were used to help engineers and people on the shop floor for two different reasons. In the early days, the majority of design work was mechanical design. Often companies had a one-to-one relation between the part and the drawing. This implied that the part number was identical to the drawing number. An intelligent part number could have the following format: A4-95-BE33K3-007.A
Of course, I invented this part number, as the format of an intelligent part number is only known to local experts. In my case, I was thinking about a part that was created in 1995, drawn on A4. Probably a bearing of the 33K3 standard (another intelligent code) and its index is 007 (checked in a numbering book). The version of the drawing (part) is A.
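To make the hidden intelligence explicit, here is a sketch that decodes this invented example number. The field layout is the fictitious format from above; real schemes differ per company and per era, which is exactly why this knowledge stays locked in the heads of local experts.

```python
# Decoding the invented intelligent number A4-95-BE33K3-007.A.
# The pattern only works for this one fictitious scheme.
import re

PATTERN = re.compile(
    r"^(?P<sheet>A\d)-(?P<year>\d{2})-(?P<code>\w+)-(?P<index>\d{3})\.(?P<rev>[A-Z])$"
)

def decode(number):
    """Split an intelligent number into its encoded fields."""
    m = PATTERN.match(number)
    if not m:
        raise ValueError(f"Unknown numbering scheme: {number}")
    return m.groupdict()

print(decode("A4-95-BE33K3-007.A"))
# {'sheet': 'A4', 'year': '95', 'code': 'BE33K3', 'index': '007', 'rev': 'A'}
```

Note that any merger, new drawing format, or new product type that does not fit the pattern makes the decoder, and the human convention behind it, break down; this is the fragility the rest of the post argues against.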
A person who is working in production, assembling the product and reading the BOM, immediately knows which part to use by its number and drawing. Of course the word "immediately" is only valid for people who have experience using this part. And in the previous century this was not as painful as it is now. Products were not as sophisticated as they are now, and variation in products was limited.
Later, when information became digital, intelligent numbers were also used by engineering to classify their parts. The classification digits would assist the engineer to find similar parts in a drawing directory or drawing list.
And if the world had not changed, there would be still intelligent part numbers.
Why no more intelligent part numbers?
There are several reasons why you would not use intelligent part numbers anymore.
- An intelligent numbering scheme works in a perfect world where nothing is changing. In real life, companies merge with other companies, and then the question comes up: do we introduce a new numbering scheme, or is one of the existing schemes going to be the perfect scheme for the future? If this has happened a few times, a company might think: do we have to go through this again and again? As probably topic #2 has also occurred.
- The numbering scheme does not support current products and complexity anymore. Products change from mechanical towards systems, containing electronic components and embedded software. The original numbering system never catered for that. Is there an overarching numbering standard? It is getting complicated; perhaps we can change? And here #3 comes in.
- As we are now able to store information in a digital manner, we are able to link to this complex part number a few descriptive attributes that help us to identify the component. Here the number becomes less important, still serving as access to the unique metadata. Consider it like a bar code on a product. Nobody reads the bar code without a device anymore, and the device connected to an information system will provide the right information. This brings us to the last point, #4.
- In a digital enterprise, where data is flowing between systems, we need unique identifiers to connect datasets between systems. The most obvious example is the part master data. Related to a unique ID you will find in the PDM or PLM system the attributes relevant for overall identification (Description, Revision, Status, Classification) and further attributes relevant for engineering (weight, material, volume, dimensions).
In the ERP system, you will find a dataset with the same ID and master attributes. However, here they are extended with attributes related to logistics and finance. The unique identifier provides the guarantee that data is connected in the correct manner and that information can flow or be connected between systems without human interpretation or human processing time.
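This PLM-ERP connection can be sketched as a simple join on the shared unique ID. The two record stores, the attributes, and their values below are all invented for illustration; in practice this is what an integration or platform layer does behind the scenes.

```python
# Minimal sketch: two systems each hold a dataset keyed by the same
# meaningless unique ID (all records below are invented examples).
plm = {  # engineering view of the part
    "100000123": {"description": "Bearing 33K3", "revision": "B",
                  "weight_kg": 0.4},
}
erp = {  # logistics/finance view of the same part
    "100000123": {"unit_cost": 12.50, "lead_time_days": 14},
}

def merged_view(part_id):
    """Join the engineering and logistics datasets on the shared ID."""
    record = {"id": part_id}
    record.update(plm.get(part_id, {}))
    record.update(erp.get(part_id, {}))
    return record

view = merged_view("100000123")
print(view["revision"], view["lead_time_days"])  # B 14
```

With an intelligent, human-maintained number, this join would depend on people keeping two encodings in sync; with one meaningless ID, the systems connect without interpretation.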
What to do now in your company?
There is no business justification to start renumbering parts just for future purposes. You need a business reason; otherwise, it will only increase costs and create a potential for migration errors. Moving to meaningless part numbers is best done at the moment a change is required, for example when you implement a new PLM system or when your company merges with another company. At these moments, part numbering should be considered with the future in mind.
And the future is no longer about memorizing part classifications and numbers, even if you are from the generation that used to structure and manage everything inside your brain. Future businesses rely on digitally connected information, where a person, based on machine interpretation of a unique ID, will get the relevant and meaningful data. Augmented reality (picture above) is becoming more and more available. It is now about human beings getting ready for a modern future.
Intelligent part numbers are a best practice from the previous century. Start to think digital and connected, and try to reduce the dependency on understanding the part number in all your business activities. Move towards providing the relevant data for a user. This can be an evolution smoothing a future PLM implementation step.
Looking forward to discussing this topic and many other PLM-related practices with you face to face during the Product Innovation conference in Munich. I will talk about the PLM identity change and lead a focus group session about PLM and ERP integration. Looking from the high level and working in the real world: the challenge of every PLM implementation.
- It does not make sense to define the future of PLM
- PLM is not an engineering solution anymore
- Linearity of business is faster becoming a holdback
- The Product in PLM is no longer a mechanical Product
- Planet Lifecycle Management has made a next major step
It does not make sense to define the future of PLM
At the beginning of this year, there was an initiative to define the future of PLM for 2025, to give companies, vendors, and implementors guidance on what is critical and needed for PLM in 2025. Have a read here: The future of PLM resides in Brussels.
I believe it is already hard to agree on what the recognized scope of PLM has been in the past 10 years; how can we then define the future of PLM for the next 10 years? There are several trends currently happening (see the top 5 above) that can all be either in or out of scope for PLM. It is no longer about the definition of PLM; it is about dynamically looking at how businesses adapt their product strategy to new approaches.
Therefore, I am more curious how Product Innovation platforms or Business Innovation platforms will evolve, instead of focusing on a definition of what PLM should be in 2025. Have a further look here, for example at The Next Step in PLM's Evolution: Its Platformization, a CIMdata positioning paper.
Conclusion: The future is bright and challenging, let´s not fence it in by definitions.
PLM is not an engineering solution anymore
In all the discussions I had this year with companies looking into PLM, more and more of them now see PLM as a product information backbone throughout the lifecycle, providing a closed loop of information flow and visibility across all disciplines.
End-to-end visibility, end-to-end traceability and real-time visibility were some of the buzzwords dropped in many meetings.
These words really express the change happening. PLM is no longer an engineering front-end towards ERP; PLM interacts at each stage of the product lifecycle with other enterprise systems.
End-to-end means that when products are manufactured, we still follow them through the manufacturing process (serialization) and their behavior in the field (service lifecycle management/field analytics).
All these concepts require companies to align in a horizontal manner, instead of investing in optimizing their silos. Platformization, as discussed above, is a logical step for extending PLM.
Conclusion: If you implement PLM now, start thinking first about the end-to-end flow of information. Or, to be more concrete: don't be tempted to start with engineering first. It will lock your new PLM again into an extended PDM silo.
Linearity of business is faster becoming a holdback
Two years ago I started talking about Did you notice PLM is changing? This topic was not in the mainstream of PLM discussions two years ago. Now, with the introduction of more and more software in products (products become systems), the linear process of bringing a product to the market has become a holdback.
The market / your customers expect faster and incremental innovations/upgrades, preferably without having to invest again in a new product. If you look back, the linear product development approach has not changed since the Second World War; we only automated the linear process more and more. Remember the New Product Introduction hype around 2004-2006, when companies started to extend the engineering process with a governance process to follow a product's introduction from its early concept phase towards a globally available product. This process is totally linear. I wrote about it in my post From a linear world to fast and circular, where the word circular also addresses the change of delivering products as a service instead of delivering them once and scrapping them.
One of my favorite presentations is from Chris Armbruster: Rethinking Business for Exponential Times – enjoy it if you haven't seen this one.
Conclusion: In the past two years the discussion related to modern, data-driven dynamic products and services has increased rapidly. Now with IoT it has become a hype, to be formalized soon, as life goes faster and faster.
The Product in Product Lifecycle Management is no longer a mechanical Product
When I started to implement PDM systems in the nineties, we tried to keep electrical engineering outside the scope as we had no clue how to manage their information in the context of a mechanical design. It was very rudimentary. Now PLM best practices exist to collaborate and synchronize around the EBOM in an integrated manner.
The upcoming challenge now is due to the software used in products, which turns them into systems. And not only that: software can be upgraded in a minute, so the classical ECR/ECO processes designed for hardware create too much overhead. Agile is the motto for software development processes. Now we (PLM consultants/vendors) are all working on concepts and implementations where these worlds come together. PLM (Product Lifecycle Management), ALM (Application Lifecycle Management) and SysLM (System Lifecycle Management, as introduced by Prof. Martin Eigner – have a read here) are all abbreviations representing particular domains that need to flow together.
Conclusion: For most companies, their products become systems with electronics and software. This requires new management and governance concepts, the challenge for all vendors and implementors.
Planet Lifecycle Management has made a next major step
Finally, good news came at the beginning of December, when for the first time all countries agreed that our planet needs to have a sustainable lifecycle. Instead of the classical lifecycle from cradle to grave, we want to apply a sustainable lifecycle to this planet, while it is still possible. This decision is a major breakthrough, pushing us all to leave the unsustainable past behind and to innovate and work on the future. The decisions taken in Paris should be considered a call for innovative thinking. PLM can learn from that, as I wrote earlier this year in my post PLM and Global Warming.
Conclusion: 2015 was a year in which some new trends became clear. Trends will become commodities faster and faster. A challenge for all of us to stay connected and understand what is happening. Never before has the human brain been challenged to adapt to change with such an impact.
Closing 2015 means for me a week of quietness and stepping out of the fast lane. I wish you all a healthy 2016 with a lot of respect, compromises and changing viewpoints. The current world is too complex to solve issues by one-liners.
Take your time to think and reflect – it works!
SEE AND HEAR YOU BACK IN 2016
Topics discussed in 2014-2015
- The importance of a PLM data model: EBOM – MBOM
- EBOM and (CAD) Documents
- Some more EBOM methodology
- Products, BOMs, and Parts
- The Importance of a PLM data model
PLM and Business Change
- Modern PLM brings Power to the People
- The Innovator’s dilemma and Generation change
- The importance of change management with PLM
- PLM and Global warming
- Breaking down the silos with data
- From a linear world to fast and circular?