
Some weeks ago I wrote a post about non-intelligent part numbers (here), and this was (as expected) one of the topics that fired up other people to react. Thanks to Oleg Shilovitsky (here), Ed Lopategui (here) and David Taber (here) for your contributions to this debate. For me, the interesting conclusion was that nobody denies the advantages of non-intelligent part numbers anymore. Five to ten years ago, this discussion would have been more of a debate between defenders of the old "intelligent" methodology and non-intelligent numbers. Now it was more about how to deal with, wait for or anticipate the future. Great progress!

 

Non-intelligent part number benefits

Again a short summary for those who have not read the posts referenced in the introduction. Non-intelligent part numbers provide the following advantages:

  • Flexibility towards the future in case of mergers, new products, and technologies that existing number ranges never foresaw. Reduced risk of changes and maintenance of part numbers in the future.
  • Less reliance on error-prone "brain-based connectivity" between systems and better support for automated connectivity (interfaces / digital scanning devices), minimizing mistakes and learning time.

 

What’s next?

So when a company decides to move forward towards non-intelligent part numbers, there are still some more actions to take. As the part number becomes meaningless for human beings, there is a need for more human-readable properties, provided as metadata on screens or as attributes in a report.

CLASSIFICATION: The first obvious need is to apply a part classification to your parts. Intelligent part numbers often acted as a kind of classification, based on the codes and positions of numbers and characters inside the intelligent ID. The intelligent part number could contain information about the type of part, perhaps the drawing format, the project or the year it was first issued. You do not want to lose this information; therefore, make sure it is captured in attributes (e.g. part type / creation date) or in related information (e.g. drawing properties, model properties, customer, project). In a modern PLM system, all the intelligence of a part number needs to be stored at least as metadata and relations, as illustrated in the sketch below.
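As an illustration, here is a minimal sketch in Python of what such a part record could look like. The class and attribute names are my own assumptions, not the data model of any specific PLM system:

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class Part:
    """The ID carries no meaning; all former 'intelligent' number
    segments live on as explicit, searchable attributes."""
    part_id: str = field(default_factory=lambda: uuid.uuid4().hex)  # fixed, meaningless
    part_type: str = ""        # e.g. "bearing 33K3" - once encoded as "BE33K3"
    drawing_format: str = ""   # e.g. "A4" - once the leading segment
    first_issued: str = ""     # e.g. "1995" - once the year segment
    revision: str = "A"        # lifecycle information, not identity

part = Part(part_type="bearing 33K3", drawing_format="A4", first_issued="1995")
print(part.part_id)  # opaque ID; humans read the attributes instead
```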

Which classification to use is hard to tell. It depends on your industry and the product you are making. Each industry has its standards, which are probably the best target when you work in that industry. Classifications like UNSPSC might be too generic. Whatever you choose, do not invent a new classification yourself. People have spent thousands of hours (millions perhaps) on building the best classification for your industry – don't be smarter unless you are a clever startup.

And next, do not rely on a single classification. Make sure your parts can adhere to multiple classifications as this is the best way to stay flexible for the future. Multiple classifications can offer support for a marketing view, a technology view (design and IP usage), a manufacturing view and so on.

Legacy parts should be classified by using analytic tools and custom data manipulations to complete the part metadata in the future environment. There are standard tools in the market to support data discovery and quality improvement. Part similarity discovery can be done with Exalead's OnePart; for more specific tools, read Dick Bourke's article on Engineering.com.

DOWNSTREAM USAGE: As Mathias Högberg commented on my post, the challenge of non-intelligent part numbers has its impact downstream on the shop floor. Production line scheduling for variants or production process steps for semi-finished products often depends on the intelligence of the part number. When moving to non-intelligent numbers, these capabilities have to be addressed too, either by additional attributes immediately identifying product families, or by adding a more standardized description based on the initial attributes of the classification. David Taber, in his post, also talked about two identifiers: one meaningless and fixed, and a second one used for the outside world, which could be built by a concatenation of attributes and can change during the part lifecycle (see the sketch after the example below).

In the latter case, you might say, we remove intelligence from the part number and we bring intelligence back in the description. This is correct. Still, human beings are better at mapping a description in their mind than a number.

Do you know Jos Voskuil (a.k.a. virtualdutchman) or
Do you know NL 13.012.789 / 56 ?
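Reusing the illustrative Part class from the sketch above, such a second, outside-world identifier could be as simple as a concatenation of attributes. Again, this is my own assumption of how it could work, not David Taber's actual design:

```python
def display_id(part: Part) -> str:
    """Hypothetical human-facing identifier: derived from attributes,
    so it may change over the lifecycle while part_id stays fixed."""
    return f"{part.part_type}-{part.drawing_format}-{part.revision}".upper().replace(" ", "")

p = Part(part_type="bearing 33K3", drawing_format="A4")
print(p.part_id)       # fixed and meaningless, for systems
print(display_id(p))   # BEARING33K3-A4-A, for people
```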

 

Quality of data

Moving from "intelligent" part numbers towards meaningless part numbers, enriched with classification and a standardized description, allows companies to gain significant benefits from part reuse alone. This is what current enterprises are targeting. Discovering and eliminating similar parts already justifies this process. I consider this a tactical advantage. The real strategic advantage will come in the next ten years, when we move more and more towards a digital enterprise. In a digital enterprise, algorithms will play a significant role (see Gartner), reducing the amount of human interpretation and delay. However, algorithms only work on data with well-defined properties and a reliable quality.

Conclusion

Introducing non-intelligent part numbers has its benefits and ROI in staying flexible for the future. However, consider it also a strategic step for the long-term future, when information needs to flow through the enterprise in an integrated way with a minimum of human handling.

 

Happy New Year to all of you, and I wish you all an understandable and digital future. This year I hope to entertain you again with a mix of future trends related to PLM combined with old PLM basics. This time, one of the topics that pops up in almost every PLM implementation – numbering schemes: do we use numbers with a meaning, so-called intelligent numbers, or can we work with insignificant numbers? And of course, the question of what the impact is of changing from meaningful numbers towards unique, meaningless numbers.

Why did we create “intelligent” numbers?

Intelligent part numbers were used to help engineers and people on the shop floor for two different reasons. In the early days, the majority of design work was based on mechanical design. Often companies had a one-to-one relation between the part and the drawing. This implied that the part number was identical to the drawing number. An intelligent part number could have the following format: A4-95-BE33K3-007.A

Of course, I invented this part number, as the format of an intelligent part number is only known to local experts. In my case, I was thinking about a part that was created in 1995, drawn on A4, probably a bearing of the 33K3 standard (another intelligent code), with index 007 (checked in a numbering book). The version of the drawing (part) is A.
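A minimal sketch, using my invented format above, of how such a number would be decoded in software. It also shows why these schemes are brittle – the two-digit year alone breaks the logic after 1999:

```python
def decode(number: str) -> dict:
    """Decode the invented format A4-95-BE33K3-007.A (an assumption,
    not a real standard) back into its hidden meaning."""
    body, revision = number.rsplit(".", 1)
    fmt, year, type_code, index = body.split("-")
    return {"drawing_format": fmt,
            "year": f"19{year}",   # brittle: wrong for anything after 1999
            "type_code": type_code,
            "index": index,
            "revision": revision}

print(decode("A4-95-BE33K3-007.A"))
```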

A person working in production, assembling the product and reading the BOM, immediately knows which part to use by its number and drawing. Of course, the word "immediately" is only valid for people who have experience with using this part. And this was not as painful in the previous century as it is now: products were not as sophisticated as they are now, and variation in products was limited.

Later, when information became digital, intelligent numbers were also used by engineering to classify their parts. The classification digits would assist the engineer to find similar parts in a drawing directory or drawing list.

And if the world had not changed, there would still be intelligent part numbers.

Why no more intelligent part numbers?

There are several reasons why you would not use intelligent part numbers anymore.

  1. An intelligent numbering scheme works in a perfect world where nothing is changing. In real life, companies merge with other companies, and then the question comes up: do we introduce a new numbering scheme, or is one of the existing schemes going to be the perfect scheme for the future? If this has happened a few times, a company might think: do we have to go through this again and again? Especially as topic #2 has probably also occurred.
  2. The numbering scheme does not support current products and their complexity anymore. Products change from mechanical towards systems, containing electronic components and embedded software. The original numbering system never catered for that. Is there an overarching numbering standard? It is getting complicated; perhaps we can change? And here #3 comes in.
  3. As we are now able to store information in a digital manner, we can link a few descriptive attributes to this complex part number that help us identify the component. Here the number becomes less important, still serving as access to the unique metadata. Consider it like the bar code on a product: nobody reads the bar code without a device anymore, and the device, connected to an information system, provides the right information. This brings us to the last point, #4.
  4. In a digital enterprise, where data is flowing between systems, we need unique identifiers to connect datasets between systems. The most obvious example is the part master data. Related to a unique ID, you will find in the PDM or PLM system the attributes relevant for overall identification (description, revision, status, classification) and further attributes relevant for engineering (weight, material, volume, dimensions).
    In the ERP system, you will find a dataset with the same ID and master attributes. However, here they are extended with attributes related to logistics and finance. The unique identifier guarantees that data is connected in the correct manner and that information can flow or be connected between systems without human interpretation or human processing time (see the sketch after this list).
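A minimal sketch of this principle, with illustrative dictionaries standing in for the two systems' part masters; the attribute names are my assumptions:

```python
part_id = "8f3a9c4e"   # the same meaningless ID in both systems

plm_master = {part_id: {"description": "Bearing 33K3", "revision": "A",
                        "status": "Released", "weight_kg": 0.12}}
erp_master = {part_id: {"description": "Bearing 33K3", "revision": "A",
                        "lead_time_days": 14, "std_cost_eur": 3.85}}

# Joined purely on the shared unique ID - no human interpretation needed
combined = {**plm_master[part_id], **erp_master[part_id]}
print(combined)
```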

And this is one of the big benefits of a digital enterprise: reducing overhead in data handling, often reducing the cost of data handling by 50 % or more (people / customizations).

 

What to do now in your company?

There is no business justification to start renumbering parts just for future purposes. You need a business reason. Otherwise, it will only increase costs and create a potential for migration errors. Moving to meaningless part numbers is best done at the moment a change is required anyway, for example, when you implement a new PLM system or when your company merges with another company. At these moments, part numbering should be considered with the future in mind.

And the future is no longer about memorizing part classifications and numbers, even if you are from the generation that used to structure and manage everything inside your brain. Future businesses rely on digitally connected information, where a person, based on machine interpretation of a unique ID, will get the relevant and meaningful data. Augmented reality (picture above) is becoming more and more available. It is now human beings who need to get ready for a modern future.

 

Conclusion

Intelligent part numbers are a best practice from the previous century. Start to think digital and connected, and try to reduce the dependency on understanding the part number in all your business activities. Move towards providing the relevant data for a user. This can be an evolution, smoothing a future PLM implementation step.

 

Looking forward to discussing this topic and many other PLM-related practices with you face to face during the Product Innovation conference in Munich. I will talk about the PLM identity change and lead a focus group session about PLM and ERP integration: looking from the high level while working in the real world – the challenge of every PLM implementation.

 

  1. It does not make sense to define the future of PLM
  2. PLM is not an engineering solution anymore
  3. Linearity of business is fast becoming a holdback
  4. The Product in PLM is no longer a mechanical Product
  5. Planet Lifecycle Management has made its next major step

 

It does not make sense to define the future of PLM

At the beginning of this year, there was an initiative to define the future of PLM for 2025, to give companies, vendors and implementors guidance on what is critical and needed for PLM in 2025. Have a read here: The future of PLM resides in Brussels.
I believe it is already hard to agree on what the recognized scope of PLM has been in the past 10 years; how then can we define the future of PLM for the next 10 years? There are several trends currently happening (see the top 5 above) that can each be either in or out of scope for PLM. It is no longer about the definition of PLM; it is about dynamically looking at how businesses adapt their product strategy to new approaches.

Therefore, I am more curious how Product Innovation platforms or Business Innovation platforms will evolve, instead of focusing on a definition of what PLM should be in 2025. Have a further look, for example, at The Next Step in PLM's Evolution: Its Platformization, a CIMdata positioning paper.

Conclusion: The future is bright and challenging, let's not fence it in with definitions.

PLM is not an engineering solution anymore

More and more in all the discussions I had this year with companies looking into PLM, most of them now see PLM as a product information backbone throughout the lifecycle, providing a closed loop of information flow and visibility across all disciplines.

End-to-end visibility, end-to-end traceability and real-time visibility were some of the buzzwords dropped in many meetings.

These words really express the change happening. PLM is no longer an engineering front-end towards ERP; PLM interacts at each stage of the product lifecycle with other enterprise systems.

End-to-end means that when products are manufactured, we still follow them through the manufacturing process (serialization) and their behavior in the field (service lifecycle management / field analytics).

All these concepts require companies to align in a horizontal manner, instead of investing in optimizing their silos. Platformization, as discussed above, is a logical step for extending PLM.

Conclusion: If you implement PLM now, start thinking first about the end-to-end flow of information. Or to be more concrete: don't be tempted to start with engineering first. It will lock your new PLM again in an extended PDM silo.

 

Linearity of business is fast becoming a holdback

Two years ago I started talking about this: Did you notice PLM is changing? This topic was not in the mainstream of PLM discussions two years ago. Now, with the introduction of more and more software in products (products become systems), the linear process of bringing a product to the market has become a holdback.

The market / your customers expect faster, incremental innovations and upgrades, preferably without having to invest again in a new product. If you look back, the linear product development approach has not changed since the Second World War; we just automated the linear process more and more. Remember the New Product Introduction hype around 2004-2006, where companies started to extend the engineering process with a governance process to follow a product's introduction from its early concept phase towards a globally available product. This process is totally linear. I wrote about it in my post: from a linear world to fast and circular, where the word circular also addresses the change of delivering products as a service instead of delivering them once and scrapping them.

One of my favorite presentations is from Chris Armbruster: Rethinking Business for Exponential Times – enjoy it if you haven't seen this one.

Conclusion: In the past two years, the discussion related to modern, data-driven dynamic products and services has increased rapidly. Now, with IoT, it has become a hype that will be formalized soon, as life goes faster and faster.

 

The Product in Product Lifecycle Management is no longer a mechanical Product

I have mentioned it already in the previous point: the traditional way of working, designed for and targeting a linear product development process, is no longer enough to support the product lifecycle.

When I started to implement PDM systems in the nineties, we tried to keep electrical engineering outside the scope as we had no clue how to manage their information in the context of a mechanical design. It was very rudimentary. Now PLM best practices exist to collaborate and synchronize around the EBOM in an integrated manner.

The upcoming challenge now is due to the software used in products, which turns them into systems. And not only that: software can be upgraded in a minute, so the classical ECR / ECO processes designed for hardware create too much overhead. Agile is the motto for software development processes. Now, we (PLM consultants/vendors) are all working on concepts and implementations where these worlds come together. PLM (Product Lifecycle Management), ALM (Application Lifecycle Management) and SysLM (System Lifecycle Management, as introduced by Prof. Martin Eigner – have a read here) are all abbreviations representing particular domains that need to flow together.

Conclusion: For most companies, products are becoming systems with electronics and software. This requires new management and governance concepts – the challenge for all vendors & implementors.

 

Planet Lifecycle Management has made its next major step

Finally, good news came at the beginning of December, when for the first time all countries agreed that our planet needs to have a sustainable lifecycle. Instead of the classical lifecycle from cradle to grave, we want to apply a sustainable lifecycle to this planet, while it is still possible. This decision is a major breakthrough, pushing us all to leave the unsustainable past behind and to innovate and work on the future. The decisions taken in Paris should be considered a call for innovative thinking. PLM can learn from that, as I wrote earlier this year in my post PLM and Global Warming.

Conclusion: 2015 was a year in which some new trends became clear. Trends will become commodity faster and faster – a challenge for all of us to stay connected and understand what is happening. Never before has the human brain been challenged to adapt to change with such an impact.


 

Closing 2015 means for me a week of quietness and stepping out of the fast lane. I wish you all a healthy 2016 with a lot of respect, compromises and changing viewpoints. The current world is too complex to solve issues with one-liners.
Take your time to think and reflect – it works!

SEE AND HEAR YOU BACK IN 2016



In the past weeks, I have discussed at various events two topics that appear to be different:

  • The change from an analogue, document-driven enterprise towards a digital, data-driven enterprise with all its effects. E.g. see From a linear world to fast and circular?
  • The upcoming change in generations: the behavior and attitude of the analogue generation(s) and the difference in behavior of the digital generation(s).

During PDT2015 (a review of the conference here), we discussed all the visible trends showing that business is changing exponentially in some industries due to digitalization and ever-cheaper technology. The question not answered during that conference was: how are we going to make this happen in your company?

HOW?

Last week I spoke at a PLM forum in Athens and shared with the audience the opportunities for Greece to catch up and become a digital service economy like Singapore. Here I pictured an idealistic path for how this could happen (based on an ideal world where people think long-term).

A mission impossible, perhaps.


The primary challenge in moving from analogue towards digital is, in my opinion, the difference in behavior of the analogue and digital generations (and I am generalizing, of course).

The analogue generation has been educated that knowledge is power. Store all you know in your head or keep it in books close to you. Your job depended on people needing you. Those who migrated to the digital world mostly continued the same behavior: keep information on your hard disk or in your mailbox. A job was for life, and do not plan to share, as your job might come at risk. Continuous education was not part of their work pattern. And it is this generation that is in power in most of the traditional businesses.


The digital generation has been educated (I hope so – not sure for every country) to gather information, digest and process it, and come up with a result. There is no need to store information in your head, as there is already an information overflow. Store in your head the methodology and practices to find and interpret data. The digital generation for sure wants a stable work environment, but they grew up with the mindset that there is no job for life, having seen several crises. It is all about being flexible and keeping your skills up-to-date.

So we have the dilemma here that business is moving from analogue towards digital, where the analogue business represents the linear processes that the old generation was used to. Digital business is much more an iterative approach, acting on and adapting to what happens around you – a perfect match for the digital generations.

A dilemma?

Currently the old generation is leading, and they will not easily step aside, due to their classical education and behavior. We cannot expect behavior to change just because it is logically explained. In that case, everyone would stop smoking or adopt other healthy standards.

The dilemma reminded me of the Innovator's Dilemma, the famous theory from Clayton Christensen, which could also apply to analogue and digital businesses. Read more about the Innovator's Dilemma here in one of my older blog posts: The Innovator's dilemma and PLM. You can replace the incumbent with the old analogue generation, and the disruptive innovation comes from using digital platforms and information understood by the digital generation. If you follow this theory, it would mean old businesses would disappear and new businesses would pop up and overtake the old companies. An interesting conclusion; however, will there be disruption everywhere?

Recently I saw Peter Sondergaard from Gartner presenting at Gartner Symposium/ITxpo 2015 in Orlando. In his keynote speech, he talked about the value of algorithms, first introducing how companies should move from their traditional analogue business towards digital business in a bimodal approach. Have a read of the press release here.

If you have the chance to view his slick and impressive keynote video (approx. 30 minutes), you will understand it better. Great presentation. In the beginning, Peter talks about the bimodal approach: sustaining old, slowly dying analogue businesses while meanwhile building teams that develop a digital business approach. The graph below says it all.


Interesting about this approach is that a company can evolve without being disrupted. Still, my main question remains: who will lead this change from the old analogue business towards a modern digital business approach? Will it be the old generation coaching the new generation, or is a natural evolution at board level required before this process starts?

HOW?

I have no conclusion this time, as I am curious about your opinion. A shift in business is imminent, but HOW will companies / countries pick up this shift?

Your thoughts or experiences?

In this post, observations from the PDT 2015 conference, which took place in the IVA Conference Center, part of the Royal Swedish Academy of Engineering Sciences in Stockholm.

The conference was hosted by Eurostep supported by CIMdata, Airbus, Siemens Energy and Volvo AB.

For me, the PDT conference is interesting because there is a focus on architecture and standards, flavored with complementary inspiring presentations. This year there were approximately 110 participants from 12 countries, coming from different industries, listening to 25 presentations spread over two days.

Some highlights

Peter Bilello from CIMdata kicked off the conference with his presentation: The Product Innovation Platform: What's Missing.

Peter explained how the joint vision of CIMdata, Gartner and IDC related to a product innovation platform is growing.

The platform concept is bringing PLM to the enterprise level as a critical component to support innovation. The main challenge is to make the complex simple – easier said than done, but I agree this is the real problem for all the software vendors.

Peter showed an interesting graph based on a survey done by CIMdata, showing two trends.

  • The software and technology capabilities are closing the gap with the vision more and more (a dream can come true)
  • The gap between the implemented capabilities and the technically possible capabilities is growing too. Of course, there is a difference between the leaders and followers.

Peter described the three success factors determining if a platform can be successful:

  • Connection: how easy is it for others to connect and plug into the platform to participate as part of the platform? Translated to capabilities, this requires the platform to support open standards to connect external data sources, as you do not want to build new interfaces for every external source. Also, the platform provider should provide an integration API with a low entry barrier to create the gravity (next point).
  • Gravity: how well does the platform attract participants, both producers and consumers? Besides flexible and targeted user interfaces, there must be an infrastructure that allows companies to model the environment in such a manner that it supports experts creating the data, but also supports consumers of data, who are not able to navigate through details and want a consumer-friendly environment.
  • Flow: how well does the platform support the exchange and co-creation of value? The smartphone platforms are extremely simple compared to a business platform, as the dimension of lifecycle status and versioning is not there. A business platform needs support for versioning and status, combined with relating the information in the right context. Here I would say only the classical PLM vendors have in-depth experience with that.

Having read these three bullet points, and keeping existing enterprise software vendors for PLM, ERP, and other "platforms" in mind, you see there is still a way to go before we have a "real" platform available.

According to Peter, companies should start with anchoring the vision for a business innovation platform in their strategic roadmap. It will be an incremental journey anyway. How clearly the vision is connected to business execution in reality differentiates leaders from followers.


 

Next, Marc Halpern from Gartner elaborated on enabling Product Innovation Platforms. Marc started by saying that the platform concept is still part of the process of optimizing PLM.

Marc explained the functional layers making up a product innovation platform, see below

 

Gartner-platform layering

According to Marc, in 2017 the major design, PLM and business suite vendors will all offer product innovation platforms, and certain industries are likely to implement product innovation platforms faster than others.

Marc stressed that moving to a business innovation platform is a long, but staged, journey. Each stage of the journey can bring significant value.

Gartner has a 5-step maturity model based on the readiness of the organization. Moving from reactive, through repeatable and integrating, towards collaborating and ultimately orchestrating, companies become business-ready first for PDM, next for PLM, and finally for the Product Innovation Platform. You cannot skip any of these steps, according to Marc. I agree: PLM implementations in the past failed because the company was dreaming that the PLM system would solve the business readiness of the organization.

Marc ended with a case study and the conclusions were not rocket science.

The importance of change management, management understanding and commitment, and joint involvement of business and IT. A known best practice; still, in many situations we fail to act accordingly due to underestimation of the effort. See also my recent blog post: The importance of change management for PLM.

The next session, from Camilla Wirseen, was a real revelation. Her presentation: We are all Peepoople – innovation from the bottom of the pyramid.

She described how Anders Wilhelmson, originally a professor of architecture, focused on solving a big global problem affecting 2.5 billion people. These 2.5 billion people, the poorest in the world, lack sanitation, which results in a high death rate for children (every 15 seconds a child dies because of contaminated water). The lack of safe places for sanitation also leads to girls dropping out of school and to women and children being at risk of rape when going to toilet places.

The solution is a bag made of high-performance biodegradable plastic, combined with chemicals already in the bag that process the feces, killing potential pathogens and making the content available as fertilizer for the agricultural industry.

The plastic bag might not be new, but adding the circular possibilities to it makes it a unique approach, creating a business model around collecting and selling the content again. For the poorest, every cent they can earn makes a difference.

peepoople statement

In initial projects, the Peepoo system has already proven its value: over 95 % user acceptance. It is the establishment that does not want to introduce Peepoo on a larger scale. Apparently, they never realized the problems with sanitation themselves.

Peepoo is scaling up and helping the bottom of our society. And the crazy fact is that it was not invented by engineers but by an architect. This challenges everyone to see where you can contribute to a better world. Have a look at peepoople.com – innovation with an enormous impact!

Next, Volvo Cars and Volvo Trucks presented similar challenges: how to share product data in external collaboration. The challenge for Volvo Cars is that it has gone through different ownerships, requiring a more and more flexible infrastructure to share data. It is not about pushing data to a supplier anymore; it is about integrating partners, where you have to share a particular part of your IP with the partner. And where the homegrown KPD system works well for internal execution, it was never designed for partner sharing and collaboration. Volvo Cars implemented a Shared Technology Control application outside the firewall based on Share-A-space, where inside and outside data is mapped and connected. See their summary below. A pragmatic approach which brings direct benefits.


Concluding from the Volvo sessions: apparently it's hard to extend an existing system or infrastructure for secure collaboration with an external partner – the complexity of access rights, different naming conventions, etc. Instead, it is more pragmatic to have an intermediate system in the middle, like Share-A-space, that connects both worlds. The big advantage of Share-A-space is that the platform is based on the ISO 10303 (PLCS) standard and, therefore, has one of the characteristics of a real platform: openness based on standards.

Jonas Hammerberg from the Awesome Group closed day one with an inspiring and eye-opening presentation: Make PLM – The Why and How with Gamification FUN.

Jonas started by describing the behavioral drivers new generations have, based on immediate feedback for the feeling of achievement, pride and status, and being in a leading environment, combined with the feelings of being in a group: friendship, trust, and love.

Current organizations are not addressing these different behaviors, which leads to disengagement at the office / work floor, as Jonas showed from a survey held in Sweden (see figure). The intrinsic motivation is missing – one of the topics that concerns me most when looking at current PLM implementations.

engagement

The Awesome Group has developed apps and plug-ins for existing software (office and PLM) that bring the feelings of autonomy, mastery and purpose to the individual performing in teams: direct feedback, stimulating team and individual performance as part of the job.

By doing so, the organization also gets feedback on the behavior, activity, collaboration and knowledge sharing of individuals, and how this relates to their performance. An interesting concept, to be implemented in situations where gamification makes sense.

Owe Lind and Magnus Lidström from Scania talked about their Remote Diagnostics approach, where diagnostic readings can be received from a vehicle through a mobile phone network, either to support preventive maintenance or actual diagnostics on the road and provide support.

Interestingly, Owe and Magnus did not use the word IoT (Internet of Things) at all, a hype related to these capabilities. Have a look here on YouTube.

There was no chance to fall asleep after lunch, when Robin Teigland from the Stockholm School of Economics took us in a whirlwind through several trends under the title: The Third Revolution – exploring new forms of value creation through doing more with less.

The decomposition of traditional business into smaller and much faster communities undermines traditional markets, with concepts like Uber and Bitcoin becoming a serious threat. The business change resulting from connectivity and communities leads to more and more networks of skills, bringing together the knowledge to design a car (Local Motors) or funding (Kickstarter) – and it is all about sharing knowledge instead of keeping it inside. Sharing creates the momentum in the world. You can look at Robin's presentation(s) on Slideshare here.

future quote

All very positive trends for the future; however, a big threat to the currently established companies. Robin named it the Third Revolution, which is in line with what we are discussing in our PLM world, although some of us even call it the Fourth Revolution (Industry 4.0).


Professor Martin Eigner from the Technical University of Kaiserslautern brought us back to reality in his presentation: Industry 4.0 or Industrial Internet: What is the impact for PLM?

Martin stood at the base of what we call PLM, and for several years already he has been explaining to us that the classical definition of PLM is too narrow. More and more we are developing systems instead of products. Therefore, he prefers the abbreviation SysLM, which is more than 3 characters and therefore probably hard for the industry to accept.

PDMtoSysLM

System development and, therefore, multidisciplinary development of systems introduces a new complexity. Traditional change management for mechanical CAD (ECO/ECR) is entirely different from how software change management is handled (baselines / branches related to features). The way systems are designed requires a different methodology, where systems engineering is an integral part of the development process; see Model-Based Systems Engineering (MBSE).

Next, Martin discussed 4 potential IT architectures where, based on the "products" and business needs, a different balance of PLM, ALM or ERP activities is required.

Martin's final point was about the need for standards to support these architectures, bringing together OSLC, PLCS, etc.
Standards are necessary for fast and affordable integrations and data exchange.

My presentation, The Perfect Storm or a Fatal Tsunami, partly summarized topics from the conference and, in addition, touched on two topics.

The first topic is related to big data and analytics. Many are trying to get a grip on big data with analytics. However, the real benefit of big data comes when you are able to apply algorithms to it. Gartner just made an interesting statement related to big data (below), and Marc Halpern added to this quote that there is an intrinsic need for data standards in order to apply algorithms.

Gartner algorithms

When algorithms can be used, classical processes like ECO and ECR, or even managers, might become obsolete, and even a job like accountant is at risk, as predicted in an article in The Economist in February 2014 – The Onrushing Wave.

The second topic, where I believe we are still hesitating too long at management level, is making decisions to anticipate the upcoming digital wave and all of its side effects. We see a huge wave coming. If we do not mobilize the people, this wave might be a tsunami for those still at the seaside.

Conclusion: PDT2015 was an inspiring, well-balanced conference with an excellent opportunity to network with all people attending. For those interested in the details of the PLM future and standards, an ideal opportunity to get up to date. And next, the challenge: make it happen at your company!

…and if you reached this point, my compliments for your persistence in reading it all. Too long for a blog post, and even here I had to cut content.

 

This is a post I published on LinkedIn on July 28th, related to a discussion around Excel and PLM usage and usability.
Reposted for my blog subscribers.


This post is written in the context of two posts that recently caught my attention: one post from Lionel Grealou comparing PLM and Excel collaboration, and a reaction to this post and its comments by Oleg Shilovitsky – PLM Need for speed.

Both posts discuss the difference between Excel (easy to use / easy to deploy) and a PLM system (complex to use / complicated deployment). And when you read both posts, you would believe that it is mainly deployment and usability that block PLM systems from being used instead of Excel.

Then I realized this cannot be the case. If usability and deployment were blocking issues for an enterprise system, how would it be possible that the system most infamous for usability, SAP, is one of the top-selling enterprise applications? Probably SAP is the best-selling enterprise application. In addition, I have never heard any company mention that SAP is easy to deploy. So what is the difference?

I assume that if Excel had existed in its current state in the early days of MRP, people might have been tempted to use Excel for some ERP functions. However, they would soon have realized that Excel is error-prone: when you buy the wrong materials or make errors in your resource scheduling, you would soon try to solve it in a more secure way – using an ERP system.

ERP systems have never been sold to users for their usability. It is more that the management is looking for guarantees that the execution process is under control: minimize the potential for errors and try to automate activities as much as possible. As the production process is directly linked to finance, it is crucial to have it under control. Goodbye usability, safety first.

Why is this approach not accepted for PLM?
Why do we talk about usability?

First of all, the roots of PLM lie in the engineering department (PDM) and, therefore, its primary data management system was not considered an enterprise system. And when you implement a system for a department, discussions will be at the user level. So user acceptance became necessary for PDM and PLM.

But this is not the main reason. Innovation, product development, sales engineering and engineering are all iterative activities. In contrast to ERP, there is no linear process defining how to develop the ultimate product right the first time. Although this belief existed in the nineties, as an ERP country manager I met at that time told me:

“Engineers are resources that do not want to be managed, but we will get them.”

An absurd statement, I hope you agree. However, the thought behind this statement is valid: how do you make sure product development is done in the most efficient manner?

If you look at large enterprises in the aerospace or automotive industry, they implemented PLM, which for sure was not user-friendly. Why did they implement PLM? Because they did not want to fix the errors an Excel-like implementation would bring.

Using Excel has a lot of hidden costs. How do you make sure you work with the right version when multiple copies exist? How do you know the Excel file does not contain any typo pointing to wrong parts? You will learn this only once it is too late. How do you understand the information related to the Excel file (CAD files, specifications, etc.)? All this leads to a lot of extra manual work, depending on the accuracy and discipline of every employee in the company. Large enterprises do not want to be dependent on individual skills.

Large enterprises have shown that it is not about usability in the first place if you wish to control the data. As with ERP systems, they are aware of the need for PLM with reduced usability over being (fl)Exel with all its related inconvenience.

I believe that when there is a discussion about PLM or Excel, we have not reached the conceptual level needed to implement PLM. PLM is about sharing data and breaking down silos. Sharing allows better and faster collaboration while maintaining quality, and this is what companies want to achieve. Therefore the title: how do you measure collaboration? Collaboration is the process you wish to optimize, and I suspect that if you compared user-friendly collaboration in Excel with less user-friendly PLM, you might discover PLM is more efficient.

Therefore, stop comparing Excel and PLM. It is all about enabling collaboration and changing people to work together (the biggest challenge – more than usability).

Conclusion: Once we have agreed on that concept – PLM value is about collaboration – there is always hope to enhance usability. Even SAP is working on that; it is an enterprise software issue.

In my previous post describing the various facets of the EBOM, I mentioned classification several times as an important topic related to the PLM data model. Classification is crucial to support people in reusing information. In addition, there are business processes that are only relevant for a particular class of information, so it is not only related to search/reuse support.

In 2008, I wrote a post about classification; you can read it here. Meanwhile, the world has moved on, and I believe more modern classification methods exist.

Why classification ?

First of all, classification is used to structure information and to support retrieval of the information at a later moment, either for reuse or for reference later in the product lifecycle. Related to reuse, companies can save significant money when parts are reused. It is not only the design time or sourcing time that is reduced; additional benefits are lower risk of errors (fewer discoveries), reduced process and approval time (human overhead), reduced stock (if applicable), more volume discount (if applicable) and reduced end-of-life handling.

An interesting discussion about reuse, started by Joe Barkai, can also be found on LinkedIn here, including interesting comments.

Classification can also be used to control access to certain information (mainly document classification), or classification can be used to make sure certain processes are followed, e.g. export control, hazardous materials, budget approvals, etc. Although I will speak mainly about part classification in this post, classification can be used for any type of information in the PLM data model.

Classification standards

Depending on the industry you are working in, there are various classification standards for parts. When I worked in the German-speaking countries (the DACH-Länder), the most discussed classification at that time was DIN 4000 (Sachmerkmal-Liste), a must-have standard for many small and medium-sized manufacturing companies. The DIN 4000 standard had a predefined part hierarchy and did not describe the necessary properties per class. I had not met a similar standard in other countries at that time.

Another very generic classification I have seen is the UNSPSC standard, again a hierarchical classification supporting everything in the universe, but with no definition of attributes.

Other classification standards, like ISO 13399, RosettaNet, ISO 15926 and IFC, exist to support collaboration and/or the supply chain, for when you want to exchange data with other disciplines or partners. The advantage of a standard definition (with attributes) is that you can exchange data with less human processing (saving labor costs and time – the benefit of a digital enterprise).

I will not go deeper into the various standards here, as I am not the expert on all of them. Every industry has its own classification standards: a hierarchical standard, and if more advanced, a hierarchy also supported by attributes related to each class. But let's go into the data model part.

Classification and data model

The first lesson I learned when implementing PLM was that you should not build your classification hard-coded into the PLM data model. When working with SmarTeam, it was very easy to define part classes and attributes to inherit. Some customers had more than 300 classes represented in their data model just for parts. You can imagine that this looks nice in a demo. However, when it comes to reality, a hard-coded classification becomes a pain in the model (left image, one of the bad examples from the past).

1 – First of all, classification should be dynamic and easy to extend.

2 – The second problem with a hard-coded classification is that once a part is defined for the first time, the information object has a fixed class. Later changes need a lot of work (relinking of information / approval processes for the new information).

3 – Finally, the third point against a hard-coded classification is that parts will likely be classified according to different classifications at the same time. The image below shows such a multiple classification.

multiclass

So the best approach is to have a generic part definition in your data model and perhaps a few subtypes. Companies still tend to differentiate between hardware (mechanical / electrical) parts and software parts.

Next, a part should be assigned to at least one class, and the assignment to this class brings more attributes to the part. Most PLM systems that support classification have the ability to navigate through a class hierarchy and find similar parts.

When parts are relevant for ERP, they might belong to a manufacturing parts class, which adds the particular attributes required for a smooth PLM – ERP link. Manufacturing part types can be used as templates for ERP to be completed. A minimal sketch of this dynamic, multiple classification approach follows below.
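Here is how such a generic part with dynamic class assignment could look. This is my own illustrative sketch, not the data model of any specific PLM system:

```python
class GenericPart:
    """Generic part: a fixed, meaningless ID plus dynamic class
    assignments, each contributing its own attribute group."""
    def __init__(self, part_id: str):
        self.part_id = part_id
        self.classifications = {}    # class name -> attribute group

    def classify(self, class_name: str, **attributes):
        """Attributes come with the class, not with a hard-coded subtype."""
        self.classifications[class_name] = attributes

p = GenericPart("8f3a9c4e")
p.classify("bearing", inner_diameter_mm=17, standard="33K3")             # engineering view
p.classify("manufacturing_part", procurement="buy", lead_time_days=14)   # ERP-oriented view
```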

This concept is also shared by Ed Lopategui, as he commented on my earlier post about EBOM part types. Ed states:

Think part of the challenge moving forward is we’ve always handled these as parts under different methodologies, which requires specific data structures for each, etc. The next gen take on all this needs to be more malleable perhaps. So there are just parts. Be they service or make/buy or some combination – say a long lead functional standard part and they would acquire the properties, synchronizations, and behaviors accordingly. People have trouble picking the right bucket, and sometimes the buckets change. Let the infrastructure do the work. That would help the burden of multiple transitions, where CAD BOM to EBOM to MBOM to SBOM eventually ends up in a chain of confusion.

I fully agree with his statement and consider this the future trend of modern PLM: shared data that is enriched by different usage through the lifecycle.

Why don’t we classify all data in PLM?

There are two challenges for classification in general.

  • The first one is that the value of classification only becomes visible in the long term. I have seen several young companies that were only focusing on engineering: no metadata in the file properties, no part-centric data management structure, and several years later they face a lack of visibility into what has been done in the past. Only if one of the engineers remembers a similar situation is there a chance of reuse.
  • The second challenge is that through a merger or acquisition, the company suddenly has to manage two classifications. If the data model was clean (no hard-coded subclasses), there is hope of merging the information together. Otherwise, it might become a painful activity to discover similarities.

SO THINK AHEAD, EVEN IF YOU DO NOT SEE THE NEED NOW!

Modern search based applications

There are ways to improve classification and reuse by using search-based applications, which can index archives and try to find similarity in properties / attributes (a naive sketch of the idea follows below). Again, if the engineers never filled in the properties of the CAD model, there is little to nothing to recover, as I experienced in a customer situation. My US PLM peer, Dick Bourke, wrote several articles about search-based applications and classification for engineering.com, which are interesting to read if you want to learn more: Useful Search Applications for Finding Engineering Data.
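At heart, such applications score parts by overlapping attribute values to surface reuse candidates. A deliberately naive sketch of that idea – real tools use far more sophisticated indexing and shape matching:

```python
def similarity(a: dict, b: dict) -> float:
    """Naive Jaccard similarity over attribute name/value pairs."""
    pairs_a, pairs_b = set(a.items()), set(b.items())
    return len(pairs_a & pairs_b) / len(pairs_a | pairs_b)

bearing_1 = {"type": "bearing", "standard": "33K3", "inner_d_mm": 17}
bearing_2 = {"type": "bearing", "standard": "33K3", "inner_d_mm": 20}
print(similarity(bearing_1, bearing_2))  # 0.5 - a candidate for reuse review
```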

So much more to discuss on this topic; however, I have reached my 1000 words again.

Conclusion

Classification brings benefits for reuse and discovery of information, although the benefits are long-term. Think long-term too when you define classifications. Keep the data model simple and add attribute groups to parts based on functional classifications. This enables a data-driven PLM implementation where the power is in the attributes, no longer in the part number. In the future, search-based applications will offer a quick start to classify and structure data.

 

Someone notified me that not everyone subscribed to my blog will necessarily read my posts on LinkedIn. Therefore, in the upcoming weeks I will repost some of my more business-oriented posts from LinkedIn here too. This post is from July 3rd and is an introduction to all the methodology posts I am currently publishing.


The importance of a (PLM) data model

What makes it so hard to implement PLM in a correct manner, and why is this often a mission impossible? I have been asking myself this question again and again for the past ten years. For sure, a lot has to do with the culture and legacy every organization has. Imagine if a company could start from scratch with PLM. How would they implement PLM nowadays?

My conclusion for both situations is that it all leads to a correct (PLM) data model, allowing companies to store their data in an object-oriented manner, in this way reflecting the behavior the information objects have and the way they mature through their information lifecycle. If you make compromises here, it has an effect on your implementation, on the way processes are supported out-of-the-box by a PLM system, and on how information can be shared with other enterprise systems, in particular ERP. PLM is written in parentheses as I believe that in the future we will not talk about PLM or ERP separately anymore – we will talk business.

Let me illustrate this academic statement.

A mid-market example

When I worked with SmarTeam in the nineties, the system was designed more as a PDM system than a PLM system. The principal objects were Projects, Documents, and Items. The Documents had a sub-grouping in Office documents and CAD documents. And the system had a single lifecycle, which was very basic and designed for documents. Thanks to the flexibility of the system, you could quickly implement a satisfactory environment for the engineering department. Problems (and customizations) came when you wanted to connect the data to the other departments in the company.

The sales and marketing department defines and sells products. Products were not part of the initial data model, so people misused the Project object for that. To connect to manufacturing, a BOM (Bill of Material) was needed. As the connected 3D CAD system generated a structure while saving the assemblies, people started to consider this structure as the EBOM. This might work if your projects are mechanical only.

However, a Document is not the same as a Part. A Document has completely different behavior from a Part: Documents have continuous iterations, with a check-in/check-out mechanism, whereas the Part definition remains unchanged and meanwhile gains a higher maturity.

The correct approach is to have an EBOM Part structure, where Parts connect to the Documents (see the sketch below). And yes, Documents can also have a structure, but it is not a BOM. SmarTeam implemented this around 2004. Meanwhile, a lot of companies had implemented their own custom solution for the EBOM through customization, not matching this approach. This created a first level of legacy.
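A minimal sketch of this behavioral difference, with illustrative class names of my own (not SmarTeam's actual object model):

```python
class Document:
    """Documents iterate continuously via check-out / check-in."""
    def __init__(self, name: str):
        self.name, self.iteration, self.checked_out = name, 1, False

    def check_out(self):
        self.checked_out = True

    def check_in(self):
        self.iteration += 1        # every check-in creates a new iteration
        self.checked_out = False

class EBOMPart:
    """Parts keep a stable definition that matures; they reference
    child Parts (the EBOM) and the Documents that specify them."""
    def __init__(self, part_id: str):
        self.part_id = part_id
        self.children = []         # EBOM structure: Part -> Parts
        self.documents = []        # specification: Part -> Documents
```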

When SmarTeam implemented Part behavior, it became possible to create a multidisciplinary EBOM, and the next logical step was, of course, to connect the data to the ERP system. At that time, most implementations pushed the EBOM to the ERP system and let it live on there. ERP was the enterprise tool, SmarTeam the engineering tool. The information became disconnected in an IT manner. Applying changes and defining a manufacturing BOM was done manually in the ERP system and could be done by (experienced) people who do not make mistakes.

The next challenge comes when you want to automate the connection to ERP. In that case, it becomes apparent that the EBOM and MBOM should reside in the same system (see this old and still relevant post with comments: Where is the MBOM?): in one system to manage changes and to be able to implement these changes quickly without too much human intervention. And as the EBOM is usually created in the PLM system, the (commercial/emotional) PLM-ERP battle started. "Who owns the part definition?" and "Who owns the MBOM definition?" became the topic of many PLM implementations. The real questions should be: "Who is responsible for which attributes of the Part?" and "Who is responsible for which part of the MBOM definition?" as data should be shared, not owned.

The SmarTeam evolution shows how a changing scope and an incomplete/incorrect data model lead to costly rework when aligning to the mainstream. And this is happening with many implementations of this and other PLM systems, in particular when the path is to grow from PDM to PLM. An important question remains: what is going to be mainstream in the future? More on that in my conclusion.

A complex enterprise example

In recent years, I have been involved in several PLM discussions with large enterprises. These enterprises suffer from their legacy. Often the original data management was not defined in an object-oriented manner, and the implementation has been expanding with connected and disconnected systems like a big spaghetti bowl.

The main message most of the time is:

“Don’t touch the system, as it works for us.”

The underlying message is:

“We would love to change to a modern approach, but we understand it will be a painful exercise, and how will it impact the profitability and execution of our company?”

The challenge these companies have is that it is extremely hard to imagine the potential to-be situation and how it is affected by the legacy. In a project I participated in several years ago, the company was migrating from a mainframe database towards a standard object-oriented (PLM) data model. The biggest pain was in mapping data to the object-oriented data model. As the original mainframe database had all kinds of tables with flags and mixed Part & Document data, it was almost impossible to make a 100 % conversion. The other challenge was that knowledge of the old system had evaporated. The result in the end was a customized PLM data model, closer to current reality, still containing legacy "tricks" to assure compatibility.

All these enterprises have to go through such a painful exercise at a particular time. When is the best moment? When business is booming, nobody wants to slow down. When business is in a lower gear, costs and investments are minimized to keep the old engine running efficiently. I believe the latter would be the best moment to invest in making the transition, if you believe your business will still exist 10 years from now.

Back to the data model.

Businesses today should have a high-level, object-oriented data model describing the main information objects and their behavior in the organization. The term Master Data Management is related to this. How many companies have the time and skills to implement a future-oriented data model? And the data model must stay flexible for the future.

Compare it to your brain, which also stores information by its behavior; by learning, the brain understands what is logically related. The internal data model gets enriched while we learn.

Once you have a business data model, you are able to implement processes on top of it. Processes can change over time; therefore, avoid hard-coding specific processes in your enterprise systems. Like the brain, we can change our behavior (applying new processes) while it remains based on the data model stored inside our brain.
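As a small illustration of this principle, the sketch below keeps the data model stable while processes are just replaceable logic on top of it. Object and status names are hypothetical:

```python
# Minimal sketch: a stable high-level data model with processes layered on top.
# Object and status names are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class Part:
    part_id: str
    description: str
    status: str = "In Work"          # the lifecycle state lives on the object
    related_docs: list = field(default_factory=list)

# Processes are plain functions over the model; they can be replaced
# without touching the data model itself (avoid hard-coding them).
def simple_release(part: Part) -> None:
    part.status = "Released"

def release_with_review(part: Part, approved: bool) -> None:
    part.status = "Released" if approved else "Rejected"

p = Part("PT000123", "Housing")
simple_release(p)                      # today's process
release_with_review(p, approved=True)  # tomorrow's process, same data model
print(p.status)
```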

Conclusion:

A lot of enterprise PLM implementations are in a challenging situation due to legacy or due to incomplete understanding and availability of an enterprise data model. Therefore, cross-department implementations and connections to other systems are considered a battle between systems and their proprietary capabilities.


The future will be based on business platforms, and realizing this takes years – think openness and usage of data standards. An interesting conference to attend in the near future for this purpose is the PDT2015 conference in Stockholm.

Meanwhile, I also learned that a one-day Master Data Management workshop will be held before the PDT2015 conference starts on the 12th of October. A good opportunity to deep-dive for three days!

In my series of blog posts related to the (PLM) data model, I talked about Product, BOMs and Parts. This time I want to focus on the relation between the EBOM and (CAD) Documents. This topic became relevant with the introduction of 3D CAD.

Before companies were using 3D CAD systems, there was no discussion about EBOM or MBOM (to my knowledge). Engineering was producing drawings for manufacturing, and not every company was using the mono-system (a specifying drawing for each individual part). Drawings were mainly made to assist production, and making a drawing for every individual part was considered a waste of engineering time. Parametric drawings were used to specify similar parts. But now we are in the world of 3D!

With the introduction of 3D CAD systems for the mainstream in the nineties (SolidWorks, Solid Edge, Inventor), there came a need for PDM systems managing the individual files from a CAD assembly. The PDM system was necessary to manage all the file versions. Companies that were designing simple products sometimes remained working file-based, introducing the complexity of how to name a file and how to deal with revisions. Ten years ago, I was investigating data management for the lower tiers of the automotive supply chain. At that time, still 60% of the suppliers using CATIA were working file-based. Data management was considered an extra complexity, although file version control was a big pain.

This has changed for several reasons:

  • More and more OEMs were pushing for more quality control of the design data (read: PDM)
  • Products became more modular, which means assemblies can be used as subassemblies in other products, pushing the need for where-used control
  • Products are becoming more complex, and managing only mechanical CAD files is not enough anymore – electronics and software (mechatronics) became part of the product

Most PDM systems at that time (I worked with SmarTeam) were saving the 3D CAD structure as a quantity-based document structure, strongly resembling a structure called the EBOM.

CAD DOC structure

 

This is one of the most common mistakes made in PLM implementations.

The CAD structure does not represent the EBOM !!!

Implementers started to build all kinds of customizations to automatically create a Part structure, the EBOM, from the CAD structure. Usually these customizations ended up as a mission impossible, in particular when customers started to ask for bidirectional synchronization: they expected that when a Part was removed from the EBOM, it would be deleted in the CAD assembly too.
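The direction that does work is generating a draft EBOM from the CAD structure. Below is a minimal sketch of that one-directional generation, with hypothetical structures and IDs; real integrations are far more involved:

```python
# Minimal sketch: one-directional generation of a draft EBOM from a CAD structure.
# Structures and IDs are hypothetical illustrations.

cad_assembly = {
    "doc_id": "CAD-ASM-001",
    "children": [
        {"doc_id": "CAD-PRT-010", "quantity": 4},   # e.g. a bolt modeled once, placed 4x
        {"doc_id": "CAD-PRT-011", "quantity": 1},
    ],
}

# Mapping from a CAD document to the Part it specifies (maintained in the PDM/PLM system).
doc_to_part = {"CAD-PRT-010": "PT000010", "CAD-PRT-011": "PT000011"}

def generate_draft_ebom(assembly: dict) -> list:
    """Create EBOM lines from the CAD structure - one direction only.
    The reverse direction (deleting CAD geometry because an EBOM line
    was removed) is where such customizations became a mission impossible."""
    return [
        {"part_id": doc_to_part[c["doc_id"]], "quantity": c["quantity"]}
        for c in assembly["children"]
    ]

print(generate_draft_ebom(cad_assembly))
# The EBOM will then diverge from the CAD structure: purchased parts, glue,
# paint and alternates are added on the Part side and have no CAD counterpart.
```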

And then there was the issue that companies believed the CAD Part ID should be equal to the Part ID. This might be possible for a particular type of designed parts, but it does not function anymore with flexible parts, such as a tube or a spring. When such a Part is modeled in a different position, it creates a different CAD Document, breaking the one-to-one relation.

Finally, another common mistake that I have seen in many PDM implementations is the addition of glue, paint and other manufacturing types of parts to the CAD model, to be able to generate a BOM directly from the CAD.

From the data model perspective, it is more important to understand that Parts and CAD Documents are different types of objects, in particular if you want to build a PLM implementation where data is shared across all disciplines. For a PDM implementation, I care less about the data model, as the implementation is often not targeting enterprise continuity of data but only engineering needs.

A CAD Document (Assembly / Part / Drawing / …) behaves like a Document. It can be checked in and checked out any time a change is made inside the file. A check-in operation creates a new version of the CAD Document (in case you want to trace the history of changes).

Meanwhile, the Part specified by the CAD Document does not change in version when the CAD Document is changed. Parts usually do not have versions; they remain in the same revision while the specifying CAD Document matures.
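A minimal sketch of this different behavior, with hypothetical class and field names:

```python
# Minimal sketch: CAD Documents get a new version on every check-in,
# while the related Part keeps its revision. Names are hypothetical.
from dataclasses import dataclass

@dataclass
class CadDocument:
    doc_id: str
    version: int = 1

    def check_in(self) -> None:
        self.version += 1            # every saved change remains traceable

@dataclass
class Part:
    part_id: str
    revision: str = "A"              # changes only on a formal revision, not per save

doc = CadDocument("CAD-PRT-010")
part = Part("PT000010")

for _ in range(3):                   # three design iterations
    doc.check_in()

print(doc.version, part.revision)    # -> 4 A : the document matured, the part did not
```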

Moving from PDM to PLM

For a PLM implementation, it is important to think “Part-driven”, which means starting from an initial EBOM, representing the engineering specification of the Product, and maturing it with more and more design specification data. Design specification data can be mechanical assemblies and parts, but also electrical parts. The EBOM of a PCB might come from the electrical design application, as you will not create every component in 3D in the mechanical model.

And once the electrical components are part of the EBOM, the part definition of embedded software can be added to the BOM too, for example when software needs to be uploaded into flash memory chips. By adding electrical and software components to the EBOM, the company gets a full overview of the design maturity of ALL disciplines involved.

The diagram below shows how an EBOM and its related Documents could look:

EBOM.docs

 

This data model contains a lot of details:

  • As discussed in my previous post – for the outside world (the customer) there is a product defined without revision
  • Related to the Product there is an EBOM (Part assembly), simplified as a housing (a mechanical assembly), a connector (a mechanical part) and a PCB (a mechanical representation). All these parts behave like Mechanical Parts; they have a revision and a status.
  • The PCB has a second representation based on an electrical schematic, which has (for simplification) only two electrical parts, a resistor and a memory chip. As you can see, these components are standard purchasable parts; they do not have a revision as they are not designed.
  • The Electrical Part Flash Memory has a relation to a Software Part, which is defined by Object Code (a zip-file?), which of course is specified by a software specification (not in the diagram). The software object code has a version, as most of the time software is version-managed; it does not follow the classical rules of mechanical design. (A minimal sketch of this model in code follows below the list.)
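For those who prefer code over diagrams, here is a minimal sketch of the model above. All IDs, fields and the parent-child placement of the software part are hypothetical illustrations, not the schema of a specific PLM system:

```python
# Minimal sketch of the multidisciplinary EBOM in the diagram above;
# all IDs and fields are hypothetical illustrations.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Document:
    doc_id: str
    kind: str                          # "MCAD", "ECAD schematic", "object code", ...
    version: int = 1

@dataclass
class Part:
    part_id: str
    description: str
    revision: Optional[str] = None     # None: standard purchased part, not designed
    children: list = field(default_factory=list)
    representations: list = field(default_factory=list)  # specifying documents

product = {"product_id": "PR-001"}     # no revision for the outside world

pcb = Part("PT-PCB", "PCB", revision="A",
           representations=[Document("CAD-PCB", "MCAD"),            # mechanical shape
                            Document("SCH-PCB", "ECAD schematic")]) # second representation
resistor = Part("PT-RES", "Resistor")  # purchased: no revision
flash = Part("PT-FLASH", "Flash memory",
             children=[Part("PT-SW", "Embedded software",
                            representations=[Document("OBJ-SW", "object code",
                                                      version=7)])])
pcb.children += [resistor, flash]

ebom = Part("PT-TOP", "Product EBOM", revision="A",
            children=[Part("PT-HSG", "Housing", revision="A"),
                      Part("PT-CON", "Connector", revision="A"),
                      pcb])
print(ebom)
```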

Again I reached my 1000 words, a sign to stop explaining this topic. For sure there are a lot of details to explain about this part of the data model too.

Most important:

  • A CAD structure is not an EBOM (it can be used to generate a part of the EBOM)
  • CAD documents and EBOM parts have a different behavior. CAD documents have versions; Parts do not have versions (most of the time).
  • The EBOM is the place where all disciplines synchronize their data, providing during the development phase a single view of the design status.

Let me know if this was too abstract, and feel free to ask questions. Important for this series of blog posts is to provide a methodology baseline for a real PLM data model.

I am looking forward to your questions or remarks to spark up the discussion.


As described in my latest LinkedIn post, if you want to implement PLM successfully, there are two important points to address from the implementation point of view:

  • An explicit data model, not based on system or tool capabilities, but on the type of business the company is performing. There is a difference between an engineering-to-order company, a build-to-order company and a configure-to-order company.
  • In PLM (and business) it is all about enabling an efficient data flow through the organization. There is no ownership of data. It is about responsibilities for particular content per lifecycle stage, combined with sharing.

Historically, PLM implementations started with capturing the CAD data and the related EBOM, as this is what the CAD-related PLM vendors were pushing for, and this was often the biggest pain for the engineering department. The disadvantage of this approach is that it strengthens silo thinking. The PLM system becomes an engineering tool instead of an enterprise system.

I believe that if you really want to be able to implement PLM successfully in a company, you should start from a common product/part information backbone. This requires the right business objects and, therefore, the right data modeling. The methodology described below is valid for build-to-order and configure-to-order companies, and less applicable for engineering-to-order.

BusinessModels

In a build-to-order company there are the following primary information objects:

  • A Product (representing the customer view of what is sold to the outside world)
  • An EBOM (representing a composition of Parts specifying the Product at a particular time)
  • An MBOM (representing the manufacturing composition of the Product at a given time)

And, of course, for all these information objects there are related Documents of various types. When you work in a more advanced manner, the specification document can be the source for individually extracted requirements (not in this post).
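Before diving into the scenario, here is a minimal sketch of these three primary information objects and their relations. Names and fields are hypothetical illustrations:

```python
# Minimal sketch of the primary information objects; names are hypothetical.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class BomLine:
    part_id: str
    quantity: int

@dataclass
class EBOM:                       # engineering composition of Parts
    bom_id: str
    lines: list = field(default_factory=list)

@dataclass
class MBOM:                       # manufacturing composition (may differ!)
    bom_id: str
    lines: list = field(default_factory=list)

@dataclass
class Product:                    # the customer view of what is sold
    product_id: str
    name: str
    ebom: Optional[EBOM] = None
    mbom: Optional[MBOM] = None

prod = Product("PR-001", "Sensor unit",
               ebom=EBOM("EB-001", [BomLine("PT000123", 1)]),
               mbom=MBOM("MB-001", [BomLine("PT000123", 1),
                                    BomLine("PT-GLUE", 1)]))  # added for manufacturing
print(prod.product_id, len(prod.mbom.lines))
```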

Let´s follow an end-to-end scenario from a typical build-to-order company process.

Quoting phase

A potential customer sends an RFP for a product they need. The customer RFP contains information about how the product should behave (Specification / Requirements) and how it should be delivered (packaging). A basic data model for this RFP would be:

DataModel-1

Note the following details:

  • All information objects have a meaningless number. The number is only there to support unique identification and later integration with other systems. The meaning should come from the other attribute data on the object and its relations. (A blog post on its own)
  • Instead of the meaningless number, the Product can carry the number provided by the customer. However, if this number is not unique within the company, it might be just another attribute of the product
  • In general, Products do not have revisions. In time, there might be other BOMs related to the product. Not covered in this post: products might have versions and variants, and products might be part of a product family. In this case, I used a classification to define a classification code for the product, allowing the company to discover similar products from different customers. This promotes reuse of solutions and reuse of lessons learned.
  • The customer object represents the customer entity, and by implementing it as a separate object, you will be able to see all information related to this customer quickly. This could be Products (ordered / in RFQ / etc.) but also other relevant information (Documents, Parts, …)
  • The initial conceptual BOM for the customer consists of two sub-BOMs. As the customer wants the products to be delivered in a 6-pack, a standard 6-pack EBOM is used (note: its status is Released), and a new conceptual EBOM is defined as a placeholder for the BOM definition of the Product to design/deliver.
  • And for all the Parts in the conceptual EBOM there can be relations towards one or more documents. Usually, there is one specifying document (the CAD model) and multiple derived documents (Drawings, Illustrations, …)
  • Parts can have a revision in case the company wants to trace the evolution of a Part. Usually, when Form-Fit-Function remains the same, we speak about a revision; otherwise, the change leads to a new part number. As more and more of the managed information no longer resides in the part number itself, companies might want to use a new part number at any change, storing in an attribute what its predecessor was.
  • Documents have versions and revisions. While people work on a document, every check-in / check-out moment can create a new version of the file(s), providing traceability between versions. Most of the time, at the end there will be a first released version, which is related to the part it specifies.
  • Do not try to have the same ID and Revision for Parts and Documents. In the good old days of 2D drawings this worked; in the world of 3D CAD it is not sustainable, and it leads to complexity for the user. Preferably, the Part and the specifying Document should have different IDs and different revision mechanisms. (A minimal sketch of these rules in code follows below the list.)
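Here is the promised sketch of the quoting-phase model, with hypothetical names and a trivial generator for the meaningless IDs:

```python
# Minimal sketch of the quoting-phase model described above; all names,
# IDs and the number generator are hypothetical illustrations.
import itertools
from dataclasses import dataclass, field

_seq = itertools.count(1)
def meaningless_id(prefix: str) -> str:
    return f"{prefix}{next(_seq):06d}"      # identification only, no meaning

@dataclass
class Customer:
    customer_id: str = field(default_factory=lambda: meaningless_id("CU"))
    name: str = ""

@dataclass
class Product:
    product_id: str = field(default_factory=lambda: meaningless_id("PR"))
    customer_number: str = ""               # the customer's number: just an attribute
    classification: str = ""                # supports discovering similar products
    # note: no revision on the product

@dataclass
class EBOM:
    bom_id: str = field(default_factory=lambda: meaningless_id("EB"))
    status: str = "Conceptual"
    sub_boms: list = field(default_factory=list)

customer = Customer(name="ACME")
product = Product(customer_number="ACME-4711", classification="sensor/pressure")
six_pack = EBOM(status="Released")          # reused standard packaging definition
concept = EBOM(status="Conceptual")         # placeholder for the product to design
rfq_bom = EBOM(sub_boms=[six_pack, concept])
print(product.product_id, rfq_bom.bom_id, [b.status for b in rfq_bom.sub_boms])
```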

And the iterations go on:

Now let´s look at the final stage of the RFQ process. The customer has requested that the same product also be delivered in single (luxury) packaging, as this product will be used for service. Although it is exactly the same physical product to produce, the product ID should be different. If the customer wants unambiguous communication, they should also use a different product ID when ordering the product for service or for manufacturing. The data model for this situation will look as follows (assuming the definitions are done):

DataModel-2

Note the following details:

  • The Part in the middle (with the red shadow) – PT000123 – represents the same part for both the product ordered for manufacturing and the product ordered for service, making use of a single definition for both situations (a small sketch of this follows the list)
  • The Part in the middle now has a large set of related documentation: not only CAD data but also test information (how to test the product), compliance information and more
  • The Part in the middle on its own also has a deeper EBOM structure, which we will explore in an upcoming post
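And here the small sketch announced in the list: one shared part definition referenced from two product structures, with hypothetical IDs:

```python
# Minimal sketch: one part definition shared by two product structures
# (manufacturing and service); IDs are hypothetical illustrations.
shared_part = {"part_id": "PT000123", "revision": "A",
               "documents": ["CAD model", "test spec", "compliance sheet"]}

product_manufacturing = {"product_id": "PR-MFG-001", "ebom": [shared_part]}
product_service       = {"product_id": "PR-SRV-001", "ebom": [shared_part]}

# Both structures reference the *same* object: a change to the part
# definition is immediately visible in both contexts - a single
# definition, no duplication.
shared_part["revision"] = "B"
print(product_manufacturing["ebom"][0]["revision"],
      product_service["ebom"][0]["revision"])   # -> B B
```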

I reached my 1000 words and do not want to write a book, so I will conclude this post. For experienced PLM implementers this is probably known information. For people entering the domain of PLM, either as a new student or coming from a more CAD/PDM background, it is an interesting topic to follow. In the next post, I will continue towards the MBOM and ERP.

Let me know if this post is useful for you – and of course – enhancements or clarifications are always welcome. Note: some of the functionality might not be possible in every PLM system, depending on its origin and core data model.
