
PDT Europe is over, and this year it was a surprisingly well-aligned conference, showing that ideas and concepts for modern PLM are converging more and more. Håkan Kårdén opened the conference mentioning the event was fully booked: about 160 attendees from more than 19 countries. With a typical attendance of approximately 120 participants, this showed the theme of the conference, Continuous Transformation of PLM to support the Lifecycle Model-Based Enterprise, was very attractive and real. You can find a history of tweets under the hashtag #pdte17.

Setting the scene

Peter Bilello from CIMdata kicked off by bringing some structure to the various Model-Based areas and the Digital Thread. Peter started by mentioning that technology is the least important issue, as organizational culture, changing processes and adapting people's skills are more critical factors for a successful adoption of modern PLM. Something that would repeatedly be confirmed by other speakers during the conference.

Peter presented a nice slide bringing the Model-Based terminology together on one page. Next, Peter took us through the various digital threads in the different stages of the product lifecycle. Peter concluded with the message that we are still in a learning process, redefining optimal processes for PLM using Model-Based approaches and Digital Threads, and that thanks (or due) to digitalization these changes will be rapid. He ended with an overall conclusion that we should keep in mind:


It isn’t about what we call digitalization; It is about delivering value to customers and all other stakeholders of the enterprise

Next, Marc Halpern busted the Myth of Digital Twins (according to his session title) and looked into realistically planning for them. I am not sure if Marc smashed any of the myths, although it is certain the Digital Twin is at the top of the hype cycle and we are all starting to look for practical implementations. A digital twin can have many appearances, depending on its usage. For sure it is not just a 3D virtual model.

There are still many areas to consider when implementing a digital twin for your products. Depending on what and how you apply the connection between the virtual and the physical model, you have to consider where your vendor really is in maturity and avoid lock-in to its approach. In particular in these early stages, you are not sure which technology will last longer, and data ownership and confidentiality will play an important role. And as opposed to chasing quick wins, make sure your digital twin is open and uses open standards as much as possible to stay open for the future, which also means keep aiming to work with multiple vendors.

Industry sessions

Next, we had industry-focused sessions related to a lifecycle Model-Based enterprise and later in the afternoon a session from Outotec with the title: Managing Installed Base to Unlock Service opportunities.

The first presentation, from Väino Tarandi, professor in IT in Construction at KTH Sweden, presented his findings related to BIM and GIS in the context of the lifecycle: a test bed where PLCS meets IFC. Interesting, as I have been involved in BIM Level 3 discussions in the UK, which was already an operational challenge for stakeholders in the construction industry, now extended with the concept of the lifecycle. So far these projects are at the academic level, and I am still waiting for companies to push and discover the full benefits of an integrated approach.

Concepts for the industrial approach could be learned from Outotec, as you might understand later in this post. Of course, the difference is that Outotec is aiming for data ownership along the lifecycle, where in the construction industry each silo is often handled by a different contractor.

Fredrik Ekström from the Swedish Transport Administration shared his challenges of managing assets for both road and railway transport – see image on the left. I have worked around this domain in the Netherlands, where asset management for road infrastructure and asset management for rail infrastructure are handled by two different organizations. I believe Fredrik (and similar organizations) could learn from the concepts in other industries. Again, Outotec's example is about having relevant information to increase service capabilities, where the Swedish Transport Administration is aiming to have the right data for its services. Looking at the challenges reported by Fredrik, I assume he can find the answers in other industries' concepts.

Outotec's presentation about managing the installed base to unlock service opportunities, delivered by Sami Grönstrand and Helena Guiterrez, was, besides entertaining, easy to digest and well-paced. Without being academic, they explained the challenges of a company with existing systems in place moving towards concepts of a digital twin and the related data management and quality issues. Their practical example illustrated that if you have a clear target – understanding a customer-specific environment better in order to sell better services – it can be achieved by rational thinking and doing, a typical Finnish approach. All this including the "bi-modal approach" and people change management.

Future Automotive

Ivar Hammarstadt, Senior Analyst Technology Intelligence at Volvo Cars Corporation, entertained us with a projection toward the future based on 160 years of automotive industry. Interesting, as electric did not seem to be the only way to go for a sustainable future, depending on operational performance demands.

 

Next, Jeanette Nilsson and Daniel Adin from Volvo Group Truck shared their findings from an evaluation project of more than one year, in which they evaluated the major PLM vendors (Dassault Systemes / PTC / Siemens) on their out-of-the-box capabilities related to 3D product documentation and manufacturing.

They concluded that none of the vendors was able to support the full Volvo Truck complexity in an OOTB manner. It was also a good awareness project for the Volvo Trucks organization, showing that a common system for 3D geometry reduces the need for data transfers and manual data validation. Cross-functional iterations can start earlier, and more iterations can be performed. This supports a shorter lead time and improved product quality. Personally, I believe this was a rather expensive approach to create awareness for such a conclusion, pushing PLM vendors into a competitive pre-sales position for so much detail.

Future Aerospace

Kenny Swope from Boeing talked us through the potential Boeing journey towards a Model-Based Enterprise. Boeing has always challenged itself and its partners to deliver environments close to what is possible. Look at the Boeing journey and you can see that already in 2005 they were aiming for an approach that most current manufacturing enterprises cannot meet. And now they are planning their future state.

To approach the future state, Boeing aims to align its business with a single architecture for all aspects of the company. Starting with collecting capabilities (over 400 in 6 levels) and defining value streams (strategic/operational), the next step is mapping the capabilities to the value streams. Part of the process is to look at the components of a value stream to see if they could be fulfilled by a service. In this way you design your business for a service-oriented architecture, still independent from any system constraints. As Kenny states, the aerospace and defense industry has a long history and is therefore slow to change, as its culture is rooted in the organization. It will be interesting to learn from Kenny next year how much (mandatory) progress towards a model-based enterprise has been achieved and which values have been confirmed.

Gearing up for day 2

Martin Eigner took us in high-speed mode through his vision and experience of working in a bimodal approach with Aras, supporting legacy environments plus a modern federated layer to support the complexity of a digital enterprise where the system architecture is leading. I will share more details on these concepts in my next post, as during day 2 of PDT Europe both Marc Halpern and I talked about this topic, and I will combine it into a more extended story.

The last formal presentation of day one was from Nigel Shaw from Eurostep Ltd, who took us through the journey of challenges for a model-based enterprise. As there will not be a single model that defines all, it is clear that various models and derived models will exist for a product/system. Interesting was Nigel's slide showing the multiple models different disciplines can have of an airplane (1948). Similar to the famous "swing" cartoon, it illustrates that every single view can be entirely different from the purpose of the product.

The next challenge: are these models consistent and still describing the same initially specified system? On top of that, even the usage of various modeling techniques and tools will lead to differences in the system. And the final challenge on top is managing the change over the system's lifecycle. From here Nigel stepped into the need for digital threads to govern the relations between the various views per discipline and lifecycle stage, not only for the physical and the virtual twin. Comparing the needs of a model-based enterprise through its lifecycle, Nigel concluded that using PLCS as a framework provides an excellent fit to manage such complexity.

Finally, after a panel discussion, which was more a collection of opinions as the target was not necessarily to align in such a short time, it was time for the PDT dinner, always an excellent way to share thoughts and verify them with your peers.

Conclusion

Day 1 was over before you knew it, without any moment of boredom, and I hope the same is true for this post. Next week I will close my review of the PDT conference with some more details about my favorite topics.

 


At this moment there are two approaches to implementing PLM. The most common practice is item-centric; model-centric is potentially the best practice for the future. Perhaps your company is still using a method from the previous century called drawing-centric. In that case, you should read this post with even more attention, as there are opportunities to improve.

 

The characteristics of item-centric

In an item-centric approach, the leading information carrier is an item, also known as a part. The term part is sometimes confusing in an organization as it is associated with a 3D CAD part. In SAP terminology the item is called Material, which is sometimes confusing for engineering, as they consider Material the raw material. Item-centric is an approach where items are managed and handled through the whole lifecycle. In theory, an item can be a conceptual item (for early estimates), a design item (describing the engineering intent), a manufacturing item (defining how an item is consumed) and potentially a service item.

The picture below illustrates the various stages of an item-centric approach. Don’t focus on the structure, it’s an impression.

It is clear these three structures are different and can contain different item types. To read more about the details of an EBOM/MBOM approach, read these posts on my blog:

Back to item-centric. This approach means that the item is the leading authority for the product/part. The id and revision describe the unique object in the database, and the status of the item tells you the current lifecycle stage of the item. In some cases, where your company makes configurable products, the relation between two items can also define effectivity characteristics, like date effectivity, serial-number effectivity and more. From an item structure, you can find its related information in context: the item points to the correct CAD model, the assembly or related manufacturing drawings, and the specifications. In the case of an engineering item, it might point towards approved manufacturers or approved manufacturing items.

Releasing an item or a BOM means the related information in context needs to be validated and frozen too. In case your company works with drawings for manufacturing, these drawings need to be created, correct and released, which can sometimes be an issue due to last-minute changes. The above figure just gives an impression of the potential data related to an item. It is important to mention that reports, which are also considered documents, do not need an approval, as they are more a snapshot of the characteristics at the moment of generation.
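To make this more tangible, here is a minimal sketch in Python (hypothetical names and classes, not any vendor's data model) of the item-centric idea: the id/revision pair identifies the unique object, the status carries the lifecycle stage, and releasing the item checks that the related information in context is frozen too.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Document:
    doc_id: str
    doc_type: str              # e.g. "3D CAD model", "drawing", "specification"
    status: str = "In Work"

@dataclass
class Item:
    item_id: str               # id: the unique object in the database...
    revision: str              # ...together with its revision
    status: str                # current lifecycle stage: "In Work", "Released", ...
    item_type: str             # "design item", "manufacturing item", ...
    documents: List[Document] = field(default_factory=list)

    def release(self) -> None:
        # Releasing an item means the related information in context
        # must be validated and frozen too.
        open_docs = [d for d in self.documents if d.status != "Released"]
        if open_docs:
            raise ValueError(f"Cannot release {self.item_id}.{self.revision}: "
                             f"{len(open_docs)} related document(s) not released")
        self.status = "Released"

# An engineering item pointing to its related information in context
item = Item("10-4711", "A", "In Work", "design item",
            [Document("D-0001", "3D CAD model", "Released"),
             Document("D-0002", "specification", "Released")])
item.release()
print(item.status)   # Released
```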

The advantages of an item-centric approach are:

  • End-to-end traceability of information
  • Can be implemented in an evolutionary approach after PDM-ERP without organizational changes
  • It enables companies to support sharing of information
  • Sharing of information forces companies to think about data governance
    (not sure if a company wants to invest in that topic)

The main disadvantages of an item-centric approach are:

  • Related information on the item is not in context and therefore requires its own management and governance to ensure consistency
  • Related information is contained in documents, where availability and access are not always guaranteed

Still, the item-centric approach brings big benefits to a company that was working in a classical drawing-driven PDM-ERP approach. An additional remark needs to be made: not every company will benefit from an item-centric approach, as typically Engineering-to-Order companies might find this method creates too much overhead.

The characteristics of Model-Centric

A model-centric approach is considered the future approach for modern enterprises as it brings efficiency, speed, multidisciplinary collaboration and support for incremental innovation in an agile way. When talking about a model-centric approach, I do not mean a 3D CAD model-centric approach. Yes, once the product is mature, there will be a 3D model serving as a base for the physical realization of the product.

However, in the beginning, the model can still be a functional or logical model. In particular for complex products, model-based systems engineering might be the base for defining the solution. Actually, when we talk about products that interact with the outside world through software, we tend to call them systems. This explains why model-based systems engineering is becoming more and more a recommended approach to make sure the product works as expected, fulfills all the needs for the product and creates a foundation for incremental innovation without starting from scratch.

Where the model-based architecture provides a framework for all stakeholders, the 3D CAD model will be the base for a digital thread towards manufacturing. By linking parameters from the logical and functional model to the physical model, a connection is created without the need to create documents or input files for other disciplines. Adding 3D annotations to the 3D CAD model and relating manufacturing process steps to the model provides a direct connection to the manufacturing process.

The primary challenge of this future approach is to have all these data elements (requirements, functions, components, 3D design instances, manufacturing processes & resources) connected in a federated environment (the product innovation platform). Connecting, versioning and baselining are crucial for a model-centric approach. This is what initiatives like Industry 4.0 are now exploring through demonstrators and prototypes to get a coherent collection of managed data.
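As a rough sketch of what "connecting, versioning and baselining" could mean in practice (hypothetical identifiers, not any platform's API): every data element carries an id and a version, relations connect the elements across disciplines, and a baseline freezes one consistent set of versions. Change impact then becomes a query instead of a document study.

```python
# Each data element is identified by (id, version) - hypothetical names.
# Relations connect elements across disciplines and lifecycle stages.
relations = [
    (("REQ-12", 2), ("FUN-3", 1)),      # requirement fulfilled by a function
    (("FUN-3", 1), ("PART-4711", 4)),   # function realized by a component
    (("PART-4711", 4), ("OP-20", 1)),   # component linked to a process step
]

# A baseline freezes one consistent set of element versions.
baseline = {("REQ-12", 2), ("FUN-3", 1), ("PART-4711", 4), ("OP-20", 1)}

def direct_impact(elem_id: str) -> set:
    """Which elements directly depend on a changed element?"""
    return {target for source, target in relations if source[0] == elem_id}

print(direct_impact("REQ-12"))    # {('FUN-3', 1)}
print(("FUN-3", 1) in baseline)   # True - this version is part of the frozen set
```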

Once we are able to control this collection of managed data, concepts like the digital twin or even the virtual twin can be exploited, linking data to a single product instance in the field.

Also, the model can serve as the foundation for introducing incremental innovation, bringing in new features. As the model-based architecture provides direct visibility of change impact (there are no documents to study), it will be extremely lean and cost-efficient to innovate on an existing product.

Advantages of model-centric

  • End-to-end traceability of all data related to a product
  • Extremely efficient in data-handling – no overhead on data-conversions
  • Providing high-quality understanding of the product with reduced effort compared to drawing-centric or item-centric approaches
  • It is scalable to include external stakeholders directly (suppliers/customers) leading to potential different, more beneficial business models
  • Foundation for Artificial Intelligence at any lifecycle step.

Disadvantages of model-centric

  • It requires a fundamentally different way of working compared to the past. Legacy departments, legacy people, and legacy data do not fit directly into the model-centric approach. A business transformation is required, not an evolution.
  • It is all about sharing data, which requires an architecture that is built to share information across disciplines: not through a service bus but as a (federated) platform of information.
    A platform requires strong data governance, both for the data dictionary as well as for authorizations defining which discipline is leading/following.
  • There is no qualified industrial solution from any vendor yet. There is advanced technology, there are demos, but to my knowledge there is no 100% model-centric enterprise yet. We are all learning, trying to distinguish reality from the hype.

 

Conclusions

The item-centric approach is the current best practice for most PLM implementations. However, it has the disadvantage that it is not designed for a data-driven approach, the foundation of a digital enterprise. The model-centric approach is new. Some facets already exist. However, for the total solution, companies, vendors, consultants, and implementers are all learning step-by-step how it all connects. The future of model-centric is promising and crucial for survival.

Do you want to learn where we are now related to a model-centric approach?
Come to PDT2017 in Gothenburg on 18-19th October and find out more from the experts and your peers.

During my summer holidays, I read some fantastic books to relax the brain. Confessions by Jaume Cabré was an impressive novel, and I finished Yuval Noah Harari's book Sapiens.

However, to get my PLM-twisted brain back on track, I also decided to read the book "The Death of Expertise" by Tom Nichols, with the thought-provoking subtitle: "The Campaign Against Established Knowledge and Why it Matters."

I wanted to read it and understand if and how this would apply to PLM.

Tom Nichols is an American, so you understand he has many examples from his own experience to support his statement, like the anti-vaccination "experts", the climate change "hoax" and an "expert" tweeting president in his country who knows everything. Besides these obvious examples, Tom explains in a structured way how, due to broader general education and the internet, the distance between an expert and an average person has disappeared, and facts and opinions seem to be interchangeable. I talked about this phenomenon during the Product Innovation conference in Munich 2016: The PLM identity crisis.

Further into the book, Tom becomes a little grumpy and starts to complain about the internet, Google and even Wikipedia. These information resources so often provide fake or skin-deep information, not scientifically proven by experts. It reminded me of a conference that I attended in the early nineties of the previous century. An engineering society had organized this conference to discuss the issue that finite element analysis was becoming more and more available to laymen. The affordable simulation software would be used by untrained engineers, and they would make the wrong decisions. Constructions would fall down, machines would fail. Looking back now, we can see the liberation of finite element analysis has led to more usage of simulation technology, providing better products, and when really needed, experts are still involved.

I have the same opinion about the internet, Google, and Wikipedia. They rapidly provide information. Still, you need to do fact-checking and look at multiple sources, even if you have already found the answer that you liked. Usually, when I do my "research" using the internet, I try to find different sources with different opinions and, if possible, also from various countries. What you will discover is that, when using the internet, there is often detailed information, but not in the headlines of these pages. To get down to the details, we will need experts for certain cases, but we cannot turn the clock back to the previous century.

What about PLM Expertise?

In the case of PLM, it is hard to find real expertise. Although PLM is recognized as a business strategy / a domain / an infrastructure, PLM has so many faces depending on the industry and its application. It will be hard to find an expert who understands it all, and I assume headhunters can confirm this. A search for "PLM Consultant" on LinkedIn gives me almost 4000 hits, and when searching for "PLM Expert," this number is reduced to fewer than 200. With only one source of information (LinkedIn), these figures do not really give an in-depth result (as expected!)

However, what is a PLM expert? Recently I wrote a post sharing the observation that a lot of PLM product- or IT-focused discussions miss the point of education (see PLM for Small and Medium Enterprises – It is not the software). In this post, I referred to an initiative from John Stark striving for the recognition of the PLM professional. You can read John's follow-up on this activity here: How strong is the support for Professional PLM? Would a PLM Professional bring expertise?

I believe when a company understands the need for PLM, it has to build this knowledge internally. Building knowledge is a challenge for small and medium enterprises. It is a long-term investment contributing to the viability of the company. Support from a PLM professional can help. However, like the job of a teacher, it is about the skill set (subjects, experience) and the motivational power of such a person. A certificate alone won't help to select a qualified person.

Conclusion

We still need PLM expertise, and it takes time to build it. Expertise is something different from an (internet) opinion. When gaining PLM expertise, use the internet and other resources wisely. Do not go for the headlines of an internet page. Go deeper than the marketing pages of PLM-related companies (vendors/implementers). Take time and hire experts to help you, not to release you from your responsibility to collect the expertise.

 

Note: If you want to meet PLM experts and get a vendor-independent taste of PLM, join me at PDT Europe 2017 on 18-19 October in Gothenburg. The theme of the conference: Continuous transformation of PLM to support the Lifecycle Model-Based Enterprise. The conference is preceded on 17th October by CIMdata's PLM Roadmap Europe 2017. Looking forward to meeting you there!

 

 

Recently I connected with a fellow countryman, Flip, through LinkedIn, and we had a small dialogue related to PLM. Flip describes himself as a millennial thinking out loud about PLM and shared some of his thoughts trying to define "the job of PLM." Instead of keeping it a Dutch dialogue, I would like to open the dialogue to all (millennials), as we need a new generation of PLM consultants.

Point 1

(Flip) You cannot automate design activities easily, but the rest you can. Isn't PLM an evolution of 3D design tooling (and with that the next step in design theory)?

You are right. Historically, PLM originated from managing 3D design in a collaborative manner, although at that time we would call it cPDM (Collaborative Product Data Management). PDM was very design-focused. However, PDM also supported the connection to an Engineering Bill of Materials (EBOM) and connected engineering change processes (Engineering Change Request / Engineering Change Order – read more: ECR/ECO for Dummies).

PTC's Windchill was the first modern cPDM software that still exists. At the same time, Dassault Systemes and Siemens extended the support for design towards manufacturing planning and execution, introducing the term PLM (Product Lifecycle Management). In the following years, PLM systems started to support the full go-to-market lifecycle, as the figure below shows.

lifecycle

This linear go-to-market process is currently rapidly changing because PLM is changing.

The P standing for Product now represents a System (hardware & software interacting with the environment). The L standing for Lifecycle is also under change.

Support for the lifecycle of a "product" has changed in two ways. First, the lifecycle is no longer a linear process but becomes more iterative and incremental for the same "product." Secondly, the lifecycle is stretched to support the "products" in the field thanks to feedback from sensors (IoT – Internet of Things). That's why PTC now claims IoT is PLM. Read more: Best Practices or Next Practices.

Finally, the M for Management is under change: thanks to a data-driven approach, we should be able to (semi-)automate processes using algorithms. Favorite buzzwords here are machine learning, cobots (collaborative robots) and preventive actions thanks to data analysis & trends.

Point 2

(Flip) Storing data in a structured manner creates more complexity (you need to choose what to store). With simulation, complexity could be reduced to make meaningful (design) decisions, so PLM is about clever data hoarding?

I believe there is always a challenge with managing structured data, for two reasons. People often only create the data they require. Adding more data or a richer context is often considered "extra work" for which the department is not rewarded, or the future use of the information is simply not known to these persons. This is a typical exercise for companies now engaging in a digital transformation. (read more: The importance of accurate data)

When you talk about simulation, I immediately think of the current trend to work towards a model-based enterprise, where the model is the center of all information. And with the model, we do not only mean the 3D model but also the functional and logical models, which we can simulate. (Read more: Digital PLM requires a Model-Based Enterprise)

Point 3

(Flip) Automation of manufacturing with more and more resources requires new ways to drive manufacturing, so a team of 8 people can do the work of 80 people through a PLM system?

Here you are addressing exactly the point that initiatives like Industry 4.0, or in the Netherlands Smart Industry, are addressing. Instead of a linear, document-driven process, where at each step new versions of information need to be created, the dream is to work around a model (the model-based enterprise).

The idea is that data is flowing through the organization – digital continuity / digital thread – without conversion, and that by using algorithms and machine learning, the data is consumed and created during the manufacturing process in an automated manner. Indeed, drastically reducing the number of people involved.

I am not sure if we would still call this PLM; it is more a digital enterprise, where digital platforms interact. PLM could be considered the source for the Product Innovation Platform, but there will also be execution platforms (ERP and MES as the main sources) and customer-related platforms (CRM as a source). As vendors of all these platforms provide overlapping functionality, it will be hard to draw exact lines. The main goal for a company will be that the data is flowing and not locked into a proprietary format or system. And here we still have a lot of work to do.

Conclusion

No conclusion this time as it is an on-going dialogue. Feel free to comment or send your questions, and we can all learn from the dialogue (always better than a monologue).

Your thoughts?

My last blog post was about reasons why PLM is not simple. PLM supporting a well-planned business transformation requires business change / new ways of working. PLM is going through different stages. We are moving from drawing-centric (previous century), through BOM-centric (currently) towards model-centric (current and future). You can read the post here: PLM is not simple!

I was happy to see my blog buddy Oleg Shilovitsky chime in on this theme with his post: Who needs Simple PLM? Oleg reviewed the stakeholders around a PLM implementation – an analytical approach which could be correct if predictable human beings were involved. Since human beings are not predictable, and my focus is on the combination of PLM and human beings, here are some follow-up comments on the points Oleg made:

 

Customers (Industrial companies)

Oleg wrote:

A typical PLM customer isn't a single user. A typical PLM buyer is engineering IT organization purchasing software to solve business problem. His interest to solve business problem, but not really to make it simple. Complex software requires more people, an increased budget and can become an additional reason to highlight IT department skills and experience. End-users hate complex software these days, therefore, usability is desired, but not top priority for enterprise PLM.

My comments on this part: PLM becomes more and more an infrastructure for product information along the whole lifecycle. PLM is no longer an engineering tool provided by IT.

There are now many other stakeholders that need product data, in particular when we are moving to a digital enterprise. A model-based approach connects manufacturing and service/operations through a digital thread. It is the business demanding PLM to manage this complexity. IT will benefit from a reduction in silo applications.

 

PLM Vendors

Oleg wrote:

…most PLM vendors are far away from a desired level of simplicity. Marketing will like “simple” messages, but if you know how to sell complex software, you won’t be much interested to see “simple package” everyone can sell. However, for the last decade, PLM vendors were criticized a lot for complexity of their solutions, so they are pretty much interested how to simplify things and present it as a competitive differentiation.

 

Here we are aligned. All PLM vendors are dreaming of simplifying their software. Imagine: if you had a simple product everyone could use, you would be the market leader and profitable like crazy without a big effort, as the product is simple. Of course, this only works assuming this dream can be realized.

Some vendors believe that easy customization or configuration of the system means simplification. Others believe a simple user interface is the key differentiator. Compared to mass-consumer software products in the market, a PLM system is still a niche product, with a limited number of users working with the exact same version of the software. Combined with the particular needs (customizations) every company has ("we are different"), there will never be a simple PLM solution. Coming back to the business transformation theme: human beings are the weakest link.

 

Implementation and Service Providers

Oleg wrote:

Complex software, customization, configuration, know-hows, best practices, installation… you name it. More of these things can only lead to more services which is core business of PLM service providers. PLM industry is very much competitive, but simplicity is not a desired characteristic for PLM when it comes to service business. Guess what… customer can figure it out how to make it and stop paying for services.

Here we are totally aligned. In the past, I have been involved in potential alliances where certain service providers evaluated SmarTeam as a potential tool for their business. In particular, the major PLM service providers did not see enough value in an easy-to-configure and relatively cheap product. Cheap means no budget for a huge amount of services.

Still, the biggest problem SmarTeam had after ten years was the fact that every implementation became a unique deployment, hard to maintain and guarantee for the future, in particular when new functionality was introduced that potentially already existed as a customization. Implementation and service providers will never say NO to a customer when it comes to further customization of the system. Therefore, the customer should be in charge and own the implementation. For making strategic decisions, support can come from a PLM consultant or coach.

 

PLM Consultants

Here Oleg wrote:

Complex software can lead to good consulting revenues. It was true many years for enterprise software. Although, most of PLM consultants are trying to distant from PLM software and sell their experience “to implement the future”, simplicity is not a favorite word in consulting language. Customer will hire consulting people to figure out the future and how to transform business, but what if software is simple enough to make it happen without consultant? Good question to ask, but most of them will tell you it is not a realistic scenario. Which is most probably true today. But here is the hint – remember the time PC technicians knew how to configured jumpers on PC cards to make printer actually print something?

Here we are not aligned. Business transformations will never happen because of simple tools. People are measured and pushed to optimize their silos in the organization. A digital transformation, which creates a horizontal flow and transparency of information, will never happen through a tool. The organization needs to change, and this is always driven by a top-down strategy. PLM consultants are valuable to explain the potential future and to coach all levels of the organization. In theory, a PLM consultant's job is tool-independent. However, being completely disconnected from the existing tools might allow for dreams that can never be realized. In reality, most PLM consultants are experienced in one or more specific tools they have been implementing. The customer should be aware of that and make sure they own the PLM roadmap.

My conclusion:

Don't confuse PLM with a tool, simple or complex. All PLM tools have a common base, and depending on your industry and your company's vision, there will be a short list. However, before you touch the tools, understand your business and the transformation path you want to take. And that is not simple!

 

Your opinion?

Oleg and I can continue this debate for a long time. We would be interested in learning your view on PLM and simplicity – please tune in through the comments section below.

Happy New Year to all of you, and I am wishing you all an understandable and digital future. This year I hope to entertain you again with a mix of future trends related to PLM combined with old PLM basics. This time, one of the topics that pop up in almost every PLM implementation: numbering schemes. Do we use numbers with a meaning, so-called intelligent numbers, or can we work with insignificant numbers? And of course, the question: what is the impact of changing from meaningful numbers towards unique meaningless numbers?

Why did we create “intelligent” numbers?

Intelligent part numbers were used to help engineers and people on the shop floor, for two different reasons. In the early days, the majority of design work was based on mechanical design. Often companies had a one-to-one relation between the part and the drawing. This implied that the part number was identical to the drawing number. An intelligent part number could have the following format: A4-95-BE33K3-007.A

Of course, I invented this part number, as the format of an intelligent part number is only known to local experts. In my case, I was thinking about a part that was created in 1995, drawn on A4. Probably a bearing of the 33K3 standard (another intelligent code) and its index is 007 (checked in a numbering book). The version of the drawing (part) is A.
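As a small illustration of how much tribal knowledge such a number encodes, here is a sketch that decodes my invented format (purely hypothetical; a real decoding only exists in the heads of local experts and in numbering books):

```python
import re

# Invented format: <sheet>-<year>-<standard code>-<index>.<revision>
PATTERN = re.compile(r"^(A\d)-(\d{2})-([A-Z0-9]+)-(\d{3})\.([A-Z])$")

def decode(part_number: str) -> dict:
    match = PATTERN.match(part_number)
    if match is None:
        raise ValueError(f"Not a valid 'intelligent' number: {part_number}")
    sheet, year, std_code, index, revision = match.groups()
    return {
        "drawing sheet": sheet,      # A4
        "year": f"19{year}",         # 1995 - and what about the year 2000?
        "standard code": std_code,   # BE33K3, the bearing standard
        "index": index,              # 007, checked in a numbering book
        "revision": revision,        # A
    }

print(decode("A4-95-BE33K3-007.A"))
```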

A person working in production, assembling the product and reading the BOM, immediately knows which part to use by its number and drawing. Of course, the word "immediately" is only valid for people who have experience with using this part. And in the previous century this was not as painful as it is now. Products were not as sophisticated as they are now, and variation in products was limited.

Later, when information became digital, intelligent numbers were also used by engineering to classify their parts. The classification digits would assist the engineer in finding similar parts in a drawing directory or drawing list.

And if the world had not changed, there would be still intelligent part numbers.

Why no more intelligent part numbers?

There are several reasons why you would not use intelligent part numbers anymore.

  1. An intelligent number scheme works in a perfect world where nothing is changing. In real life, companies merge with other companies, and then the question comes up: do we introduce a new numbering scheme, or is one of the existing schemes going to be the perfect scheme for the future? If this has happened a few times, a company might think: do we have to go through this again and again? Especially as topic #2 has probably also occurred.
  2. The numbering scheme does not support current products and complexity anymore. Products change from mechanical towards systems, containing electronic components and embedded software. The original numbering system never catered for that. Is there an overarching numbering standard? It is getting complicated; perhaps we can change? And here #3 comes in.
  3. As we are now able to store information in a digital manner, we are able to link to this complex part number a few descriptive attributes that help us identify the component. Here the number becomes less important, still serving as access to the unique metadata. Consider it as a bar code on a product. Nobody reads the bar code without a device anymore, and the device connected to an information system will provide the right information. This brings us to the last point, #4.
  4. In a digital enterprise, where data is flowing between systems, we need unique identifiers to connect datasets between systems. The most obvious example is the part master data. Related to a unique ID, you will find in the PDM or PLM system the attributes relevant for overall identification (Description, Revision, Status, Classification) and further attributes relevant for engineering (weight, material, volume, dimensions).
    In the ERP system, you will find a dataset with the same ID and master attributes. However, here they are extended with attributes related to logistics and finance. The unique identifier guarantees that data is connected in the correct manner and that information can flow between systems without human interpretation or human processing time – see the sketch after this list.
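A minimal sketch of point 4 (hypothetical attribute names): both systems store their dataset under the same meaningless unique ID, so the datasets connect without anyone having to interpret an "intelligent" number.

```python
import uuid

# The unique, meaningless identifier shared by all systems
part_id = str(uuid.uuid4())

# PLM dataset: master attributes plus engineering attributes
plm_record = {
    "id": part_id,
    "description": "Deep groove ball bearing",
    "revision": "A",
    "status": "Released",
    "weight_kg": 0.12,          # engineering attributes
    "material": "Chrome steel",
}

# ERP dataset: same ID and master attributes,
# extended with logistics and finance attributes
erp_record = {
    "id": part_id,
    "description": "Deep groove ball bearing",
    "revision": "A",
    "status": "Released",
    "lead_time_days": 14,       # logistics / finance attributes
    "standard_cost_eur": 3.85,
}

# The shared ID guarantees the datasets are connected correctly -
# no human interpretation of the number is needed.
assert plm_record["id"] == erp_record["id"]
```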

And this is one of the big benefits of a digital enterprise: reducing overhead in data handling, often reducing the cost of data handling by 50% or more (people / customizations).

 

What to do now in your company?

There is no business justification to start renumbering parts just for future purposes. You need a business reason. Otherwise, it will only increase costs and create a potential for migration errors. Moving to meaningless part numbers is best done at a moment when change is required anyway, for example, when you implement a new PLM system or when your company merges with another company. At these moments, part numbering should be considered with the future in mind.

And the future is no longer about memorizing part classifications and numbers, even if you are from the generation that used to structure and manage everything inside your brain. Future businesses rely on digitally connected information, where a person, based on machine interpretation of a unique ID, will get the relevant and meaningful data. Augmented reality (picture above) is becoming more and more available. It is now about human beings getting ready for a modern future.

 

Conclusion

Intelligent part numbers are a best practice from the previous century. Start to think digital and connected, and try to reduce the dependency on understanding the part number in all your business activities. Move towards providing the relevant data for a user. This can be an evolution smoothing a future PLM implementation step.

 

Looking forward to discussing this topic and many other PLM-related practices with you face to face during the Product Innovation conference in Munich. I will talk about the PLM identity change and lead a focus group session about PLM and ERP integration: looking from the high level and working in the real world – the challenge of every PLM implementation.

In this post, observations from the PDT 2015 conference, which took place in the IVA Conference Center, part of the Royal Swedish Academy of Engineering Sciences in Stockholm.

The conference was hosted by Eurostep, supported by CIMdata, Airbus, Siemens Energy and Volvo AB.

For me, the PDT conference is interesting because there is a focus on architecture and standards, flavored with complementary inspiring presentations. This year there were approximately 110 participants from 12 countries, coming from different industries, listening to 25 presentations spread over two days.

Some highlights

Peter Bilello from CIMdata kicked off the conference with his presentation: The Product Innovation Platform: What's Missing.

Peter explained how the joint vision of CIMdata, Gartner and IDC related to a product innovation platform is growing.

The platform concept is bringing PLM to the enterprise level as a critical component to support innovation. The main challenge is to make the complex simple – easier said than done, but I agree this is the real problem of all the software vendors.

Peter showed an interesting graph based on a survey done by CIMdata, showing two trends.

  • The software and technology capabilities are closing the gap with the vision more and more (a dream can come true)
  • The gap between the implemented capabilities and the technically possible capabilities is growing too. Of course, there is a difference between the leaders and followers.

Peter described the three success factors determining if a platform can be successful:

  • Connection: how easy is it for others to connect and plug into the platform to participate as part of the platform? Translated to capabilities, this requires the platform to support open standards to connect external data sources, as you do not want to build new interfaces for every external source. Also, the platform provider should provide an integration API with a low entry barrier to create gravity (next point)
  • Gravity: how well does the platform attract participants, both producers and consumers? Besides flexible and targeted user interfaces, there must be an infrastructure that allows companies to model the environment in such a manner that it supports the experts creating the data, but also supports the consumers of data, who are not able to navigate through details and want a consumer-friendly environment.
  • Flow: how well does the platform support the exchange and co-creation of value? The smartphone platforms are extremely simple compared to a business platform, as the dimensions of lifecycle status and versioning are not there. A business platform needs support for versioning and status, combined with relating the information in the right context. Here I would say only the classical PLM vendors have in-depth experience with that.

Having read these three bullet points and taking existing enterprise software vendors for PLM, ERP, and other “platforms” in mind, you see there is still a way to go before we have a “real” platform available.

According to Peter, companies should start with anchoring the vision for a business innovation platform in their strategic roadmap. It will be an incremental journey anyway. How clear the vision is connected to business execution in reality differentiates leaders and followers.


 

Next, Marc Halpern from Gartner elaborated on enabling Product Innovation Platforms. Marc started by saying that the platform concept is still the process of optimizing PLM.

Marc explained the functional layers making up a product innovation platform, see below

 

Gartner-platform layering

According to Marc, in 2017 the major design, PLM and business suite vendors will all offer product innovation platforms, where certain industries are more likely to implement product innovation platforms faster than others.

Marc stressed that moving to a business innovation platform is a long, but staged, journey. Each stage of the journey can bring significant value.

Gartner has a 5-step maturity model based on the readiness of the organization. Moving from reactive, via repeatable and integrating, towards collaborating and ultimately orchestrating, companies become business-ready first for PDM, next for PLM, and finally for the Product Innovation Platform. You cannot skip one of these steps, according to Marc. I agree: PLM implementations in the past failed because the company was dreaming that the PLM system would solve the business readiness of the organization.

Marc ended with a case study and the conclusions were not rocket science.

The importance of change management, management understanding and commitment, and joint involvement of business and IT. A known best practice; still, we fail in many situations to act accordingly due to underestimation of the effort. See also my recent blog post: The importance of change management for PLM.

The next session, from Camilla Wirseen, was a real revelation. Her presentation: We are all Peepoople – innovation from the bottom of the pyramid.

She described how Anders Wilhelmson, originally a professor in architecture, focused on solving a big, global problem affecting 2.5 billion people in the world. These 2.5 billion persons, the poorest of the world, lack sanitation, which results in a high death rate for children (every 15 seconds a child dies because of contaminated water). The lack of safe places for sanitation also leads to girls dropping out of school and women and children being at risk of rape when going to toilet places.

The solution is a bag, made of high-performance biodegradable plastics, combined with chemicals already in the bag that process the feces, killing potential diseases and making the content available as fertilizer for the agricultural industry.

The plastic bag might not be new, but adding the circular possibilities to it makes it a unique approach, creating a business model providing collection and reselling of the content. For the poorest, every cent they can earn makes a difference.

peepoople statement

Currently, in initial projects, the Peepoo system has proven its value: over 95% user acceptance. It is the establishment that does not want to introduce Peepoo on a larger scale. Apparently they never realized the problems with sanitation themselves.

Peepoo is scaling up and helping the bottom of our society. And the crazy fact is that it was not invented by engineers but by an architect. This challenges everyone to see where you can contribute to a better world. Have a look at peepoople.com – innovation with an enormous impact!

Next, Volvo Cars and Volvo Trucks presented similar challenges: how to share product data in external collaboration. The challenge of Volvo Cars is that it has gone through different ownerships, requiring a more and more flexible infrastructure to share data. It is not about pushing data to a supplier anymore; it is about integrating partners, where you have to share a particular part of your IP with the partner. And where the homegrown KPD system works well for internal execution, it was never designed for partner sharing and collaboration. Volvo Cars implemented a Shared Technology Control application outside the firewall based on Share-A-space, where inside and outside data is mapped and connected – a pragmatic approach bringing direct benefits.


Concluding from the Volvo sessions: apparently it's hard to extend an existing system or infrastructure for secure collaboration with an external partner, given the complexity of access rights, different naming conventions, etc. Instead, it is more pragmatic to have an intermediate system in the middle, like Share-A-space, that connects both worlds. The big advantage of Share-A-space is that the platform is based on the ISO 10303 (PLCS) standard and, therefore, has one of the characteristics of a real platform: openness based on standards.

Jonas Hammerberg from the Awesome Group closed day one with an inspiring and eye-opening presentation: Make PLM – The Why and How with Gamification FUN.

Jonas started by describing the behavioral drivers new generations have, based on immediate feedback: the feelings of achievement, pride and status from being in a leading environment, combined with the feelings of being in a group: friendship, trust, and love.

Current organizations are not addressing these different behaviors, which leads to disengagement at the office / work floor, as Jonas showed from a survey held in Sweden – see figure. The intrinsic motivation is missing – one of the topics that concerns me the most when seeing current PLM implementations.

engagement

The Awesome Group has developed apps and plug-ins for existing software (office and PLM) that bring the feelings of autonomy, mastery and purpose to the individual performing in teams: direct feedback, stimulating team and individual performance as part of the job.

By doing so, the organization also gets feedback on the behavior, activity, collaboration and knowledge sharing of individuals and how this relates to their performance. An interesting concept to be implemented in situations where gamification makes sense.

Owe Lind and Magnus Lidström from Scania talked about their Remote Diagnostics approach, where diagnostic readings can be received from a vehicle through a mobile phone network, either to support preventive maintenance or actual diagnostics on the road and provide support.

Interestingly, Owe and Magnus did not use the word IoT (Internet of Things) at all, the hype related to these capabilities. Have a look here on YouTube.

There was no chance to fall asleep after lunch, when Robin Teigland from the Stockholm School of Economics took us in a whirlwind through several trends under the title: The Third Revolution – exploring new forms of value creation through doing more with less.

The decomposition of traditional business into smaller and much faster communities undermines traditional markets. Concepts like Uber and Bitcoin are also becoming a serious threat. Business is changing as a result of connectivity and communities, leading to more and more networks of skills, bringing together knowledge to design a car (Local Motors) or to raise funding (Kickstarter). And it is all about sharing knowledge instead of keeping it inside – sharing creates the momentum in the world. You can look at Robin's presentation(s) on Slideshare here.


All very positive trends for the future; however, a big threat to the currently established companies. Robin named it the Third Revolution, which is in line with what we are discussing in our PLM world, although some of us even call it the Fourth Revolution (Industry 4.0).


Professor Martin Eigner from the Technical University of Kaiserslautern brought us back to reality in his presentation: Industry 4.0 or Industrial Internet: What is the impact for PLM?

Martin stood at the base of what we call PLM, and already for several years he has been explaining to us that the classical definition of PLM is too narrow. More and more we are developing systems instead of products. Therefore, he prefers the abbreviation SysLM, which is more than 3 characters and therefore probably hard for the industry to accept.

PDMtoSysLM

System development and, therefore, multidisciplinary development of systems introduces a new complexity. Traditional change management for mechanical CAD (ECO/ECR) is entirely different from how software change management is handled (baselines / branches related to features). The way systems are designed requires a different methodology, where systems engineering is an integral part of the development process; see Model-Based Systems Engineering (MBSE).

Next, Martin discussed four potential IT architectures where, based on the "products" and business needs, a different balance of PLM, ALM or ERP activities is required.

Martin's final point was about the need for standards to support these architectures, bringing together OSLC, PLCS, etc.
Standards are necessary for fast and affordable integrations and data exchange.

My presentation, The Perfect Storm or a fatal Tsunami, was partly summarizing topics from the conference and, in addition, touching on two topics.

The first topic is related to big data and analytics. Many are trying to get a grip on big data with analytics. However, the real benefit of big data comes when you are able to apply algorithms to it. Gartner just made an interesting statement related to big data (below), and Marc Halpern added to this quote that there is an intrinsic need for data standards in order to apply algorithms.

Gartner algorithms

When algorithms can be used, classical processes like ECO and ECR, or managers, might become obsolete, and even a job like accountant is at risk, as predicted in an article in The Economist in February 2014 – The Onrushing Wave.

The second topic, where I believe we are still hesitating too long at the management level, is making decisions to anticipate the upcoming digital wave and all of its side effects. We see a huge wave coming. If we do not mobilize the people, this wave might become a tsunami for those still at the seaside.

Conclusion: PDT2015 was an inspiring, well-balanced conference with an excellent opportunity to network with all the people attending. For those interested in the details of the PLM future and standards, an ideal opportunity to get up to date. And next the challenge: make it happen at your company!

…if you reached this point, my compliments for your persistence in reading it all. Too long for a blog post, and even here I had to strip.

 

In my previous post describing the various facets of the EBOM, I mentioned classification several times as an important topic related to the PLM data model. Classification is crucial to support people in reusing information, and, in addition, there are business processes that are only relevant for a particular class of information, so it is not only related to search/reuse support.

In 2008, I wrote a post about classification, you can read it here. Meanwhile, the world has moved on, and I believe more modern classification methods exist.

Why classification?

First of all, classification is used to structure information and to support retrieval of the information at a later moment, either for reuse or for reference later in the product lifecycle. Related to reuse, companies can save significant money when parts are reused. It is not only the design time or sourcing time that is reduced. Additional benefits are lower risks for errors (fewer discoveries), reduced process and approval time (human overhead), reduced stock (if applicable), more volume discount (if applicable) and reduced End-Of-Life handling.

An interesting discussion about reuse, started by Joe Barkai, can also be found on LinkedIn here, including interesting comments.

Classification can also be used to control access to certain information (mainly document classification), or to make sure certain processes are followed, e.g. export control, hazardous materials, budget approvals, etc. Although I will speak mainly about part classification in this post, classification can be used for any type of information in the PLM data model.

Classification standards

Depending on the industry you are working in, there are various classification standards for parts. When I worked in the German-speaking countries (the DACH-Länder), the most discussed classification at that time was DIN 4000 (Sachmerkmal-Liste), a must-have standard for many of the small and medium-sized manufacturing companies. The DIN 4000 standard had a predefined part hierarchy and did not describe the necessary properties per class. I did not encounter a similar standard in other countries at that time.

Another very generic classification I have seen is the UNSPSC standard, again a hierarchical classification supporting everything in the universe, but with no definition of attributes.

Other classification standards, like ISO 13399, RosettaNet, ISO 15926 and IFC, exist to support collaboration and/or the supply chain, for when you want to exchange data with other disciplines or partners. The advantage of a standard definition (with attributes) is that you can exchange data with less human processing (saving labor costs and time – the benefit of a digital enterprise).

I will not go deeper into the various standards here, as I am not the expert for all of them. Every industry has its own classification standards: a hierarchical standard and, if more advanced, a hierarchy also supported by attributes related to each class. But let's go into the data model part.

Classification and data model

The first lesson I learned when implementing PLM was that you should not build your classification hard-coded into the PLM data model. When working with SmarTeam, it was very easy to define part classes and attributes to inherit. Some customers had more than 300 classes represented in their data model just for parts. You can imagine that it looks nice in a demo; however, when it comes to reality, a hard-coded classification becomes a pain in the model – one of the bad examples from the past.

1 – First of all, classification should be dynamic and easy to extend.

2 – The second problem with a hard-coded classification is that once a part is defined for the first time, the information object has a fixed class. Later changes need a lot of work (relinking of information / approval processes for the new information).

3 – Finally, the third point against a hard-coded classification is that it is likely that parts will be classified according to different classifications at the same time. The image below shows such a multiple classification.

[Image: a part carrying multiple classifications at the same time]

So the best approach is to have a generic part definition in your data model and perhaps a few subtypes. Companies still tend to differentiate between hardware (mechanical/electrical) parts and software parts.

Next, a part should be assigned to at least one class, and the assignment to this class brings more attributes to the part. Most of the PLM systems that support classification have the ability to navigate through a class hierarchy and find similar parts.

When parts are relevant for ERP, they might belong to a manufacturing parts class, which adds the particular attributes required for a smooth PLM – ERP link. Manufacturing part types can be used as templates, to be completed in ERP.
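To make this more tangible, below is a minimal sketch in Python of such a data model: one generic Part, with classes assigned dynamically instead of hard-coded, where each class assignment brings extra attributes and a part can carry several classifications at the same time. All names are hypothetical and not taken from any specific PLM system.

```python
# Minimal sketch (hypothetical names) of a generic Part with dynamic
# classification, as opposed to hard-coded subclasses.

class PartClass:
    """A node in a classification hierarchy; defines the extra attributes."""
    def __init__(self, name, attributes, parent=None):
        self.name = name
        self.attributes = attributes          # attribute names this class adds
        self.parent = parent

    def all_attributes(self):
        # inherit attributes from parent classes in the hierarchy
        inherited = self.parent.all_attributes() if self.parent else []
        return inherited + self.attributes


class Part:
    """A generic part; classes are assigned, not hard-coded."""
    def __init__(self, number, description):
        self.number = number
        self.description = description
        self.classifications = {}             # class name -> attribute values

    def classify(self, part_class, **values):
        # assigning a class brings its attributes to the part; a part may
        # carry several classifications at the same time
        expected = part_class.all_attributes()
        unknown = set(values) - set(expected)
        if unknown:
            raise ValueError(f"Unknown attributes for {part_class.name}: {unknown}")
        self.classifications[part_class.name] = values


# usage: one part, two classifications (a design view and a sourcing view)
fasteners = PartClass("Fastener", ["thread", "length_mm"])
purchased = PartClass("PurchasedPart", ["preferred_vendor"])

bolt = Part("100234", "Hex bolt M8x40")
bolt.classify(fasteners, thread="M8", length_mm=40)
bolt.classify(purchased, preferred_vendor="ACME")
```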

This concept is also shared by Ed Lopategui, who commented on my earlier post about EBOM Part types. Ed states:

Think part of the challenge moving forward is we’ve always handled these as parts under different methodologies, which requires specific data structures for each, etc. The next gen take on all this needs to be more malleable perhaps. So there are just parts. Be they service or make/buy or some combination – say a long lead functional standard part and they would acquire the properties, synchronizations, and behaviors accordingly. People have trouble picking the right bucket, and sometimes the buckets change. Let the infrastructure do the work. That would help the burden of multiple transitions, where CAD BOM to EBOM to MBOM to SBOM eventually ends up in a chain of confusion.

I fully agree with his statement and consider it the future trend of modern PLM: shared data that is enriched by different usage throughout the lifecycle.

Why don’t we classify all data in PLM?

There are two challenges for classification in general.

  • The first one is that the value of classification only becomes visible in the long term. I have seen several young companies that were only focusing on engineering: no metadata in the file properties, no part-centric data management structure, and several years later they face a lack of visibility into what has been done in the past. Only if one of the engineers remembers a similar situation is there a chance of reuse.
  • The second challenge is that, through a merger or acquisition, the company suddenly has to manage two classifications. If the data model was clean (no hard-coded subclasses), there is hope of merging the information. Otherwise, it might become a painful activity to discover similarities.

SO THINK AHEAD, EVEN IF YOU DO NOT SEE THE NEED NOW!

Modern search based applications

There are ways to improve classification and reuse by using search-based applications, which can index archives and try to find similarity in properties/attributes. Again, if the engineers never filled in the properties of the CAD model, there is little to nothing to recover, as I experienced in a customer situation. My US PLM peer, Dick Bourke, wrote several articles about search-based applications and classification for engineering.com, which are interesting to read if you want to learn more: Useful Search Applications for Finding Engineering Data.
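As a conceptual illustration only – real search-based applications use full-text indexing, ranking and geometric similarity – the sketch below scores indexed parts by overlapping attribute values. It also shows the point above: if the properties were never filled in, there is simply nothing to match on.

```python
# Toy illustration of similarity matching over part attributes.
def similarity(query: dict, candidate: dict) -> float:
    if not query:
        return 0.0                       # nothing filled in, nothing to match
    keys = set(query) & set(candidate)
    matches = sum(1 for k in keys if query[k] == candidate[k])
    return matches / len(query)

index = [
    {"number": "100234", "thread": "M8", "length_mm": 40, "material": "steel"},
    {"number": "100567", "thread": "M8", "length_mm": 35, "material": "steel"},
]
query = {"thread": "M8", "length_mm": 40}
best = max(index, key=lambda part: similarity(query, part))
print(best["number"])                    # -> 100234
```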

So much to discuss on this topic; however, I have reached my 1000 words again.

Conclusion

Classification brings benefits for reuse and discovery of information, although the benefits are long-term. Think long-term too when you define classifications. Keep the data model simple and add attribute groups to parts based on functional classifications. This enables a data-driven PLM implementation where the power is in the attributes, no longer in the part number. In the future, search-based applications will offer a quick start to classify and structure data.

 

Someone notified me that not everyone subscribed to my blog will necessarily read my posts on LinkedIn. Therefore, in the upcoming weeks I will repost some of my more business-oriented posts from LinkedIn here too. This post was from July 3rd and is an introduction to all the methodology posts I am currently publishing.


The importance of a (PLM) data model

What makes it so hard to implement PLM in a correct manner, and why is this often a mission impossible? I have been asking myself this question again and again for the past ten years. For sure, a lot has to do with the culture and legacy every organization has. But imagine a company could start from scratch with PLM. How would it implement PLM nowadays?

My conclusion for both situations is that it all leads back to a correct (PLM) data model, allowing companies to store their data in an object-oriented manner, reflecting the behavior the information objects have and the way they mature through their information lifecycle. If you make compromises here, it has an effect on your implementation, on the way processes are supported out-of-the-box by a PLM system, and on how information can be shared with other enterprise systems, in particular ERP. PLM is written between parentheses as I believe in the future we will not talk about PLM or ERP separately anymore – we will talk business.

Let me illustrate this academic statement.

A mid-market example

When I worked with SmarTeam in the nineties, the system was designed more as a PDM system than a PLM system. The principal objects were Projects, Documents, and Items. The Documents had a sub-grouping into Office documents and CAD documents. And the system had a single lifecycle, which was very basic and designed for documents. Thanks to the flexibility of the system, you could quickly implement a satisfactory environment for the engineering department. Problems (and customizations) came when you wanted to connect the data to the other departments in the company.

The sales and marketing department defines and sells products. Products were not part of the initial data model, so people misused the Project object for that. To connect to manufacturing, a BOM (Bill of Material) was needed. As the connected 3D CAD system generated a structure while saving the assemblies, people started to consider this structure the EBOM. This might work if your projects are mechanical only.

However, a Document is not the same as a Part. A Document has a completely different behavior than a Part. Documents have continuous iterations, with a check-in/check-out mechanism, whereas the Part definition remains unchanged and meanwhile gains a higher maturity.

The correct approach is to have an EBOM Part structure, where Parts connect to Documents. And yes, Documents can also have a structure, but it is not a BOM. SmarTeam implemented this around 2004. Meanwhile, a lot of companies had implemented their own custom solution for the EBOM through customization, not matching this approach. This created a first level of legacy.

When SmarTeam implemented Part behavior, it became possible to create a multidisciplinary EBOM, and the next logical step was, of course, to connect the data to the ERP system. At that time, most implementations pushed the EBOM to the ERP system and let it live there further. ERP was the enterprise tool, SmarTeam the engineering tool. The information became disconnected in an IT manner. Applying changes and defining a manufacturing BOM was done manually in the ERP system, and could be done by (experienced) people that do not make mistakes.

The next challenge comes when you want to automate the connection to ERP. In that case, it becomes apparent that the EBOM and MBOM should reside in the same system (see an old and still actual post with comments here: Where is the MBOM?) – in one system to manage changes and to be able to implement these changes quickly without too much human intervention. And as the EBOM is usually created in the PLM system, the (commercial/emotional) PLM-ERP battle started. "Who owns the part definition?" and "Who owns the MBOM definition?" became the topic of many PLM implementations. The real questions should be: "Who is responsible for which attributes of the Part?" and "Who is responsible for which part of the MBOM definition?" – as data should be shared, not owned.
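A minimal sketch of what "shared, not owned" could look like in practice: responsibility is assigned per attribute, not per object. The attribute names and the two-system split are my own illustrative assumptions.

```python
# Sketch: each Part attribute has a responsible system, instead of one
# system owning the whole Part definition.
ATTRIBUTE_RESPONSIBILITY = {
    "number":        "PLM",   # created with the part definition
    "description":   "PLM",
    "material_spec": "PLM",
    "standard_cost": "ERP",   # enriched downstream in manufacturing
    "safety_stock":  "ERP",
}

def can_update(system: str, attribute: str) -> bool:
    return ATTRIBUTE_RESPONSIBILITY.get(attribute) == system

assert can_update("ERP", "standard_cost")
assert not can_update("ERP", "material_spec")
```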

The SmarTeam evolution shows how a changing scope and an incomplete/incorrect data model lead to costly rework when aligning to the mainstream. And this is happening with many implementations of other PLM systems too, in particular when the path is to grow from PDM to PLM. An important question remains: what is going to be mainstream in the future? More on that in my conclusion.

A complex enterprise example

In recent years, I have been involved in several PLM discussions with large enterprises. These enterprises suffer from their legacy. Often the original data management was not defined in an object-oriented manner, and the implementation has been expanding with connected and disconnected systems like a big spaghetti bowl.

The main message most of the time is:

"Don't touch the system, as it works for us."

The underlying message is:

"We would love to change to a modern approach, but we understand it will be a painful exercise, and how will it impact the profitability and execution of our company?"

The challenge these companies have is that it is extremely hard to imagine the potential to-be situation and how it is affected by the legacy. In a project that I participated in several years ago, the company was migrating from a mainframe database towards a standard object-oriented (PLM) data model. The biggest pain was in mapping the data to the object-oriented data model. As the original mainframe database had all kinds of tables with flags and mixed Part & Document data, it was almost impossible to make a 100% conversion. The other challenge was that knowledge of the old system had evaporated. The end result was a customized PLM data model, closer to current reality, still containing legacy "tricks" to assure compatibility.

All these enterprises will at a particular time have to go through such a painful exercise. When is the best moment? When business is booming, nobody wants to slow down. When business is in a lower gear, costs and investments are minimized to keep the old engine running efficiently. I believe the latter is the best moment to invest in making the transition, if you believe your business will still exist 10 years from now.

Back to the data model.

Businesses today should have a high-level object-oriented data model, describing the main information objects and their behavior in the organization. The term Master Data Management is related to this. But how many companies have the time and skills to implement a future-oriented data model? And the data model must stay flexible for the future.

Compare it to your brain, which also stores information by its behavior; by learning, the brain understands what is logically related. The internal data model gets enriched while we learn.

Once you have a business data model, you are able to implement processes on top of it. Processes can change over time; therefore, avoid hard-coding specific processes in your enterprise systems. Like the brain, we can change our behavior (applying new processes), yet it will still be based on the data model stored inside our brain.

Conclusion:

A lot of enterprise PLM implementations are in a challenging situation due to legacy or an incomplete understanding and availability of an enterprise data model. Therefore, cross-department implementations and connections to other systems are considered a battle between systems and their proprietary capabilities.


The future will be based on business platforms, and realizing this takes years – imagine openness and the usage of data standards. An interesting conference to attend in the near future for this purpose is the PDT2015 conference in Stockholm.

Meanwhile, I also learned that a one-day Master Data Management workshop will be held before the PDT2015 conference starts, on the 12th of October. A good opportunity to deep-dive for three days!


 

In my earlier posts, I described the generic PLM data model and practices related to Products, BOMs and, recently, the EBOM and (CAD) Documents. This time I want to elaborate a little bit more on the various EBOM characteristics.

 

The EBOM is the place where engineering teams collaborate and define the product. A released EBOM is supposed to give the full engineering specification of how a product should behave, including material quality and tolerances. This makes it different from the MBOM, which contains the specification of how this product should be manufactured, based on exact components and materials.

Depending on the type of product, there are several EBOM best practices, which I will discuss here (briefly) in alphabetical order:

EBOM & Buy Part

Usually, an EBOM consists of Make and Buy parts – an attribute on the EBOM part indicates the preferred approach. Make parts are typically sourced from qualified suppliers, whereas Buy parts can be more generic and based on qualified vendors. Engineering specifies which are the approved Manufacturers for the part (AML), and purchasing decides which are the approved Vendors for this part (AVL). In general, Buy parts do not need an engineering effort every time the part is used in a product.
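A tiny sketch, with illustrative field names, of modeling this through attributes instead of subclasses: Make/Buy is a simple attribute, and the AML and AVL are two separate lists maintained by different departments.

```python
# Illustrative only: Make/Buy as an attribute, AML owned by engineering,
# AVL owned by purchasing - no Part subclasses needed.
part = {
    "number": "100234",
    "make_or_buy": "Buy",                         # preferred approach
    "aml": ["Manufacturer A", "Manufacturer B"],  # approved manufacturers (engineering)
    "avl": ["Vendor X", "Vendor Y"],              # approved vendors (purchasing)
}
print(part["make_or_buy"], part["avl"])
```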

EBOM & CAD related

My previous post already discussed some of the points related to the EBOM and CAD Documents. Here I want to extend this a little, addressing the close relation between MCAD parts and EBOM parts. In particular in the Engineering To Order industry there is, most of the time, no standard product to relate to. In that case, Mechanical CAD can be the driver for the EBOM definition, and usually EBOM Make parts are designed uniquely. The challenge is to identify similar parts that might exist and reuse them. Classification (an old post here) and geometric search capabilities support the modern engineer. I will come back to classification in a later post.

EBOM – Configuration Item

In case a product is designed for mass production over a longer lifetime, it becomes necessary to manage the product configuration over time: how is the product defined today? To avoid the need to manage a complete EBOM for each product variant, the EBOM can be structured with Options and Variants. In that case, having Configuration Items in the EBOM is crucial. The Configuration Item is the top part that is versioned and controlled. Parts below the Configuration Item, mostly standard parts, do not impact the version of the Configuration Item as long as the Form-Fit-Function of the Configuration Item does not change. Configuration Management is a topic of its own, and some people believe PLM systems were invented to support Configuration Management.
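Below is a small sketch of this rule with hypothetical names: the Configuration Item's revision only increments when its Form-Fit-Function signature changes; replacing interchangeable parts underneath leaves the revision untouched.

```python
# Sketch: revision control driven by Form-Fit-Function, not by every change.
class ConfigurationItem:
    def __init__(self, number, fff_signature):
        self.number = number
        self.revision = "A"
        self.fff = fff_signature    # e.g. interface dimensions, function

    def change(self, new_fff):
        # only a Form-Fit-Function change bumps the revision
        if new_fff != self.fff:
            self.fff = new_fff
            self.revision = chr(ord(self.revision) + 1)   # A -> B -> C ...

ci = ConfigurationItem("CI-5001", fff_signature="mount:4xM6;output:24V")
ci.change("mount:4xM6;output:24V")   # same FFF: revision stays "A"
ci.change("mount:4xM6;output:48V")   # FFF changed: revision becomes "B"
print(ci.revision)                   # -> B
```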

EBOM – Company Standard Part

Standard Parts are often designed parts that should be used across various products or product lines. The advantage of company standard parts is that they reduce costs throughout the whole product lifecycle: less design time, less manufacturing setup time, less material sourcing effort, and potentially lower material cost thanks to higher volumes. Any EBOM part could become a Company Standard part at a certain moment, and it is recommended to use a classification related to these parts; otherwise, they will not be found again. As mentioned before, I will come back to classification.

EBOM – Functional group

Sometimes during the design of a product, several parts are logically grouped together from the design point of view, either because they are modular or because they always appear as a group of parts.

The EBOM, in that case, can contain phantom parts, which do not represent an end item. These phantom parts assist the company in understanding the impact of changing one of the individual parts in this functional group.
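The sketch below shows the common phantom behavior in a BOM explosion (a simplification; exact rules vary per system): because the phantom is only a logical grouping level, the explosion flows through it to its children.

```python
# Sketch: exploding a BOM where phantoms are grouping levels only.
def explode(bom, item, qty=1, result=None):
    result = {} if result is None else result
    for child, child_qty, phantom in bom.get(item, []):
        if phantom:
            explode(bom, child, qty * child_qty, result)   # flow through
        else:
            result[child] = result.get(child, 0) + qty * child_qty
    return result

bom = {
    "PUMP":        [("DRIVE-GROUP", 1, True), ("HOUSING", 1, False)],
    "DRIVE-GROUP": [("MOTOR", 1, False), ("COUPLING", 1, False)],
}
print(explode(bom, "PUMP"))   # -> {'MOTOR': 1, 'COUPLING': 1, 'HOUSING': 1}
```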

EBOM – Long Lead

In typical Engineering to Order or Build To Order deliveries, there are components on the critical path of the product delivery. Components with a long lead time should be identified and ordered as early as possible during the delivery process. Often the EBOM is not complete or mature enough to pass all the information through to ERP. Therefore, Long Lead items require a fast track towards ERP and a special status in the EBOM reflecting their ordering status. Long Lead items are an example where a company can benefit from a precise interaction between PLM and ERP, with various status handshakes and approvals during the delivery process.

EBOM – Make parts

Make Parts in an EBOM are usually specified by their related model and drawings. Therefore, Make Parts usually have revisions, but be aware that they do not follow the same versioning as the related model or drawing. A Make Part is in an In Work status as long as the EBOM is not released. Once the model is approved, the EBOM part can be approved or released. Often companies do not want to release the data as long as manufacturing is not completed, to make sure that the first revision comes out at the first delivery of the product.

EBOM – Materials

In many mechanical assemblies, the designer specifies materials with a particular length, for example a rubber strip or tubing/piping. When extracting the information from the 3D CAD assembly, this material instance will get a unique identifier. Here it is important that the Material Part has an attribute that describes the material specification. In the ideal data model, this is a reference to a materials library. Next, when manufacturing engineering defines the MBOM, they can decide on the material quantities to purchase for the EBOM Material.
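A minimal sketch of this split; all names and the 5% cutting-waste factor are assumptions for illustration. Engineering records the specification and cut length; manufacturing engineering derives the purchase quantity on the MBOM side.

```python
# Sketch: an EBOM material line refers to a spec in a materials library.
from dataclasses import dataclass

@dataclass
class MaterialSpec:               # entry in a materials library
    code: str                     # e.g. "EPDM-STRIP-20x3"
    purchase_unit: str            # e.g. "m"

@dataclass
class EbomMaterialLine:
    spec: MaterialSpec
    cut_length_mm: int            # the length engineering specified

line = EbomMaterialLine(MaterialSpec("EPDM-STRIP-20x3", "m"), cut_length_mm=750)

# MBOM side (illustrative): round up to purchasable meters, incl. cutting waste
purchase_qty_m = (line.cut_length_mm * 1.05) / 1000
print(f"order {purchase_qty_m:.2f} {line.spec.purchase_unit} of {line.spec.code}")
```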

EBOM – Part Number

This could be a post on its own. Do we need intelligent part numbers, or can we use randomly generated unique numbers? I have a black-and-white opinion about that: if you want to achieve a digital enterprise, you should aim for randomly generated unique numbers. This is because in a digital enterprise data is connected without human transfer. The PLM – ERP link is unambiguous. Part recognition on the shop floor can be done with labels and scanning at the workstation. There is no need for a person to remember or transfer information from one system or location by understanding the part number. The uniquely generated number makes sure every person will look at the digital metadata available online, therefore immediately seeing a potential status change or upcoming engineering change. Supporting the intelligent numbering approach allows people to work disconnected again, therefore not guaranteeing that an error-free activity takes place. People make mistakes; machines usually do not.
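A minimal sketch of the randomly generated number approach; the 10-character format is an arbitrary illustration, and a real PLM system would guarantee uniqueness centrally.

```python
# Sketch: meaningless, unique identifiers; meaning lives in connected metadata.
import uuid

def new_part_number() -> str:
    # opaque identifier; a real system would guarantee uniqueness centrally
    return uuid.uuid4().hex[:10].upper()

number = new_part_number()
metadata = {number: {"description": "Hex bolt M8x40", "status": "Released"}}
# a label / QR code on the physical part encodes the number; scanning it at the
# workstation retrieves the current metadata online, including pending changes
print(number, metadata[number]["status"])
```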

EBOM – Service Parts

It is important to identify already in the EBOM which parts need to be serviced in operation, and engineering should relate the service information to the EBOM part right away. This could be the same single part with different packaging, or it could be a service kit plus instructions linked to the part. In a PLM environment, it is important that this activity is done upfront by engineering, to avoid having to retrieve the data later and work on the service information again. A sensitive point here is that, in the classical approach, engineers are currently not measured on the benefits they deliver downstream when the products are in the field. Too many companies work in silos here.

EBOM – Standard Parts

Finally, as I have already reached the 1000 words, a short statement about EBOM standard parts. These standard parts, based on international or commercial standards, do not need a revision, and often they have a specification sheet, not necessarily a 3D model for visualization. Classification is crucial for Standard Parts, and I will write a separate post about dealing with Standard Parts, both mechanical and electrical.

Concluding: in this post we can see that the EBOM has many facets, and based on the type of EBOM part, different behavior is expected. It made me realize PLM is not as simple as I thought. In general, when defining an EBOM data model, you should try to minimize the specific classes for the EBOM part. Where possible, solve it with attributes (Make/Buy – Long Lead – Service – etc.). Use classification to store the specific attributes per part type related to the part. Classification will be my next topic, as it appears.

Feel free to jump on any of the EBOM characteristics for an extended discussion.

Note: images borrowed from the internet contain links to the original location where I found them. The context there is not always relevant for this post.
