A week ago, Martijn Dullaart and Martin Haket, both well versed in configuration management, published a post on MDUX.net and LinkedIn in which they pleaded for standardization across domains, in particular for the supporting processes between PLM and ERP – read the full article here: We need a cross-platform interface for impact analysis!
In the article, some standards with various scopes related to product information are mentioned (PDX, ISO 10303, ISO305:2017), and the article suggests that impact analysis should be done in an overarching domain, the CM domain, outside PLM. See the image on the left. I have several issues with this approach, which I will explain here.
PLM and/or CM are overarching domains
The diagram puts CAD, SW, PLM, and ERP as verticals, whereas I would state that PLM is responsible for the definition of the product, which means governing CAD and SW and publishing towards ERP for execution. You might debate whether CM is part of PLM or whether CM is a service on top of PLM.
We had this discussion before: PLM and Configuration Management – a happy marriage? One of my points was that aerospace, defense, and automotive companies actively invest in CM. In other industries and at other business sizes, CM becomes more an intention than a practice.
This is somewhat the same challenge PLM faces when it comes to full lifecycle support.
Now let’s look at impact analysis
During my personal PLM lifecycle, I have discussed with many companies how a PLM system could provide a basis for impact analysis. These scenarios were mainly based on a Where-Used analysis in the context of an engineering change, which PLM should be able to offer.
PLM would provide the information on which actual running or upcoming products are impacted by a potential change – this gives the technical answer and the impact on production.
To support the financial case, a more advanced impact analysis was required. This is often a manual process. In more advanced cases, customization is used to provide real-time information about current warehouse stock (potential scrap?) and parts/materials already on order.
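At its core, the Where-Used part of such an analysis is a traversal of the child-to-parent links in the BOM, enriched with execution data from ERP. A minimal sketch in Python – the `parent_links` and `erp_stock` data are hypothetical stand-ins for what the PLM and ERP systems would actually provide:

```python
# Hypothetical where-used traversal: given a changed part, walk the
# child -> parent links from PLM and enrich the impacted items with
# stock figures from ERP. All data here is illustrative.

# PLM side: child part -> list of parent assemblies (where-used links)
parent_links = {
    "bolt-M6": ["bracket-A", "bracket-B"],
    "bracket-A": ["chassis-1"],
    "bracket-B": ["chassis-1", "chassis-2"],
}

# ERP side: current warehouse stock per part (potential scrap on change)
erp_stock = {"bolt-M6": 12000, "bracket-A": 350, "bracket-B": 90,
             "chassis-1": 4, "chassis-2": 7}

def where_used(part, links):
    """Return every assembly that directly or indirectly uses `part`."""
    impacted, todo = set(), [part]
    while todo:
        current = todo.pop()
        for parent in links.get(current, []):
            if parent not in impacted:  # avoid revisiting shared parents
                impacted.add(parent)
                todo.append(parent)
    return impacted

def impact_report(part):
    """Combine the PLM where-used result with ERP stock figures."""
    return {item: erp_stock.get(item, 0)
            for item in sorted(where_used(part, parent_links))}

print(impact_report("bolt-M6"))
# {'bracket-A': 350, 'bracket-B': 90, 'chassis-1': 4, 'chassis-2': 7}
```

The traversal itself is trivial; the hard part in practice is that the links and the stock figures live in different systems, which is exactly where the manual work or the customization comes in.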
I could imagine some other potential impacts to analyze, for example, the marketing/sales plans, but haven’t come across these situations in my projects.
Hoping for a well-thought-out approach, I expected more from the article. Martijn and Martin wrote:
A good candidate to be used as interface between the expert domains and CM domain is, in our opinion, the CM2 impact matrix. This captures the information on an aggregate level like a part, or document or dataset. This aggregate level can be used by other expert domains to identify impact within their scope or by the CM domain to support cost estimation and implementation planning.
So I followed the link to discover and digest the CM2 impact matrix. However, the link leads to CM2 training, not to the directly useful information – the impact matrix itself. Should I get CM2-trained first to gain access to this information?
This is the same lock-in where a PLM Vendor will state:
Buy our system and use our impact analysis.
I believe every respectable PLM system has a basis for impact analysis and probably needs to be customized to include outside data. Martijn and Martin agree on that, as they write:
This interface is currently not existent in the offerings of the various vendors. If an impact matrix is available, it is to support the impact analysis within the tool of a vendor not to support impact analysis within a business. That is why Martin and I challenge the vendors in the various expert domains to come with a standard to allow businesses to perform a high-quality cross-functional impact analysis that improves the quality of decision making.
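To make the discussion concrete: such a cross-platform interface could be as simple as an agreed set of fields per impacted aggregate (a part, document, or dataset), exchanged in a neutral format. A purely illustrative sketch – the field names are my assumption, not part of CM2 or any vendor offering:

```python
# Purely illustrative: a tool-neutral "impact record" that any expert
# domain (CAD, SW, PLM, ERP) could fill in at the aggregate level.
# The field names are assumptions for the sake of the sketch, not CM2.
from dataclasses import dataclass, asdict
import json

@dataclass
class ImpactRecord:
    aggregate_id: str     # part number, document id, or dataset id
    aggregate_type: str   # "part" | "document" | "dataset"
    domain: str           # expert domain reporting the impact
    impact: str           # short classification, e.g. "rework", "scrap"
    cost_estimate: float  # rough cost for the CM domain's business case

records = [
    ImpactRecord("bracket-A", "part", "ERP", "scrap", 5250.0),
    ImpactRecord("drawing-0815", "document", "PLM", "rework", 800.0),
]

# Exchange format: plain JSON, so no vendor tool is privileged
print(json.dumps([asdict(r) for r in records], indent=2))
```

The point is not these particular fields, but that the agreement has to be made at this tool-independent level before any vendor implementation makes sense.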
Vendors coming with a standard?
In particular, challenging the vendors to come up with a standard is, in my opinion, a mission impossible.
A vendor will never come up with a standard unless THEY become the standard.
CATIA in Aerospace/Automotive, DXF in 2D mid-market CAD, IFC for the building market, and Excel for calculations are all examples where one vendor dominates the market. I do not believe that in any R&D department of a software company you will find people spontaneously working on standards.
Companies have to develop and push for standards
Standards have always been developed because there was a need to exchange information, most of the time between companies – OEMs, suppliers, partners. In the case of impact analysis, the target might be slightly different. Impact analysis mainly focuses on internal systems within the organization that is planning a change.
And this makes the push for standardization again more complicated.
Let me explain why:
First, there is ERP – the image above shows Management of Change in the SAP help environment.
In most companies, the ERP system is the major IT system, and all efforts to automate processes were targeted at solving them within ERP.
Therefore, in ERP you will find basic impact analysis capabilities, mainly related to the execution side: actual stock, planned production orders, and logistics. The rest of impact analysis is primarily a manual task.
Next, with the emergence of PLM systems, impact analysis shifted towards the planning side: Where-Used or Where-Related became capabilities related to engineering change requests. In my SmarTeam days, we developed templates for that:
Here the analysis was still a manual action: the PLM system would provide Where-Used support, and potentially a custom ERP connection would give some additional information.
Nowadays, I would state that all PLM vendors have the technical capabilities to create an impact analysis dashboard: Aras by rapid customization, Dassault Systemes by using Exalead, PTC by using Navigate, and Siemens by using Mendix. So the technology exists, but what about standards?
In the comments section of the LinkedIn post, Martijn mentions that the implemented change behavior in PLM is not exactly what he (or the CM2 methodology) would propose – the difficulty in the happy marriage between PLM and CM. See his comment here.
For me, these comments are change requests to the PLM vendors, and they will only be heard when there is a push from the outside world.
Therefore my (simplified) proposal:
- Start an Impact Analysis community outside CM2, as there are many companies that do not follow CM2 yet have their own particular ways of working. Perhaps this community already exists and lacks visibility – I am happy to learn.
- Describe the potential processes and the people involved, and collect/combine the demands – think tool-independent, as tooling is the last step.
- Publish the methodology as an open standard and have it rated by the masses. The rating will influence the software vendors in the market.
Conclusion
Asking vendors to come up with a cross-platform interface standard for impact analysis is a mission impossible. Standards appear when there is a business need, and that need has to come from the market. Impact analysis has an additional difficulty, as it is mostly a company-internal process.
3 comments
October 7, 2019 at 10:50 am
Paul Garrish
I always major on the ‘spaghetti’ in PLM as paying back when it comes to impact analysis. Creating all the links between parts, documents, tools etc, is expensive during design/definition, but when you start to impact assess a change, it becomes an invaluable aid to identifying what may be affected. Even when the detail (stock levels, price etc) lives elsewhere, having something in PLM means you can fully manage the ECR process with real objects under control.
The crucial thing, as ever, is the process and the people who have to operate it. Full impact assessment is slow and expensive – until you don’t do it and make a bad decision (wiping out millions in spares stock through a tiny production easement change for example).
Once companies buy into the payback from proper change management, the standards issue will start to sort itself out, as firms push their vendors to support it. Whilst their customers play at it, there's little incentive for vendors to take much interest.
Thanks Paul, I fully agree. In the end, the need for impact analysis depends on the effect on your business when things go wrong. This can be a challenge even for the best-known aerospace companies. Best regards, Jos
October 7, 2019 at 2:40 pm
jfvanoss
Jos,
The initial concept you presented seems quite CM-centric; a perspective that I do not share. I agree that there doesn't need to be a standard around impact analysis (there are already too many standards in a number of related areas, which means that there are no standards). I think the impact analysis problem can be broken down to a simple parent-child relationship. The thing that gets impacted becomes the parent (it could be a part with a bad feature, process, or test fixture, etc.), and then one has to conduct a scavenger hunt for all the children, grandchildren, great-grandchildren, etc. This could be accomplished in the ERP, but it would not capture the non-ERP artifacts that might be important: materials, test fixtures, processes, etc. The problem is that no one system can track this (ad-hoc) relationship. A well-conceived and implemented PLM has a better chance of success than an ERP.
Thanks Jim, indeed supporting my thoughts that impact analysis is quite different for many of us. I think we are aligned: where you say parent-child, I use the terminology Where-Used (strictly within the same object class) and Where-Related (strong dependencies to other artifacts). And in particular, these last relations are best visible in a well-implemented PLM. Best regards, Jos
November 22, 2019 at 4:23 pm
Simon Kooij
Hi Jos,
I support the statements/fundamentals in your blog,
Especially that PLM is covering CAD and SW and publishing towards ERP…
And that CM is (should and can be) part of PLM…
And in specific (complex) business situations (Automotive or aerospace probably) CM as a service on top of PLM.
In a lot of cases, it is a well-defined workflow for change management, with basic info (Where-Used/Where-Related) from PLM and additional info from other systems like ERP (stock level, activity date, costs, …).
Our goal is to keep it simple by covering Configuration management in PLM,
Meaning with the topics:
• Product definition (3D – MBSE -RFLP)
• Product variant selection (supporting CTO)
(When you have a direct connection between a sales configurator (e.g., Tacton) and PLM (with a well-defined modular product structure), then no extra configuration tool (like e.g. ConfigIT) is necessary – unless the situation is more complex and additional information (effectivity dates, …) is needed from other systems.)
• Life cycle management as: EBOM – MBOM – SBOM
• Product management (control) – Change management (NCR/ECR – CCB – ECO/MCO (CA/CO)), based on the (simplified) CMII methodology.
And later, when necessary, an improved CMII methodology that will be suitable for MBSE and digital concepts/definitions.
And yes, impact analysis can differ across the various business types/risks.
And it is sometimes improved by BI tools (Exalead for unstructured data) or geometrical compare searches, etc., besides the manual actions.
But the key is to understand your business and risks, and to define a methodology (supported in PLM) that supports the change process for the daily users, keeping the products in the right (configurable) shape.
Best Regards,
Simon
Thanks Simon, nothing to comment. Best regards Jos