In my previous post, I discovered that my header for this series is confusing. Although a future implementation of system lifecycle management (SLM/PLM) will rely on models, the most foundational change needed is a technical one to create a data-driven infrastructure for connected ways of working.
My previous article discussed the concept of the dataset, which led to interesting discussions on LinkedIn and in my personal interactions. This time, Matthias Ahrens (HELLA) again shared a relevant, though very academic, article in this context – how to harmonize company information.
For those who want to dive deeper into the concept of connected datasets, read this article: The euBusinessGraph ontology: A lightweight ontology for harmonizing basic company information.
The article illustrates that the topic is relevant for all larger enterprises (and it is not an easy topic).
This time I want to share my thoughts about the two statements from my introductory post:
A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as they cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.
A model-based approach with connected datasets
We discussed connected datasets in the previous post; now, let’s explore why models and datasets are related. In the traditional CAD-centric PLM domain, most people will associate the word model with a CAD model, or to be more precise, the 3D CAD model. However, many other types of models are used in product development, delivery and operations.
A model can be a:
Physical Model
- A smaller-scale object for the first analysis, e.g., a city or building model, an airplane model
Conceptual Model
- A conceptual model describes the entities and their relations, e.g., a Process Flow Diagram (PFD)
- A mathematical model describes a system concept using a mathematical language, e.g., weather or climate models. Modelica and MATLAB would fall in this category
- A CGI (Computer Generated Imagery) or 3D CAD model is probably the most associated model in the mind of traditional PLM practitioners
- Functional and Logical Models describing the services and components of a system are crucial in an MBSE (Model-Based Systems Engineering) approach
Operational Model
- A model providing performance analysis based on (real-time) data coming from selected data sources. It could be an operational business model or an asset performance model; even my Garmin’s training performance model is such an operational model.
The list of models above is neither exhaustive nor academically precise. Moreover, some model definitions might overlap, e.g., where would we classify software models or manufacturing models?
All models are a best-so-far approach to describing reality. Based on more accurate data from observations or measurements, the model comes closer to what happens in reality.
A model and its data
Never blame the model when there is a difference between what the model predicts and the observed reality. It is still a model. That’s why we need feedback loops from the actual physical world to the virtual world to fine-tune the model.
Part of what we call Artificial Intelligence is nothing more than applying algorithms to a model. The more accurate data available, the more “intelligent” the artificial intelligence solution will be.
By using data analysis to complement the model, the model may get better and better through self-learning. Like our human brain, it starts with understanding the world (our model) and collecting experiences (improving our model).
There are two points I would like to highlight for this paragraph:
- A model is never 100 % the same as reality – so don’t worry about deviations. There will always be a difference between the virtual prediction and the physical measurement, most of the time because reality has many more influencing parameters.
- The more qualified data we use in the model, the closer it gets to reality – so focus on accurate (and the right) data for your model. Since it is usually impossible to model a system completely, focus on the most significant data sources (a small illustrative sketch follows below).
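To make this feedback loop concrete, here is a minimal, purely illustrative Python sketch, not tied to any PLM tool; all names and numbers are invented. A simple virtual model predicts an asset temperature from its load, physical measurements flow back, and re-fitting the parameters brings the prediction closer to reality – the more qualified data, the better the model.

```python
import numpy as np

# "Virtual" side: we assume temperature rises roughly linearly with load.
# The initial parameters are a rough engineering estimate - our best-so-far model.
params = np.array([20.0, 0.5])          # [offset, slope]

def predict(load, p):
    """Model prediction: temperature = offset + slope * load."""
    return p[0] + p[1] * load

def refit(loads, measured_temps):
    """Feedback loop: re-fit the model parameters from physical measurements."""
    slope, offset = np.polyfit(loads, measured_temps, deg=1)
    return np.array([offset, slope])

# "Physical" side: measurements from sensors (synthetic here for illustration).
loads    = np.array([10.0, 20.0, 30.0, 40.0])
measured = np.array([26.1, 31.8, 38.2, 43.9])   # reality behaves like ~20 + 0.6*load

print("prediction before calibration:", predict(30.0, params))  # noticeably off
params = refit(loads, measured)                                  # learn from the data
print("prediction after calibration: ", predict(30.0, params))  # much closer to 38.2
```

The arithmetic is trivial on purpose; the point is that the model only improves because qualified measurements from the physical world flow back into the virtual model, which is exactly the loop a digital twin depends on.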
The ultimate goal: THE DIGITAL TWIN
The discussion related to data-driven approaches and the usage of models might feel abstract and complex (and it is). However, the term “digital twin” is well known and even used in board rooms.
The great benefits of a digital twin for business operations and for sustainability are promoted by many software vendors and consultancy firms.
My statement and reason for this series of blog posts: digital twins do not run on documents; you need a data-driven, model-based infrastructure to benefit efficiently from digital twin concepts.
Unfortunately, a reliable and sustainable implementation of a digital twin requires more than software – it is a learning journey to connect the right data to the right model.
It is a puzzle every company has to solve, as there is no 100 percent blueprint at this time.
Are Low Code platforms the answer?
I mentioned the importance of accurate data. Companies have different systems or even platforms managing enterprise data. The digital dream is that by combining datasets from different systems and platforms, we can provide any user with the needed information in real time. My statement from my introductory post was:
I don’t believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might again be a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora’s Box?
Let’s look into some of the messages promoted by low-code advocates:
You will have an increasingly hard time finding developers to keep up with global app development demands (reason #1 for PEGA)
This statement reminded me of the early days of SmarTeam implementations. With a Data Model Wizard, a Form Designer, and a Visual Basic COM API, you could create any kind of data management application with SmarTeam, using its built-in behaviors for document lifecycle management, item lifecycle management, and CAD integrations, combined with easy customizations.
The sky was the limit to satisfy end users. There was no need for an experienced partner or to be a skilled programmer (this was 2003+). SmarTeam was a low-code platform, the marketing department would say now.
A lot of my activities between 2003 and 2010 were related to fixing the problems caused by this flexibility, making sense (again) of customizations. I wrote about this in a 2015 post, The importance of a (PLM) data model, sharing my experiences of “fixing” issues created by too much flexibility.
Think first
The challenge is that an enthusiastic team creates a (low code) solution rapidly. Immediate success is celebrated by the people involved. However, the future impact of this solution is often forgotten – we did the job, right?
Documentation and a broader visibility are often lacking when implementing such a solution.
For example, suppose your product data is going to be consumed by another app. In that case, you need to make sure that the information you consume is accurate. Perhaps the information was valid when you created the app.
However, if your friendly co-worker has moved on to another job and someone with different data standards becomes responsible for the data you consume, the reliability might fail. So how do you guarantee its quality?
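One pragmatic way not to depend on your friendly co-worker is to make the expectation explicit as a small data contract that the consuming app checks on every exchange. Below is a minimal, hypothetical Python sketch; the field names and rules are invented for illustration, and in practice you would anchor such a contract in standardized interfaces rather than inside each individual app.

```python
from dataclasses import dataclass

@dataclass
class PartRecord:
    part_number: str
    revision: str
    mass_kg: float

def validate(raw: dict) -> PartRecord:
    """Check incoming data against the agreed contract before consuming it,
    failing loudly instead of silently propagating unreliable data."""
    missing = {"part_number", "revision", "mass_kg"} - raw.keys()
    if missing:
        raise ValueError(f"contract violation - missing fields: {sorted(missing)}")
    if not isinstance(raw["mass_kg"], (int, float)) or raw["mass_kg"] <= 0:
        raise ValueError("contract violation - mass_kg must be a positive number")
    if not str(raw["revision"]).strip():
        raise ValueError("contract violation - revision must not be empty")
    return PartRecord(raw["part_number"], raw["revision"], float(raw["mass_kg"]))

# The data may have been valid when the app was built; the contract check keeps
# guarding it when ownership or data standards change upstream.
print(validate({"part_number": "P-1001", "revision": "B", "mass_kg": 2.4}))
```

When the upstream owner changes and the data drifts away from the agreed meaning, the consuming app now fails visibly at the contract check instead of quietly producing wrong results, which is where much of the spaghetti risk hides.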
Easy tools have often led to spaghetti, starting with Clipper (the old days) and Visual Basic (the less old days), continuing to highly customizable systems (like Aras is promoting) and now low-code platforms (and Aras is there again).
However, the strength of being highly flexible is also the weakness if not managed and understood correctly. In particular, in a digital enterprise architecture, you need skilled people who guarantee that the solution is reliably anchored.
The HBR article When Low-Code/No-Code Development Works — and When It Doesn’t mentions the same point:
There are great benefits from LC/NC software development, but management challenges as well. Broad use of these tools institutionalizes the “shadow IT” phenomenon, which has bedeviled IT organizations for decades — and could make the problem much worse if not appropriately governed. Citizen developers tend to create applications that don’t work or scale well, and then they try to turn them over to IT. Or the person may leave the company, and no one knows how to change or support the system they developed.
The fundamental difference: from coordinated to connected
For the moment, I remain skeptical about the low-code hype, because I have seen this kind of hype before. The most crucial point companies need to understand is that the coordinated world and the connected world are incompatible.
Using new tools based on old processes and existing data is not a digital transformation. Instead, a focus on value streams and their needed (connected) data should lead to the design of a modern digital enterprise, not the optimization and connectivity between organizational siloes.
Before buying a tool (a medicine) to reduce the current pains, imagine your future ways of working, discover what is possible with your existing infrastructure and identify the gaps.
Next, you need to analyze if these gaps are so significant that they require a technology change. Probably they do, as historically, systems were not designed to share data horizontally across an organization.
In this context, have a look at Lionel Grealou’s article for Engineering.com:
Data Readiness in the new age of digital collaboration.
Conclusion
We discussed the crucial relation between models and data. Models only have value if they acquire the right, accurate data (exercise 1).
Next, even the simplest development platforms, like low-code platforms, require brains and a long-term strategy (exercise 2) – nothing is simple in these transformational times.
The next and final post in this series will focus on configuration management – a new approach is needed. I don’t have the answers, but I will share some thoughts.
A recommended event with an exciting agenda and a good place to validate and share your thoughts.
I will be there and look forward to meeting you at this conference (unfortunately still virtual).
4 comments
November 1, 2021 at 3:21 am
Max Gravel
Hello Jos,
Please feel free to reach out to myself and Martijn Dullart for your next article on CM.
We have both spent the last 15+ years working on CM processes for MBD.
Thanks Max for your offer – I will do so. Best regards, Jos
November 1, 2021 at 9:15 am
Karden Hakan
Great post Jos. Much of what you say will be presented and to some extent discussed at the upcoming PLM Road Map and PDT. We have all the disruptive technologies and they come with a lot of promises and benefits. But we need to introduce them alongside proper Configuration Management to help them make it through the hype cycle and become mainstream technologies and deliver value at the enterprise level.
Thanks Hakan, looking forward to the conference, even though, unfortunately, the dinner night that is so good for discussions won’t be there. Next year?
November 1, 2021 at 11:49 am
Lars Taxén
Great post about models! One thing though – we need to get away from the view that models “describe” reality. Models are as real as everything else we can perceive, be that as paper or as a 3D CAD. Models signify something that does not yet exist, which means that there is no reality there to describe. Their purpose is to make people look in the same direction. This change of view might seem harmless, but it isn’t. It is a complete turnover of mainstream thinking. If we think of models as describing reality, we look for that reality like someone searching for a treasure. If we think of models as a way to coordinate people’s actions, we try to make models as comprehensible and effective as possible. The decisive difference is then that our attention is redirected to what’s going on in people’s heads – the people issue that is described repeatedly in PLM as the toughest issue.
Thanks for your excellent comment Lars, fully agree that the power of models is to align our understanding – I will use it in my concluding post. Best regards, Jos
November 2, 2021 at 7:38 pm
Bruce Bookbinder
Jos,
Thank you for a very interesting blog. While I agree with almost everything you say about low code, I am surprised about your comments regarding Aras’ low code and how it is part of a progression of earlier low code tools. Living through Clipper and Visual Basic myself, I agree with you on the sort of issues they created but do not agree that Aras’ low code strategy will continue the propagation of these issues. My “Low-Code – What’s Old is New Again (or is it?)” on the Aras Website discusses the same points you brought up but also describes how Aras’ low code approach is different. It significantly reduces the effort to build applications within the Aras Innovator environment by utilizing the platform services and data model of the platform under the controls and governance of an enterprise solution. This helps to create the connectivity that people need without creating a shadow IT or adding to the spaghetti (or linguine).
Thanks, Bruce, for raising this point and let me try to clarify it better. First of all, I don’t feel comfortable with the low-code hype, having experienced (and fixed) SmarTeam implementations between 2000 and 2008. I have seen the long-term results of its flexibility. Upgrades to existing capabilities were not possible because some data types already existed as a customization. In the context of Aras, I point to the (mis)use of the part object, which serves various datatypes both for design and operations – we can discuss the details offline if you want – SmarTeam suffered similar issues in the early days.
Next, by being flexible you become more of an application infrastructure than a PLM infrastructure. Companies would benefit from data models and behaviors that are standardized between the vendors, not a new data type. (Here I blame SAP for introducing the Material as Part.) Every quick customization creates a legacy, even if it is upgradable, as the business context has not been analyzed. I was disappointed to see that a flexible system like Aras also jumped on the low-code hype; there was no need for that.
Flexibility without a future vision or framework holds companies in a grip, unable to transform to a modern digital environment. I am happy to discuss this with you in a face-to-face meeting as we could write many posts about it. Feel free to reach out and discuss.
Best regards, Jos