This month marks exactly 15 years since I started my blog, a little nervous and insecure. Blogging had not yet reached the mainstream, and how would people react to my shared experiences?
The main driver behind my blog in 2008 was to share field experiences when implementing PLM in the mid-market.
As a SmarTeam contractor working closely with Dassault and IBM PLM, I learned that implementing PLM (or PDM) is more than a technology issue.
Discussing implementations made me aware of the importance of the human side. Customers had huge expectations of such a flexible toolkit, and implementers made money by providing customization for any user request.
There was no discussion about whether it was needed, as the implementer always said: “Yes, we can (if you pay)”.
The parallel tree
And that’s where my mediation started. At a particular moment, the customer would get annoyed by yet another customization. The concept of a “parallel tree,” a sync between the 3D CAD structure and the BOM, was a frequent point of discussion.
So many algorithms have been invented to convert a 3D CAD structure into a manufacturing BOM. Designing glue and paint in CAD just so they would appear in the BOM.
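To make the discussion tangible: below is a minimal, hypothetical sketch (all part names invented) of flattening a 3D CAD assembly tree into BOM quantities. It also shows why glue and paint caused so much debate – a purely geometric conversion can never derive them.

```python
from collections import Counter

# Hypothetical CAD assembly tree: (part_id, [children]) -- illustrative only.
cad_tree = ("bike", [
    ("frame", []),
    ("wheel", [("rim", []), ("tire", [])]),
    ("wheel", [("rim", []), ("tire", [])]),
])

def flatten(node, bom=None):
    """Walk the CAD structure and count each part occurrence into a flat BOM."""
    if bom is None:
        bom = Counter()
    part_id, children = node
    bom[part_id] += 1
    for child in children:
        flatten(child, bom)
    return bom

bom = flatten(cad_tree)
# Process materials such as glue or paint never show up here unless they are
# modeled as dummy CAD parts -- the root of many "parallel tree" discussions.
bom["glue"] += 1  # a manual addition the algorithm cannot derive from geometry
print(bom)
```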
The “exploded” data model
The customizations that ended up in failure were the ones with a crazy data model: too many detailed classes and too many attributes per class.
Monsters were created by some well-meaning IT departments collecting all the user needs, resulting in environments unworkable for the end users. See my 2015 post here: The Importance of a PLM data model
The BOM concepts
While concepts and best practices have become stable for traditional PLM, where we talk more about a Product Information backbone, there is still considerable debate about this type of implementation. The leading cause of the discussion is that companies often start from their existing and newly purchased systems and then try to push the people and processes into that environment.
For example, see this recent discussion we had with Oleg Shilovitsky (PLM, ERP, MES) and others on LinkedIn.
These were the days before we entered digital transformation in the PLM domain, and starting from 2015, you can see the mission in my blog posts: exploring what a digital enterprise would look like and what the role of PLM will be.
The Future
Some findings I can already share:
- No PLM system can do it all – where historically companies bought a PLM system, now they have to define a PLM strategy where the data can flow (controlled) in any direction. The PLM strategy needs to be based on value streams of connected data between relevant stakeholders, supported by systems of engagement. From System to Strategy.
- Master Data Management and standardization of data models might still be a company’s internal activity (as the environment is stable). Still, to the outside world/domains, there is a need for flexible connections (standard flows / semantic web). From Rigid to Flexible.
- The meaning of the BOM will change from coordinated structures towards an extract of a data-driven PLM environment, where the BOM mainly represents the hardware connected to software releases. Configuration management practices must also change (see Martijn – and the Rise and Fall of the BOM). From Placeholders to Baselines.
- Digital Transformation in the PLM domain is not an evolution of the data. Legacy data has never been designed to be data-driven; migration is a mission impossible. Therefore, there is a need to focus on a hybrid environment with two modes: an enterprise backbone (System of Record) and a product-centric infrastructure (Systems of Engagement). From Single Source of Truth to Authoritative Source of Truth.
Switching Gears
Next week I reach the eligible age for my Dutch pension, allowing me to switch gears.
Instead of driving in high-performance mode, I will start practicing driving in touring mode, moving from one point of interest to the next while caring for the environment.
Here are some of the topics to mention at this moment.
Reviving the Share PLM podcast
Together with the Share PLM team, we decided to revive their podcast as Season 2. I referred to their podcast last year in my PLM Holiday thoughts 2022 post.
The Share PLM team has always been the next level of what I started alone in 2008: sharing and discussing PLM topics with a focus on the human side, supporting organizational change through targeted e-learning deliverables based on the purpose of a PLM implementation. People (first), Processes (needed) and the Tools (how) – in this order.
In Season 2 of the podcast, we want to discuss the various aspects of PLM with experienced PLM practitioners – not only the success stories you often hear at PLM conferences.
Experience is what you get when you do not get what you expect.
And PLM is a domain where experience with people, processes and tools counts.
Follow our podcast here, subscribe to it on your favorite platform and feel free to send us questions. Besides the longer interviews, we will also discuss common questions in separate recordings or as a structured part of the podcast.
Sustainability!
I noticed that my sustainability-related blog posts resonate less with my blogging audience. I am curious about the reason behind this.
Does it mean that in our PLM community, Sustainability is still too vague and not addressed in the reader’s daily environment? Or is it because people do not see the relation to PLM and are more focused on carbon emissions, greenhouse gases and the energy transition – a crucial part of the sustainable future that currently gets much attention?
This week I read this post: CEO priorities from 2019 until now: What has changed? As the result below shows, sustainability was ranked #7 in 2019, and after some ups and downs, it is still at priority level #7. This worries me, as it illustrates that at board level not much has changed, despite the increasing understanding of the environmental impact and the recent warnings from the climate. The warnings have not reached the boardrooms yet.
In addition, I will keep on exploring the relationship between PLM and Sustainability, and in that context, I am looking forward to my learnings and discussions at the upcoming PTC Liveworx event in Boston. Do I see you there?
Here I hope to meet with their sustainability thought leaders and discuss plans to come up with concrete activities related to PLM and Sustainability.
Somehow it is similar to the relationship between Digital Transformation and the PLM domain. Although we have been talking about the digitalization of the entire business for over 10 years, in the PLM domain it has just started.
Awareness sessions
Companies have a considerable challenge translating a C-level vision into a successful business transformation supported by people active in the field.
Or, the opposite: highly motivated people in the organization see the opportunity to improve their current ways of working dramatically thanks to digitization.
However, they struggle with translating their deep understanding into messages and actions that are understood and supported by the executive management. In the past ten years, I have been active in various transformational engagements, serving as a “translator” between all stakeholders. I will continue this work, as it is a unique way to coach companies, implementers and software vendors to understand each other.
Conclusions
Fifteen years of blogging has brought me a lot – constantly forcing myself to explain what I observe around me and what it means for the PLM domain. My purpose in sharing these experiences with you in a non-academic manner has led to a great network of people and discussions. Some are very interactive, like Håkan Kårdén and Oleg Shilovitsky (the top two), and others provide their feedback in an indirect way.
Switching gears will not affect the blogging and the network – It might even lead to deeper insights as the time to observe and enjoy will be longer.
Keep your seatbelts fastened.
In my previous post, I discovered that my header for this series is confusing. Although a future implementation of system lifecycle management (SLM/PLM) will rely on models, the most foundational change needed is a technical one to create a data-driven infrastructure for connected ways of working.
My previous article discussed the concept of the dataset, which led to interesting discussions on LinkedIn and in my personal interactions. Also this time, Matthias Ahrens (HELLA) again shared a relevant but very academic article in this context – how to harmonize company information.
For those who want to dive deeper into the concept of connected datasets, read this article: The euBusinessGraph ontology: A lightweight ontology for harmonizing basic company information.
The article illustrates that the topic is relevant for all larger enterprises (and it is not an easy topic).
This time I want to share my thoughts about the two statements from my introductory post, i.e.:
A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as they cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.
A model-based approach with connected datasets
We discussed connected datasets in the previous post; now, let’s explore why models and datasets are related. In the traditional CAD-centric PLM domain, most people will associate the word model with a CAD model, to be more precise, the 3D CAD Model. However, there are many other types of models used related to product development, delivery and operations.
A model can be a:
Physical Model
- A smaller-scale object for the first analysis, e.g., a city or building model, an airplane model
Conceptual Model
- A conceptual model describes the entities and their relations, e.g., a Process Flow Diagram (PFD)
- A mathematical model describes a system concept using a mathematical language, e.g., weather or climate models. Modelica and MATLAB would fall into this category
- A CGI (Computer Generated Imagery) or 3D CAD model is probably the model most associated with the term in the minds of traditional PLM practitioners
- Functional and Logical Models, describing the services and components of a system, are crucial in an MBSE approach
Operational Model
- A model providing performance analysis based on (real-time) data coming from selected data sources. It could be an operational business model or an asset performance model; even my Garmin’s training performance model is such an operational model.
The list of models above is neither exhaustive nor academically defined. Moreover, some model term definitions might overlap, e.g., where would we classify software models or manufacturing models?
All models are a best-so-far approach to describing reality. Based on more accurate data from observations or measurements, the model comes closer to what happens in reality.
A model and its data
Never blame the model when there is a difference between what the model predicts and the observed reality. It is still a model. That’s why we need feedback loops from the actual physical world to the virtual world to fine-tune the model.
Part of what we call Artificial Intelligence is nothing more than applying algorithms to a model. The more accurate data available, the more “intelligent” the artificial intelligence solution will be.
By using data analysis complementary to the model, the model may get better and better through self-learning. Like our human brain, it starts with understanding the world (our model) and collecting experiences (improving our model).
There are two points I would like to highlight in this paragraph:
- A model is never 100% the same as reality – so don’t worry about deviations. There will always be a difference between the virtual prediction and the physical measurement – most of the time because reality has many more influencing parameters.
- The more qualified data we use in the model, the closer it gets to reality – so focus on accurate (and the right) data for your model. As it is most of the time impossible to fully model a system, focus on the most significant data sources (see the small sketch below).
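To illustrate this feedback loop, here is a minimal sketch, assuming a simple first-order cooling model and a handful of invented measurements. The model parameter is calibrated against the observed data – the more (and better) measurements, the closer the model gets to reality.

```python
# Minimal sketch of a feedback loop between reality and a model.
# The model: exponential cooling T(t) = T_env + (T0 - T_env) * exp(-k * t).
# The parameter k is refined from (hypothetical) measured data points.
import math

T_env, T0 = 20.0, 90.0
measurements = [(1.0, 75.2), (2.0, 64.1), (3.0, 55.6)]  # (hours, measured degC), invented

def predict(k, t):
    return T_env + (T0 - T_env) * math.exp(-k * t)

# Crude calibration: pick the k that minimizes the squared prediction error.
candidates = [k / 1000 for k in range(1, 1000)]
best_k = min(candidates, key=lambda k: sum((predict(k, t) - T) ** 2 for t, T in measurements))

print(f"calibrated k = {best_k:.3f}")
for t, measured in measurements:
    print(f"t={t}h predicted={predict(best_k, t):.1f} measured={measured}")
```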
The ultimate goal: THE DIGITAL TWIN
The discussion related to data-driven approaches and the usage of models might feel abstract and complex (and that’s the case). However, the term “digital twin” is well known and even used in boardrooms.
The great benefits of a digital twin for business operations and for sustainability are promoted by many software vendors and consultancy firms.
My statement and reason for this series of blog posts: Digital Twins do not run on documents, you need to have a data-driven, model-based infrastructure to efficiently benefit from digital twin concepts.
Unfortunately, a reliable and sustainable implementation of a digital twin requires more than software – it is a learning journey to connect the right data to the right model.
A puzzle every company has to solve as there is no 100 percent blueprint at this time.
Are Low Code platforms the answer?
I mentioned the importance of accurate data. Companies have different systems or even platforms managing enterprise data. The digital dream is that by combining datasets from different systems and platforms, we can provide any user with the needed information in real time. My statement from my introductory post was:
I don’t believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might be again a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora’s Box?
Let’s look into some of the low-code platform messages mentioned by Low-Code advocates:
You will have an increasingly hard time finding developers to keep up with global app development demands (reason #1 for PEGA)
This statement reminded me of the early days of SmarTeam implementations. With a Data Model Wizard, a Form Designer, and a Visual Basic COM API, you could create any kind of data management application with SmarTeam, using its built-in behaviors for document lifecycle management, item lifecycle management, and CAD integrations, combined with easy customizations.
The sky was the limit to satisfy end users. No need for an experienced partner or to be a skilled programmer (this was 2003+). SmarTeam was a low-code platform, the marketing department would say now.
A lot of my activities between 2003 and 2010 were related to fixing the problems caused by this flexibility, making sense (again) of customizations. I wrote about this in a 2015 post: The importance of a (PLM) data model, sharing the experiences of “fixing” issues created by flexibility.
Think first
The challenge is that an enthusiastic team creates a (low code) solution rapidly. Immediate success is celebrated by the people involved. However, the future impact of this solution is often forgotten – we did the job, right?
Documentation and a broader visibility are often lacking when implementing such a solution.
For example, suppose your product data is going to be consumed by another app. In that case, you need to make sure that the information you consume is accurate. Perhaps the information was valid when you created the app.
However, if your friendly co-worker has moved on to another job and someone with different data standards becomes responsible for the data you consume, the reliability might fail. So how do you guarantee its quality?
Easy tools have often led to spaghetti, starting from Clipper (the old days), Visual Basic (the less old days) to highly customizable systems (like Aras is promoting) and future low-code platforms (and Aras is there again).
However, the strength of being highly flexible is also a weakness if not managed and understood correctly. In particular, in a digital enterprise architecture, you need skilled people who guarantee a reliable anchoring of the solution.
The HBR article When Low-Code/No-Code Development Works — and When It Doesn’t mentions the same point:
There are great benefits from LC/NC software development, but management challenges as well. Broad use of these tools institutionalizes the “shadow IT” phenomenon, which has bedeviled IT organizations for decades — and could make the problem much worse if not appropriately governed. Citizen developers tend to create applications that don’t work or scale well, and then they try to turn them over to IT. Or the person may leave the company, and no one knows how to change or support the system they developed.
The fundamental difference: from coordinated to connected
For the moment, I remain skeptical about the low-code hype, because I have seen this kind of hype before. The most crucial point companies need to understand is that the coordinated world and the connected world are incompatible.
Using new tools based on old processes and existing data is not a digital transformation. Instead, a focus on value streams and their needed (connected) data should lead to the design of a modern digital enterprise, not the optimization and connectivity between organizational siloes.
Before buying a tool (a medicine) to reduce the current pains, imagine your future ways of working, discover what is possible with your existing infrastructure and identify the gaps.
Next, you need to analyze if these gaps are so significant that it requires a technology change. Probably it does, as historically, systems were not designed to share data horizontally in an organization.
In this context, have a look at Lionel Grealou’s article for Engineering.com:
Data Readiness in the new age of digital collaboration.
Conclusion
We discussed the crucial relation between models and data. Models only have value if they acquire the right, accurate data (exercise 1).
Next, even the simplest development platforms, like low-code platforms, require brains and a long-term strategy (exercise 2) – nothing is simple in these transformational times.
The next and final post in this series will focus on configuration management – a new approach is needed. I don’t have the answers, but I will share some thoughts.
A recommended event with an exciting agenda and a good place to validate and share your thoughts.
I will be there and look forward to meeting you at this conference (unfortunately still virtually).
To understand our legacy in the PLM domain and the types of practices we created, I started this series of posts: Learning from the past to understand the future. My first post (The evolution of the BOM) focused on the disconnected world between engineering – the generation of drawings as a deliverable – and execution (MRP/ERP) – the first serious IT systems in a company.
At that time, due to minimal connectivity, small and medium-sized companies had, most of the time, an informal connection between engineering and manufacturing. I remember a statement from that time, when PLM had just been introduced. One person during a conference claimed:
“You guys make our lives so difficult with your systems. If we have a problem, we gather around the machine, and we fix it.”
PLM started at large enterprises
Of course, large enterprises could not afford such behavior as they operate globally. The leading enterprises for PDM/PLM were the Aerospace & Defense and Automotive companies. They needed consistent processes and formal ways of working to guarantee quality output.
In that sense, I was happy with the reaction from Jean-Jacques Urban-Galindo, who shared in the LinkedIn comments a reference to a relevant chapter of John Stark’s PLM book: a PDF describing the evolution of CAD/PDM/PLM at PSA. Jean-Jacques was responsible at that time for the re-engineering of the Product & Process Engineering processes using digital tools (CAD/CAM, DMU, and more).
Read the PSA story here: PLM at GROUPE PSA. It describes nicely where 3D CAD and the EBOM come in. In large enterprises like PSA, the need for tools is driven by the processes. When you read it to the end, you will also see the need for a design and a manufacturing view. A topic I will touch on in future posts too.
The introduction of 3D CAD in the mid-market
Where large automotive and aerospace companies had already invested in (expensive) 3D CAD hardware and software, for the majority of midsize companies the switch from 2D CAD (AutoCAD mainly) towards 3D CAD (SolidWorks, Solid Edge, Inventor) started at the end of the 20th century.
It was the time when Microsoft NT became a serious platform beside the existing mainframe- and minicomputer-based CAD-systems. The switch to PCs went so fast that the disruption of DEC (Digital Equipment Corporation) is one of the cases discussed by Clayton Christensen in his groundbreaking book: The Innovator’s Dilemma.
3D CAD introduced a lot of new capabilities, like DMU (Digital Mock-Up) for clash detection, and above all, a better understanding of a product’s behavior. It also introduced a new set of challenges to be resolved.
For example, the concept of reusing 3D CAD parts. Mid-market companies, most of the time, are buying productivity tools. Can I design my product faster and with higher quality in 3D instead of using only the 2D definitions?
Mid-market companies usually do not redesign their business processes – no people are available for strategy – so the pain of a lack of strategy is felt in a different way compared to large enterprises. A crucial differentiator for the future of PLM.
Reuse of (3D) CAD parts / Assemblies
In the 2D CAD world, there was not much reuse of CAD parts. Standard parts were saved in libraries or generated on demand by parametric libraries. With 3D CAD, designers might spend more time defining a part, and the benefits come from the reuse of small sub-assemblies (modules) in a larger product assembly – something not relevant in the 2D CAD world.
As every 3D CAD part had to have a file name, it became difficult to manage the file names without a system. How do you ensure that the file named Part01.xxx is unique? Another designer might also create an assembly where the 3D CAD tool would suggest Part01.xxx as the name. And what about revisions? Do you store them in the filename, and how do you know you have the correct and latest version of the file?
Companies already had part naming rules for drawings, often related to the part’s usage, similar to the “intelligent” numbers I mentioned in my previous post.
With 3D CAD it became a little more complicated, as now, in electronic formats, companies wanted to maintain the relation:
Drawing ID = Part ID = File Name
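A minimal sketch of what a PDM-system essentially guarantees here – a central registry issuing unique, meaningless numbers from which the drawing ID and the file name are derived (all names and formats are hypothetical):

```python
# Hypothetical sketch: without a PDM system, two designers can both save "Part01.xxx".
# A central registry that issues meaningless sequential numbers avoids the collision
# and keeps Drawing ID = Part ID = File Name consistent by construction.
import itertools

class PartNumberRegistry:
    """Issues unique, meaningless part numbers -- the job a PDM system took over."""
    def __init__(self, prefix="P"):
        self._prefix = prefix
        self._counter = itertools.count(1)

    def new_part_number(self):
        return f"{self._prefix}{next(self._counter):06d}"

registry = PartNumberRegistry()
part_id = registry.new_part_number()   # e.g. P000001
drawing_id = part_id                   # the same identifier everywhere
file_name = f"{part_id}.CATPart"       # file name derived, never invented by a designer
print(part_id, drawing_id, file_name)
```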
The need for a PDM-system
If you look at the image on the left, which I found in one of my old SmarTeam files, there is a part number combined with additional flags A-A-C, which also have a meaning (I don’t know it ☹ ), and a description.
The purpose of these meaningful flags was to maintain the current ways of working. Without a PDM-system, parts of the assembly could be shared with an OEM or a supplier. File-based 3D CAD without using a PDM-system was not a problem for small and medium enterprises.
The 3D CAD-system maintained the relations in the assembly files, including relations to the 2D drawings. Despite the introduction of 3D CAD, the 2D drawing remained the deliverable the rest of the company, or the supply chain, was waiting for. Preferably a drawing containing a parts list and balloon numbers, the same as it had been done before. Why would you need a PDM-system?
PDM for traceability and reuse
If you were working in your 3D CAD-system on a single product, or on individual projects for OEMs, there was no significant benefit from a PDM-system. All deliverables needed by the engineering department were in the 3D CAD environment. Assembly files and drawing files are already like small databases, containing references to the source files of the parts (image above).
A PDM-system at this stage could help you build traceability and prevent people from overwriting files. The ROI for this part depends only on the cost and risks of making mistakes.
However, when companies started to reuse parts or subassemblies, there was a need for a system that could manage the 3D models separately. This had an impact on the design methodology.
Now parts could be used in various products. How do you discover parts for reuse, and how do you know you have the latest released version? For sure, their naming cannot be related anymore to a single product or project (a practice still used a lot).
This is where PDM-systems came in, using additional attributes per file, combined with relations between parts, allowing companies to structure and deliver more details related to a part. A detailed description for internal usage, a part type (classification), and the part material were commonly used attributes. And not to forget the status and revision.
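As an illustration (not SmarTeam’s actual data model), such a part record could look like the sketch below – attributes per file plus relations to the assemblies where the part is used:

```python
# Illustrative sketch of the attributes and relations a PDM system added
# on top of plain CAD files -- field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class PartRecord:
    part_id: str
    description: str          # detailed description for internal usage
    classification: str       # part type, e.g. "fastener", "bracket"
    material: str
    status: str = "In Work"   # lifecycle status
    revision: str = "A"
    used_in: list = field(default_factory=list)  # relations to parent assemblies

bracket = PartRecord("P000123", "Mounting bracket, left", "bracket", "AlMg3")
bracket.used_in.append("P000045")  # the assembly that reuses this part
print(bracket)
```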
For reuse, it was important that the creators of content had a strategy to define a part for future reuse or discovery. Engineers were not used to providing such services; filling in data in a PDM-system was seen as overhead – bureaucracy.
As they were measured on the number of drawings they produced, why do extra work with no immediate benefits?
The best compromise was to have the designer fill in properties in the CAD-file when creating a part. The CAD integration with the PDM-system could then be used to fill the attributes in the PDM-system.
This “beautiful”, simple concept later led to a lot of complexity.
Is the CAD-model the source of data, meaning designers should always start from CAD when designing a product? If someone added or modified data in the PDM-system, should we open the CAD-file to update some properties? Changing a file means it becomes a new version. What happens if the CAD-file is released, and I update some connected attributes in PDM?
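The following sketch illustrates this ownership dilemma with a hypothetical one-way sync rule: the CAD-file masters the attributes while the part is in work; after release, changes only land in the PDM record – at which point the file and the record can start to diverge.

```python
# Hypothetical sketch of the ownership dilemma: which side masters an attribute?
# One common compromise: the CAD file masters attributes until release; after
# release, the PDM record masters them and the file is no longer touched.

def sync_attribute(name, value, cad_file, pdm_record):
    """Push an attribute change in the direction the lifecycle status allows."""
    if pdm_record["status"] == "Released":
        # Updating the released CAD file would force a new file version,
        # so the change lives only in the PDM record.
        pdm_record[name] = value
    else:
        cad_file[name] = value    # CAD is master while the part is in work
        pdm_record[name] = value  # mirrored into PDM via the CAD integration

cad_file = {"material": "AlMg3"}
pdm_record = {"status": "Released", "material": "AlMg3"}
sync_attribute("material", "AlMg3-H24", cad_file, pdm_record)
print(cad_file, pdm_record)  # file untouched, record updated -- and now they differ
```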
To summarize this topic: companies missed the opportunity here to implement data governance. However, none of the silos (manufacturing preparation, service) recognized the need. Implementing new tools (3D CAD and PDM) did not affect the company’s way of working.
Instead of people, processes, tools, the only focus was on new tools and satisfying the people within the same process.
Of course, when introducing PDM, which happened for mid-market companies at the beginning of this century, there was no PLM vision. Talking about lifecycle support was a waste of time for management. As we will discover in future posts, large enterprises and small and medium enterprises have the same PLM needs. However, there is already a fundamentally different starting point. Where large enterprises analyze and design business processes, the small and medium enterprises buy tools to improve their current ways of working.
The Future?
Although we have many steps to take in the upcoming posts, I want to raise your attention to an initiative from the PLM Interest Group together with Xlifecycle.com. The discussion is about what will be PLM’s role in digital transformation.
As you might have noticed, there are people saying the word PLM is no longer covering the right context, and all kinds of alternatives have been suggested. I recommend giving your opinion without my personal guidance. Feel free to answer the questionnaire, and we will be all looking forward to the results.
Find the survey here: Towards a digital future: the evolving role of PLM in the future digital world
Conclusion
We are going slowly. We discovered in this post the split in strategy between large enterprises (process focus) and small and medium enterprises (tool focus) when introducing 3D CAD. This different focus, at this time for PDM, is one of the reasons why vendors create functions and features that still require a methodology – however, who will provide the methodology?
Next time, more on 3D CAD structures and the EBOM.
In my previous post, the PLM blame game, I briefly mentioned that there are two delivery models for PLM. One approach is based on a PLM system that contains predefined business logic and functionality, promoting use of the system as much as possible out-of-the-box (OOTB), somehow driving toward a certain rigidness. The other approach is one where the PLM capabilities are developed on top of a customizable infrastructure, providing more flexibility. I believe there has been a debate about this topic for more than 15 years without a decisive conclusion. Therefore, I will take you through the pros and cons of both approaches, illustrated by examples from the field.
PLM started as a toolkit
The initial cPDM/PLM systems were toolkits for several reasons. In the early days, scalable connectivity was not available or way too expensive for a standard collaboration approach. Engineering information, mostly design files, needed to be shared globally in an efficient manner, and the PLM backbone was often a centralized repository for CAD-data. Bill of Materials handling in PLM was often at a basic level, as either the ERP-system (mostly Aerospace/Defense) or home-grown BOM-systems (Automotive) were in place for manufacturing.
Depending on the business needs of the company, the target was to connect as many engineering data sources as possible to the PLM backbone – PLM originated from engineering and is still considered by many people as an engineering solution. For connectivity, interfaces and integrations needed to be developed at a time when application integration frameworks were primitive and complicated. This made PLM implementations complex and expensive, so only the large automotive and aerospace/defense companies could afford to invest in such systems. And a lot of tuition fees were spent to achieve results. Many of these environments are still operational, as they became too risky to touch, as I described in my post: The PLM Migration Dilemma.
The birth of OOTB
Around the year 2000, there was the first development of OOTB PLM. There was Agile (later acquired by Oracle), focusing on the high-tech and medical industries. Instead of document management, they focused on bringing the BOM from engineering to manufacturing based on a relatively fixed scenario – therefore fast to implement and fast to validate. The last point, in particular, is crucial in regulated medical environments.
At that time, I was working with SmarTeam on the development of templates for various industries, with a similar mindset. A predefined template would lead to faster implementations and therefore reduce the implementation costs. The challenge with SmarTeam, however, was that it was very easy to customize, based on Microsoft technology and wizards for data modeling and UI design.
This was not a benefit for OOTB delivery, as SmarTeam was implemented through Value Added Resellers, and their major revenue came from providing services to their customers. So it was easy to reprogram the concepts of the templates and use them as your unique selling points towards a customer. A similar situation is now happening with Aras – the primary implementation skills are at the implementing companies, and their revenue does not come from software (maintenance).
The result is that each implementer considers another implementer as a competitor and they are not willing to give up their IP to the software company.
SmarTeam resellers were not eager to deliver their IP back to SmarTeam to get it embedded in the product, as it would reduce their unique selling points. I assume the same currently happens in the Aras channel – it might be called Open Source; however, probably it is only the high-level infrastructure.
Around 2006, many of the main PLM vendors had their various mid-market offerings, and I contributed at that time to the SmarTeam Engineering Express – a preconfigured solution that was rapid to implement if you wanted.
Although the SmarTeam Engineering Express was an excellent sales tool, the resellers that started to implement the software began to customize the environment as fast as possible in their own preferred manner. For two reasons: the customer most of the time had different current practices, and secondly, the money came from services. So why say No to a customer if you can say Yes?
OOTB and modules
Initially, for the leading PLM vendors, their mid-market templates were not just aiming at the mid-market. All companies wanted to have a standardized PLM system with as few customizations as possible. This meant for the PLM vendors that they had to package their functionality into modules, sometimes addressing industry-specific capabilities, sometimes areas of interfaces (CAD and ERP integrations) as a module, or generic governance capabilities like portfolio management, project management, and change management.
The principle behind the modules was that they needed to deliver data model capabilities combined with business logic/behavior. Otherwise, the value of the module would not be relevant. And this causes a challenge. The more business logic a module delivers, the more the company that implements the module needs to adapt to generic practices. This requires business change management; people need to be motivated to work differently. And who is eager to make people work differently? Almost nobody, as it is an intensive coaching job that cannot be done by the vendors (they sell software), often cannot be done by the implementers (they do not have the broad set of skills needed), or by the companies (they do not have the free resources for that). Precisely the principles behind the PLM Blame Game.
OOTB modularity advantages
The first advantage of modularity in the PLM software is that you only buy the software pieces that you really need. However, most companies do not see PLM as a journey, so they agree on a budget to start, and then every module that was not identified before becomes a cost issue. The main reason is that the implementation teams focus on delivering capabilities at that stage, not on providing value-based metrics.
The second potential advantage of PLM modularity is the fact that these modules are supposed to be complementary to the other modules, as they should have been developed in the context of each other. In reality, this is not always the case. Yes, the modules fit nicely on a single PowerPoint slide; however, when it comes to reality, there are separate systems with a minimum of integration with the core. Still, the advantage is that the PLM software provider now becomes responsible for the upgradability or extensibility of the provided functionality, which is a serious point to consider.
The third advantage of the OOTB modular approach is that it forces the PLM vendor to invest in your industry and future needed capabilities, for example, digital twins, AR/VR, and model-based ways of working. Some skeptical people might say PLM vendors create problems to solve that do not exist yet; optimists might say they invest in imagining the future, which can only happen by trial-and-error. In a digital enterprise, it is: think big, start small, fail fast, and scale quickly.
OOTB modularity disadvantages
Most of the OOTB modularity disadvantages will be advantages in the toolkit approach, and are therefore discussed in the next paragraph. One downside of the OOTB modular approach is the disconnect between the people developing the modules and the implementers in the field. Often modules are developed based on some leading customer experiences (the big ones), whereas the majority of usage in the field targets smaller companies where people have multiple roles – the typical SMB approach. SMB implementations are often not visible at the PLM vendor R&D level, as they are hidden behind the Value Added Reseller network and/or usually too small to become apparent.
Toolkit advantages
The most significant advantage of a PLM toolkit approach is that the implementation can be a journey. Start with a clear business need, for example, in modern PLM, creating a digital thread, and then, once this is achieved, dive deeper into areas of the lifecycle that require improvement. And increased functionality is only linked to the number of users, not to extra costs for a new module.
However, if the development of additional functionality becomes massive, you run the risk that low license costs are nullified by development costs.
The second advantage of a PLM toolkit approach is that the implementer and users will have a better relationship in delivering capabilities and therefore, a higher chance of acceptance. The implementer builds what the customer is asking for.
However, as Henry Ford said, if I had asked my customers what they wanted, they would have asked for faster horses.
Toolkit considerations
There are several points where a PLM toolkit can be an advantage but also a disadvantage, very much depending on various characteristics of your company and your implementation team. Let’s review some of them:
Innovative: a toolkit does not provide an innovative way of working immediately. The toolkit may have an infrastructure to deliver innovative capabilities, even as small demonstrations, but the implementation and methodology for this innovative way of working need to come from either your company’s resources or your implementer’s skills.
Uniqueness: with a toolkit approach, you can build a unique PLM infrastructure that makes you more competitive than the others. Don’t share your IP and best practices in order to stay more competitive. This approach can be valid if you truly have a competing plan here. Otherwise, the risk might be that you are creating a legacy for your company that will slow you down later in time.
Performance: this is a crucial topic if you want to scale your solution to the enterprise level. I spent a lot of time in the past analyzing and supporting SmarTeam implementers and template developers on their journey to optimize their solutions. Choosing the right algorithms and making the right data modeling choices are crucial.
Sometimes I came into a situation where the customer blamed SmarTeam because customizations were possible – you can read about this example in an old LinkedIn post: the importance of a PLM data model
Experience: when you plan to implement PLM “big” with a toolkit approach, experience becomes crucial, as initial design decisions and scope are significant for future extensions and maintainability. Beautiful implementations can become a burden after five years when design decisions were not documented or analyzed. Having experience or an experienced partner/coach can help you in these situations. In general, it is rare for a company to have experienced PLM implementers internally, as it is not their core business to implement PLM. Experienced PLM implementers vary in size and skills – make the right choice.
Conclusion
After writing this post, I still cannot give a final verdict on which is the best approach. Personally, I like the PLM toolkit approach, as I have been working in the PLM domain for twenty years, seeing and experiencing good and best practices. The OOTB approach represents many of these best practices and is therefore a safe path to follow. The undecided points are who the people involved are and what your business model is. It needs to be an end-to-end coherent approach, no matter which option you choose.
The last month I have been working with Aerosud Aviation in South Africa to finalize and conclude on the ROI and the lessons learned around their PLM implementation, which started in May 2007. I was lucky to be involved in the initial scoping of the project in 2007 and assisted the local Value Added Reseller, together with the Dassault Systèmes UK team, in a step-by-step project towards PLM.
When I met the people at Aerosud for the first time in 2007, I noticed it was a young company with open-minded people, everyone trying to improve their daily activities per department. There was a need for PLM, as some of their major customers required Aerosud to have a PLM system in place. Configuration Management was also mentioned many times in the interviews, and what I learned at that time: Excel was the tool for configuration management.
Based on the initial interviews, a plan needed to be developed outlining the steps to implement PLM. The following three major points were the guidance for the implementation:
- The company was thinking in documents and understood documents, especially Excel
- The company had no clear understanding of what PLM would mean for them, as real awareness was not present inside the company. Customers like Boeing and Airbus talked about the importance of PLM, but how this could impact Aerosud as a company was not commonly clear
- People in the company had a major focus on their own department, and an overarching group of people leading the implementation was not available
You could say that you will see the above points in many smaller and medium-sized companies. I wrote about it in one of my previous posts: Where does PLM start beyond document management?
The project phases
The good news for Aerosud was that their PLM Champion was an expert in CATIA and was familiar with writing macros in Visual Basic, plus the fact that everyone in the company was open to using the system as standard as possible – no demands for special behavior of the system “because we have done it this way for 100 years”.
That last phrase is one you hear a lot in old Europe.
The choice was to start with implementing ENOVIA SmarTeam Design Express and to focus, in two phases, on design data management (phase 1) and the usage of design data by other users (phase 2).
The plan was that each phase would take a maximum of 2-3 months, and we would give the users the time to digest and change their habits towards the standards in the system. In reality, it took almost a year – not due to technical or conceptual issues, but because this was the maximum pace we could keep with the amount of time and available resources. The good news after these two phases was that the first bullet was much better understood: the difference between having a system with a single version of the truth and Excel management.
In the summer of 2008 (our summer – it was winter in South Africa), there was a management workshop at Aerosud, and here, after three days of discussion, the position of PLM became clear. One year earlier this would not have been possible; now people had seen ENOVIA SmarTeam and could imagine what further benefits the system could bring. This addressed the second bullet I mentioned before. Although this workshop was not scheduled upfront, looking back now, I see this was a crucial point in getting understanding for the next PLM steps.
The next PLM steps were extending to a real item-centric data model, because if you want to do PLM, you need to work around the Bill of Materials and all information related to the items in the Bill of Materials. In the end, this gives you configuration management without chasing Excels.
Again, the next steps were divided into two phases, each with a scope of 2-3 months. The implementation would be based on the ENOVIA SmarTeam Engineering Express methodology, which came as a logical extension of the current implementation, without having to change the database or the existing data model.
In the first phase, we had awareness sessions for the BOM (discussing EBOM / MBOM / effectivity, etc.), plus, in parallel, we introduced the item as the placeholder for the information. No longer folders or projects as the base.
Conceptually, the introduction of the item was not a big issue, and the major activities in this phase focused on connecting legacy data and current project data to the items. Data coming from various sources (directories, legacy databases) plus NC data became connected and visible in the single version of the truth.
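Conceptually, the item-centric model can be sketched as below (identifiers and file names invented): the item is the anchor, and data from any source attaches to it instead of living in folder structures.

```python
# Illustrative sketch of the item-centric shift: the item is the placeholder,
# and files from any source (CAD, NC, legacy folders) attach to it.
item = {
    "item_id": "ITM-00087",
    "description": "Bracket assembly",
    "revision": "B",
    "attached_data": [
        {"source": "CATIA",  "file": "ITM-00087.CATProduct"},
        {"source": "NC",     "file": "bracket_op10.nc"},
        {"source": "legacy", "file": "//server/projects/2006/bracket.dwg"},
    ],
    "ebom_children": ["ITM-00091", "ITM-00092"],
}
# Anyone in the company queries the item, not a folder structure,
# and sees one version of the truth across all connected sources.
print([d["file"] for d in item["attached_data"]])
```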
In the second phase of moving to PLM, the focus was on the EBOM and MBOM. Initially, this meant assuring that, from the designer’s point of view, the CATIA design and the EBOM were connected as smoothly as possible, trying to avoid a lot of administrative overhead for the designer (sometimes unavoidable – see my previous post: Where is my ROI, Mr. Voskuil).
After having implemented a streamlined CATIA – EBOM connection, the focus moved to the MBOM. For me, this is the differentiator between companies implementing PLM or just Product Data Management. Implementing the MBOM requires a culture change, and this is where the ERP people need to see the benefits instead of the threats. Luckily, at Aerosud the manufacturing engineers were initially working in their Excels and not in the ERP system – which happens a lot in older companies.
For that reason, the concept of the MBOM in PLM was much better understood. Now Aerosud is experiencing these capabilities, and once they become obvious to everyone, the third bullet will be addressed: people start to work in cross-departmental processes instead of optimizing their department with a specific tool.
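A minimal, hypothetical sketch of the EBOM-to-MBOM step discussed above: the same parts are regrouped by manufacturing operations, and process materials appear that the EBOM never contained.

```python
# Minimal, hypothetical sketch of deriving an MBOM from an EBOM: the structure
# is regrouped by how the product is built, and process materials are added.
ebom = {
    "BRACKET-ASSY": ["BRACKET", "COVER", "SCREW", "SCREW"],
}

mbom = {
    # Manufacturing groups the parts into an operation-oriented structure ...
    "BRACKET-ASSY": {
        "OP10-PREP":  ["BRACKET", "PRIMER"],          # PRIMER exists only in the MBOM
        "OP20-FINAL": ["COVER", "SCREW", "SCREW"],
    },
}
# Same parts, different structure, and extra process materials -- which is why
# the MBOM cannot simply be a copy of the EBOM in the ERP system.
print(sorted(p for op in mbom["BRACKET-ASSY"].values() for p in op))
```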
As this activity will continue, I also conducted an ROI assessment with the Aerosud management and PLM implementation team. Estimates of the experienced and projected benefits were kept low and on the realistic side. The result was an ROI period of approximately 27 months, almost the same as the throughput time of the whole project. This proved again the statement about a phased PLM approach: the payback of the project comes in parallel with the implementation and will ultimately fund the next steps.
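For readers who want to see the arithmetic behind such a number: a simple payback calculation looks like the sketch below. The figures are purely illustrative (not Aerosud’s actuals), but they show how a roughly 27-month ROI period falls out.

```python
# Hypothetical payback calculation -- the figures are illustrative only.
investment = 450_000        # total implementation cost (licenses + services)
annual_benefit = 200_000    # conservative estimate of yearly savings

payback_months = investment / (annual_benefit / 12)
print(f"Payback period: {payback_months:.0f} months")  # -> 27 months
```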
At the end of July, I will be holding a webinar with more details about this implementation for the Dassault VAR Community. I will be happy to expand this information for a wider audience afterwards, as I believe the project is representative of many mid-market companies that struggle to find the place where PLM fits … and brings ROI.
Let me know if you are interested in this follow-up, and I will collect the inputs.
In the past year, I shared with you my thoughts around PLM. Most of the posts were based on discussions with customers, implementers, resellers and peers around the world. I learned a lot and will keep on learning, I assume, as PLM has many aspects:
– the products, there are many products with the label PLM
– the concept, how do we interpret PLM per industry
– the customers, what do they want to achieve, without buzz-words
– the world, people and economic trends sometimes drive us to irrational decisions
In this post I will give an overview from the 2008 posts, categorized by topic. I am looking forward to further suggestions in the comments if you are interested in more depth in certain areas. In parallel I will continue to share my experiences and provide an overview of best-practices and terminology experienced in the PLM space.
PLM concepts
Managing the MBOM is crucial for PLM
Is there a need for classification – and how should it be done?
Is the PLM concept applicable for mid-market companies too?
What will happen with PLM – looking towards 2050
PLM and ERP
PLM and ERP – the culture change, continued
Connecting PLM and ERP – part 1, part 2, part 3
PLM and ROI
Implementing PLM is too costly?
Implementing PLM takes too long?
Why implement PLM next to an ERP system?
How is PLM different from CAD data management?
Economical crisis creates the opportunity for change
Business Process Change
PLM in SMB requires a change in thinking
The management is responsible to initiate a change towards PLM
The change in automotive/aero supply chains to more advanced partners
How will mid-market companies pick up the benefits from implementing PLM?
Experiences
European Enovia Customer Conference (ECC)
PLM in Greece – does it exist?
Is the concept for PLM mature enough?
Don’t expect a bottom up PLM implementation to become successful
Conclusion
I would like to conclude with a quote from my favorite scientist, who taught us everything is relative, however:
“We can’t solve problems by using the same kind of thinking we used when we created them.”
Looking forward to your feedback, and best wishes for 2009!
Jos Voskuil
The past month I was involved in two ENOVIA SmarTeam projects, both with the target of becoming the company’s PLM system. However, the way these projects were executed led to the conclusion that the first one was probably going to fail as a PLM system, whereas the second project was going to be successful.
Only by looking back at the history of the first implementation did it become clear what prevented it from being implemented as a PLM system. It had everything to do with a bottom-up versus a top-down approach. I guess ENOVIA SmarTeam is one of the few products that allows a customer to choose between bottom-up and top-down.
Jim Brown’s post was somehow in line with this observation, but judge for yourself.
Most classical PLM systems require a top-down approach as the PLM scope requires departments to work in a different way and to enforce a change on the organization. Organizational change usually only happens top-down based on the vision of the management.
ENOVIA SmarTeam, however, has the option to be implemented as a CAD data management system, managing the product data in the form of documents. This brings a lot of value to the engineering department, and depending on the PLM awareness of the company, they might try to replace the Excel-based Bill of Materials with a BOM inside the system. As we are working in the scope of engineering, this is in most cases the Engineering BOM.
There are also other CAD data management systems that claim to be an enterprise PDM system, as they manage the product data (usually only the native CAD data) and the engineering BOM. As these systems do not contain the capabilities to become an enterprise PLM system, it will be clear for the organization where to position them – and to keep them in the engineering department.
There are engineering managers in mid-market companies who have the PLM vision, and this was the case in the first implementation I mentioned. As the initial mission was to manage the product data based on SolidWorks and AutoCAD, the company decided that ENOVIA SmarTeam was the best multi-CAD data management solution for the company. Meanwhile, the engineering manager had the hope (or dream) that once this implementation was completed, all other departments would stand in a queue to get connected to ENOVIA SmarTeam too………
… and this did not happen. Why?
The main reason was that, by the time the management understood the PLM benefits and considered implementing PLM, SmarTeam had been implemented too much as an engineering solution – too rich in functionality (and complexity) to be used and integrated by other departments. But when the company was looking at a PLM extension of their ERP system, the engineers refused to work with that system, as in their opinion it did not support their needs.
How could this be prevented?
This was done exactly in the second project. Here, too, the implementation started in the engineering department, but from the start it was clear to the management that they would extend the implementation towards a full cross-departmental PLM implementation. The main difference was that the implementation was not focused on satisfying the designers; from the start, it was clear ENOVIA SmarTeam should be useful for other departments too. This implied less customization of the existing product and more standard functionality. Yes, the designers had to change their way of working, as they had worked file-based before. But as the focus of the implementation was always on providing data access across the organization, the system remained attractive for the production planning and manufacturing people. It was not an engineering tool only.
Additionally, the standard ENOVIA SmarTeam system required adaptations to working methods from all departments, but as it was not heavily customized, it was much easier to extend the scope beyond engineering.
So what is the conclusion:
- Do not try to build the ultimate engineering solution as step 1 in a PLM project. Stick to the core capabilities.
- Keep the focus on storing information in such a way that it becomes usable for departments outside engineering. This requires less detailed data and more reporting capabilities
- Do not hide from the management the intention that ENOVIA SmarTeam can become the company’s PLM system. Make the management aware of this, but also explain the benefits of a step-by-step implementation, starting with engineering and expanding when the time is ripe
- It would not be the first time that ENOVIA SmarTeam was the best-kept secret from the management. The engineering department was happy, but no one made the effort to explain the full capabilities to the top management
And now a small advertisement at the end.
The ENOVIA SmarTeam Express offering allows a customer to start design-centric (SDE = SmarTeam Design Express) and to extend the scope step by step, applying engineering capabilities from Concept to Manufacturing (SNE = SmarTeam Engineering Express), guiding a bottom-up implementation.
The last two weeks I spent at two events for the automotive industry: first the SAE event in Chicago, and this week the COE Automotive event in Detroit, giving a lecture on the future possibilities of a supply chain in a web 2.0 (PLM 2.0) world. For many of the lower-tier suppliers in the automotive supply chain, this seems far from their daily business. I guess one of the issues here is that these companies are used to solving their problems per department, without having a corporate vision or strategy for where the company should be five years from now.
And here I see many challenges (in Europe we would call them possible problems). As the smaller mid-market companies try to solve their problems per department, you will find, all around the world, bright engineering managers who conclude that their company needs PLM. As they understand all the engineering challenges, they understand that in order to really know what their department is doing, they should work in a different way than file-based.
This is what companies working file-based think
When working file-based, companies rely on the following main contributors for getting information (in order of importance):
- we do not need these expensive solutions for PLM etc …
- the most important is the experienced engineer who knows what has been done in the past and where to possibly find it
- the company directory structure which allows everyone to find and store data related to a customer, project or product
- the file name of the designs and documents which ‘exactly’ describes what’s inside the file
You just need to follow this order and you will always find the right information (or be close to it).
…and these are the issues they do not tell you:
- I guess we really do not know what to do with PLM, as we never studied what it would mean for our company
- we cannot bypass our experienced engineers – although at a certain moment they will retire, currently they would feel very insecure if we tried to collect their explicit knowledge and make it available for all. They would feel their jobs are less secure
- there are some issues with this directory structure. Sometimes someone deletes or overwrites a file that we need, and of course we are not sure if all the data we need is really there. We always need to double-check with the people to be sure – and sometimes it hurts, but we are used to it
- people are so creative that only they understand what is in their own files, and even from the file name, which can be long, we do not fully understand where it fits, what its status is and where else it is used
Seeing these two opposite messages, we need to understand the challenges for these companies in the near future.
Challenges for these companies
The current workforce is aging all around the world. I recently read that although many believe China is the next promising country for the future, due to the one-child-per-family policy of the past, it will also face in the near future (10-20 years) the same problems Europe and the US will have.
A huge part of the population will retire, and especially in Europe and the US, a lot of real knowledge will disappear with this retirement. The new generation will come with different skills, a different background and a different attitude to engineering. And due to this difference in attitude, there is little or no communication between these generations.
So if you are an (aging) manager in a mid-market company in an automotive supply chain, you have two options to react:
- you become fatalistic and believe that the new world is bad, and you cling as long as possible to the old habits you are familiar with
or
- you understand that every few decades a change in the way of working is required, which means moving away from the traditional knowledgeable people with their files towards an internal, knowledge-sharing environment where everyone has access to understand what exists and in which status it is.
So only one conclusion
Survival for the future requires a change in the way these companies are working. It reminds me of the boiling frog story: we do not see the world changing around us until it is too late. I guess human beings should be more clever than frogs, as they are able to collect information from outside their ‘pan’.
Working with ENOVIA SmarTeam solutions, in particular the Design Express solution, I learned that this solution is an excellent entry point for moving away from file-based work towards data management.
Still not convinced? Challenge me by adding a comment (public exposure) or send me a private email for a one-to-one discussion.
As there are many engineering managers who believe they understood the issue and started to implement a PLM solution in their department, I will address in my next post the challenges they face with this bottom-up approach to convince the company that PLM is unavoidable.
This week was full of discussions with customers and VARs (Value Added Resellers) around PLM, PDM and implementation approaches, and I will come back to this topic in an upcoming post. First I want to continue the series on reasons why companies believe they should not implement PLM.
The 5 reasons not to implement PLM I heard the most were:
- The costs for a PLM implementation are too high
- A PLM implementation takes too long
- We already have an ERP system
- Isn’t PLM the same as managing CAD files ?
- We are so busy, there is no time to have a PLM implementation in our company
And now we have reached #4.
4. Isn’t PLM the same as managing CAD files?
As most of our customers do not have the time to study all the acronyms that exist in our business, it is understandable that this leads to different interpretations than expected. In non-academic language, I will roughly outline the differences.
In the eighties, when most of the mid-market companies designed their products in 2D, bigger enterprises were investing in 3D CAD. In parallel, these companies were working on concepts to manage all their engineering data in a central place. EDM (Engineering Data Management) was the word in fashion at that time. We have to realize that networks were not as affordable as nowadays and that there was no Internet. It was the first concept to centralize and manage engineering data (files – no paper drawings). An EDM system was of course a system purely for the engineering department.
More and more companies started to expand the scope of the data managed; it became the central place to store product-related information plus an infrastructure to collaborate on product data. The acronyms PDM (Product Data Management) and cPDM (collaborative Product Data Management) came into fashion in the nineties. A PDM system still focuses on the engineering department, but now multi-discipline and, where available, across dispersed locations.
Around 2000, the focus of PDM was again expanded to other departments in the company working on the product in different lifecycle stages. Instead of a static data management environment, the target became connecting all departments working on the product throughout its lifecycle. With all departments connected, the focus could switch to the process. The acronym PLM (Product Lifecycle Management) was introduced, and this created a lot more areas of interest:
- connecting the bidding phase and concept phase with feedback from production and the field.
- bringing the sourcing of parts and suppliers forward in the product lifecycle
- testing and planning on a virtual product
- and more
But what should be clear from the scope of PLM compared to PDM and EDM is that it has become a cross-departmental approach, not only a system to enhance the way engineering departments work.
PLM is a strategic approach to enable innovation, better portfolio management and response to the market. The focus is on changing the traditional way of working into an approach where the process is as lean as possible while still providing the flexibility to adapt to global changes – changing customer demands, changing business situations.
Overview
EDM – focus mainly on centralizing mechanical design data in an engineering department (mainly files)
PDM – focus mainly on centralizing product-related data in an engineering department (files, BOMs, etc.)
PLM – focus on the product development lifecycle across departments and locations (files, BOMs, processes, resources)
Conclusion
No, it is not the same. Where managing CAD files is mainly an engineering-department activity that can be solved by a product, PLM is a cross-organization approach that requires a PLM system as an enabler to implement various best practices.
This time a short post, as I am off to the ECCAP (September 9-10) to meet customers, implementers and peers from all around ENOVIA.
Adiosu
Last week I was in Greece together with the Dassault Systèmes Value Added Reseller OVision. From the first sentence, everyone would expect I was on holiday. Yes, I agree, the setting was holiday-like: always temperatures above 35 °C (approx. 100 °F) and never far from the sea.
However,……
….we were visiting ENOVIA SmarTeam prospects and discussing existing customer-specific implementations, wearing business suits – not shorts. The most interesting point, however, was that we were working with companies in the early stages of data management.
If you look around the world and, to my understanding, rank countries on PLM awareness and the need for data management, I would rank Western Europe, Scandinavia and Japan as the countries where the concepts for PLM are understood, although in many mid-market companies I would still expect, in the long term, a culture change towards real PLM. In my previous posts, I addressed several thoughts on that.
North America and the United Kingdom I rank differently, as somehow there are big PLM implementations, but the majority of mid-sized companies are suppliers in an OEM network or see no return on investment from a PLM implementation.
Then I would rank countries like Turkey, South Africa, India and China at the next level. As they participate in the manufacturing of global companies – mainly automotive and aerospace – they are driven into the basic needs of PDM as a requirement from the OEMs. This pushes in parallel the country’s infrastructure – Internet / intranet availability.
In fourth position, I would rank a country like Greece. As, due to the local economy, there is no focus on manufacturing or a large participation in a global supply chain, they have to introduce their data management, growing slowly towards PDM or PLM on a still-developing infrastructure.
Disclaimer: Countries not mentioned here can fall into any of the above categories (or even below). The fact that I did not mention them is because I do not have enough experience working with these countries to judge.
Back to Greece
Apparently, due to all the beautiful islands in Greece, there are thousands of ferries traveling from island to island or to other Mediterranean destinations. For that reason, there are companies that build ships, companies that refurbish ships and companies that maintain ships.
In the end, a ferryboat can be seen as a single process plant. As in a plant, you have equipment that needs to be operational and maintained during operation.
This requires a well-defined form of data management, often driven by quality processes around ISO 900x.
Companies often consider quality processes as a kind of document management. You have your manuals with procedures, templates spread around the company, and you update them before the next audit. Everyone is supposed to follow the procedures and supposed to know the latest procedures.
This is a labor-intensive activity if you want to execute it as well as possible. In companies where the cost of labor is an issue, you will see that most people are loaded with work, and quality is usually the last activity they address: first the operational issues, then the rest.
To improve the quality of the information, document management and workflow functionality address the availability of documents, with the workflow ensuring that information is pushed and published in a guaranteed manner.
Instead of pushing the information to all the users, the company is now able to centralize the data, and users can pull the latest information from the system. The workflow processes and the document management system guarantee that the right steps are followed and that you are always looking at the latest versions. You are also aware of ongoing changes.
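To make this push-versus-pull idea concrete, here is a minimal sketch in plain Python – with hypothetical class and status names, not the SmarTeam API – of a document store where a workflow controls the status and users always pull the latest released version:

```python
from dataclasses import dataclass, field

# Allowed workflow transitions: a document must pass review before release.
TRANSITIONS = {"draft": ["in_review"], "in_review": ["released", "draft"]}

@dataclass
class DocumentVersion:
    number: int
    content: str
    status: str = "draft"

@dataclass
class Document:
    name: str
    versions: list = field(default_factory=list)

    def new_version(self, content: str) -> DocumentVersion:
        # Each change creates a new version; nothing is overwritten.
        version = DocumentVersion(len(self.versions) + 1, content)
        self.versions.append(version)
        return version

    def promote(self, version: DocumentVersion, new_status: str) -> None:
        # The workflow guarantees the right steps are followed.
        if new_status not in TRANSITIONS.get(version.status, []):
            raise ValueError(f"cannot go from {version.status} to {new_status}")
        version.status = new_status

    def latest_released(self):
        # Users pull the latest released version instead of relying
        # on pushed copies that may already be outdated.
        released = [v for v in self.versions if v.status == "released"]
        return released[-1] if released else None
```

A user asking for latest_released() can never end up with a draft or an overwritten copy – exactly the guarantee a shared directory structure cannot give.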
When it comes to ships, however, there is more to address than ISO documentation and procedures. The ship itself has maintenance or refurbishing projects running on certain systems or locations in the ship. Here the advantages of a PDM system like ENOVIA SmarTeam appear. In the ENOVIA SmarTeam data model you are able to manage information (CAD documents and Bills of Materials too) related to a project, to a ship, or to a location or system in the specific ship. There is no need for keywords on the document to describe where it applies, or for copies of a document because it applies to several ships. The data model below shows the types of information that can be stored around a ship.
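As an illustration, a minimal sketch of such a data model could look as follows – hypothetical Python classes, not the actual ENOVIA SmarTeam schema – where one document is linked to projects, ships and systems instead of being copied and renamed:

```python
from dataclasses import dataclass, field

@dataclass
class System:
    # A location or system within a ship, e.g. the engine room.
    name: str

@dataclass
class Ship:
    name: str
    systems: list = field(default_factory=list)

@dataclass
class Project:
    # A maintenance or refurbishing project on one or more ships.
    name: str
    ships: list = field(default_factory=list)

@dataclass
class Document:
    # A CAD document, BOM or procedure; linked, never copied.
    title: str
    projects: list = field(default_factory=list)
    ships: list = field(default_factory=list)
    systems: list = field(default_factory=list)

# One pump manual applies to two ferries: a single document with
# two links, instead of two renamed copies on a file share.
engine_room = System("engine room")
ferry_a = Ship("Ferry A", systems=[engine_room])
ferry_b = Ship("Ferry B", systems=[engine_room])
manual = Document("Cooling pump manual",
                  ships=[ferry_a, ferry_b],
                  systems=[engine_room])
```

Finding all documents for a given ship or system then becomes a query over these links rather than a search through file names.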
Once the company has the vision of what to achieve in the upcoming years, a roadmap can be defined. Keeping user understanding and flexibility, while still continuing the move towards the PDM data model, are the parameters for management to monitor and drive. Companies that build or refurbish ships of course have an even greater need to integrate their engineering activities with the ship’s maintenance data. This avoids a costly hand-over of data that could already be available in the right format.
Conclusion: Although Greece is in the fourth rank of PLM needs and awareness, the benefits to gain from PLM are there too; however, due to awareness and infrastructure, they are not as visible as in the countries ranked first.
As Greece is the birthplace of many sciences, I am sure that the awareness of where to apply PLM concepts is something they will achieve.