
In my previous post, the PLM blame game, I briefly mentioned that there are two delivery models for PLM. One approach is based on a PLM system that contains predefined business logic and functionality, promoting out-of-the-box (OOTB) use as much as possible, somehow driving toward a certain rigidness. The other approach requires the PLM capabilities to be developed on top of a customizable infrastructure, providing more flexibility. I believe this debate has been going on for more than 15 years without a decisive conclusion. Therefore, I will take you through the pros and cons of both approaches, illustrated by examples from the field.

PLM started as a toolkit

The initial cPDM/PLM systems were toolkits for several reasons. In the early days, scalable connectivity was not available or way too expensive for a standard collaboration approach. Engineering information, mostly design files, needed to be shared globally in an efficient manner, and the PLM backbone was often a centralized repository for CAD-data. Bill of Materials handling in PLM was often at a basic level, as either the ERP-system (mostly Aerospace/Defense) or home-grown BOM-systems (Automotive) were in place for manufacturing.

Depending on the business needs of the company, the target was to connect as many engineering data sources as possible to the PLM backbone – PLM originated from engineering and is still considered by many people as an engineering solution. For connectivity, interfaces and integrations needed to be developed at a time when application integration frameworks were primitive and complicated. This made PLM implementations complex and expensive, so only the large automotive and aerospace/defense companies could afford to invest in such systems. And a lot of tuition fees were spent to achieve results. Many of these environments are still operational as they became too risky to touch, as I described in my post: The PLM Migration Dilemma.

The birth of OOTB

Around the year 2000, the first OOTB PLM offerings appeared. There was Agile (later acquired by Oracle) focusing on the high-tech and medical industries. Instead of document management, they focused on the scenario of bringing the BOM from engineering to manufacturing based on a relatively fixed scenario – therefore fast to implement and fast to validate. The last point, in particular, is crucial in regulated medical environments.

At that time, I was working with SmarTeam on the development of templates for various industries, with a similar mindset. A predefined template would lead to faster implementations and therefore reduce the implementation costs. The challenge with SmarTeam, however, was that it was very easy to customize, based on Microsoft technology and wizards for data modeling and UI design.

This was not a benefit for OOTB-delivery as SmarTeam was implemented through Value Added Resellers, and their major revenue came from providing services to their customers. So it was easy to reprogram the concepts of the templates and use them as your unique selling points towards a customer. A similar situation is now happening with Aras – the primary implementation skills are at the implementing companies, and their revenue does not come from software (maintenance).

The result is that each implementer considers another implementer as a competitor and they are not willing to give up their IP to the software company.

SmarTeam resellers were not eager to deliver their IP back to SmarTeam to get it embedded in the product, as it would reduce their unique selling points. I assume the same happens currently in the Aras channel – it might be called Open Source; however, probably only the high-level infrastructure is truly shared.

Around 2006, many of the main PLM vendors had their various mid-market offerings, and I contributed at that time to the SmarTeam Engineering Express – a preconfigured solution that was rapid to implement if you wanted it to be.

Although the SmarTeam Engineering Express was an excellent sales tool, the resellers that started to implement the software began to customize the environment as fast as possible in their own preferred manner. For two reasons: first, the customer most of the time had different current practices, and second, the money came from services. So why say No to a customer if you can say Yes?

OOTB and modules

Initially, the mid-market templates of the leading PLM vendors were not aiming only at the mid-market. All companies wanted to have a standardized PLM system with as few customizations as possible. This meant the PLM vendors had to package their functionality into modules, sometimes addressing industry-specific capabilities, sometimes areas of interfaces (CAD and ERP integrations), or generic governance capabilities like portfolio management, project management, and change management.

The principle behind the modules was that they needed to deliver data model capabilities combined with business logic/behavior. Otherwise, the value of the module would not be relevant. And this causes a challenge: the more business logic a module delivers, the more the company that implements the module needs to adapt to generic practices. This requires business change management; people need to be motivated to work differently. And who is eager to make people work differently? Almost nobody, as it is an intensive coaching job that cannot be done by the vendors (they sell software), often cannot be done by the implementers (they do not have the broad set of skills needed), nor by the companies (they do not have the free resources for that). Precisely the principles behind the PLM Blame Game.

OOTB modularity advantages

The first advantage of modularity in PLM software is that you only buy the software pieces that you really need. However, most companies do not see PLM as a journey, so they agree on a budget to start, and then every module that was not identified before becomes a cost issue. The main reason is that the implementation teams focus on delivering capabilities at that stage, not on providing value-based metrics.

The second potential advantage of PLM modularity is that these modules are supposed to be complementary, as they should have been developed in the context of each other. In reality, this is not always the case. Yes, the modules fit nicely on a single PowerPoint slide; however, in reality they are separate systems with a minimum of integration with the core. Still, the advantage is that the PLM software provider now becomes responsible for the upgradability and extendibility of the provided functionality, which is a serious point to consider.

The third advantage of the OOTB modular approach is that it forces the PLM vendor to invest in your industry and future capabilities, for example, digital twins, AR/VR, and model-based ways of working. Some skeptical people might say PLM vendors create problems to solve that do not exist yet; optimists might say they invest in imagining the future, which can only happen by trial-and-error. In a digital enterprise, it is: think big, start small, fail fast, and scale quickly.

OOTB modularity disadvantages

Most of the OOTB modularity disadvantages become advantages in the toolkit approach and are therefore discussed in the next paragraph. One downside of the OOTB modular approach is the disconnect between the people developing the modules and the implementers in the field. Often modules are developed based on the experiences of a few leading customers (the big ones), whereas the majority of usage in the field targets smaller companies where people have multiple roles – the typical SMB situation. SMB implementations are often not visible at the PLM vendor R&D level, as they are hidden behind the Value Added Reseller network and/or usually too small to become apparent.

Toolkit advantages

The most significant advantage of a PLM toolkit approach is that the implementation can be a journey. You start with a clear business need, for example in modern PLM, creating a digital thread, and once this is achieved, you dive deeper into areas of the lifecycle that require improvement. And increased functionality is only linked to the number of users, not to extra costs for a new module.

However, if the development of additional functionality becomes massive, you run the risk that low license costs are nullified by development costs.

The second advantage of a PLM toolkit approach is that the implementer and users will have a better relationship in delivering capabilities and therefore, a higher chance of acceptance. The implementer builds what the customer is asking for.

However, as Henry Ford supposedly said: "If I had asked my customers what they wanted, they would have asked for faster horses."

Toolkit considerations

There are several points where a PLM toolkit can be an advantage but also a disadvantage, very much depending on various characteristics of your company and your implementation team. Let’s review some of them:

Innovative: a toolkit does not immediately provide an innovative way of working. The toolkit may have an infrastructure to deliver innovative capabilities, even as small demonstrations; however, the implementation and the methodology to realize this innovative way of working need to come from either your company’s resources or your implementer’s skills.

Uniqueness: with a toolkit approach, you can build a unique PLM infrastructure that makes you more competitive than the others. Don’t share your IP and best practices if you want to stay more competitive. This approach can be valid if you truly have a competitive plan here. Otherwise, the risk is that you are creating a legacy for your company that will slow you down later on.

Performance: this is a crucial topic if you want to scale your solution to the enterprise level. I spent a lot of time in the past analyzing and supporting SmarTeam implementers and template developers on their journey to optimize their solutions. Choosing the right algorithms and making the right data modeling choices are crucial.

Sometimes I came across situations where the customer blamed SmarTeam because customizations were possible – you can read about such an example in an old LinkedIn post: the importance of a PLM data model.
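To make this tangible, here is a minimal, hypothetical sketch in plain Python (no real PLM toolkit API is assumed, and the part names are invented). Both functions explode the same BOM, but the naive one issues one database round-trip per item – the classic N+1 pattern that hurts customized toolkit implementations at enterprise scale – while the batched one issues only one round-trip per BOM level.

```python
# A tiny in-memory stand-in for a PLM database: parent -> list of children.
BOM = {
    "bike": ["frame", "wheel", "wheel"],
    "wheel": ["rim", "spoke", "hub"],
    "frame": ["tube", "tube"],
}

query_count = 0

def fetch_children(parents):
    """Simulate one database round-trip fetching children for a set of parents."""
    global query_count
    query_count += 1
    return {p: BOM.get(p, []) for p in parents}

def explode_naive(item):
    """One round-trip per item: cost grows with the number of BOM lines (N+1)."""
    result = []
    for child in fetch_children([item])[item]:
        result.append(child)
        result.extend(explode_naive(child))
    return result

def explode_batched(item):
    """One round-trip per BOM level: cost grows with BOM depth only."""
    result, level = [], [item]
    while level:
        children_map = fetch_children(set(level))
        next_level = [c for p in level for c in children_map[p]]
        result.extend(next_level)
        level = next_level
    return result

query_count = 0
naive = sorted(explode_naive("bike"))
naive_queries = query_count            # one query per BOM line

query_count = 0
batched = sorted(explode_batched("bike"))
batched_queries = query_count          # one query per BOM level

print(naive == batched)                # same exploded BOM either way
print(naive_queries, batched_queries)  # far fewer round-trips when batched
```

The same trade-off appears in any customizable PLM toolkit: the data model and traversal strategy you choose early determine whether the solution still performs once the BOM has thousands of lines.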

Experience: when you plan to implement PLM “big” with a toolkit approach, experience becomes crucial, as initial design decisions and scope are significant for future extensions and maintainability. Beautiful implementations can become a burden after five years when design decisions were not documented or analyzed. Having experience or an experienced partner/coach can help you in these situations. In general, it is rare for a company to have experienced PLM implementers internally, as implementing PLM is not their core business. Experienced PLM implementers vary in size and skills – make the right choice.


Conclusion

After writing this post, I still cannot give a final verdict on which is the best approach. Personally, I like the PLM toolkit approach, as I have been working in the PLM domain for twenty years, seeing and experiencing good and best practices. The OOTB approach represents many of these best practices and is therefore a safe path to follow. The undecided points are who the people involved are and what your business model is. It needs to be an end-to-end coherent approach, no matter which option you choose.


In this post, I will explain the story behind my presentation at PI PLMx London. You can read my review of the event here: “The weekend after ……” and you can find my slides on SlideShare: HERE.

For me, this presentation is the conclusion of a thought process and a collection of experiences built up over the past three to five years, related to the challenges digital transformation creates for PLM and what makes it harder to go through compared to other enterprise business domains. So here we go:

Digital transformation or disruption?

Slides 2 (top image) through 5 deal with the common challenges of business transformation. In nature, the transformation from a caterpillar (old linear business) to a butterfly (modern, agile, flexible) has the cocoon stage, where the transformation happens. In business, unfortunately, companies cannot afford a cocoon phase; the change needs to happen in parallel.

Human beings are not good at change (slides 3 & 4), and the risk is that a new technology or a new business model will disrupt your business if you are too confident – see examples from the past. The disruption theory introduced by Clayton Christensen in his book The Innovator’s Dilemma is an excellent example of how this can happen. Some of my thoughts are in The Innovator’s dilemma and generation change (2015).

Although I know some PLM vendors consider themselves disruptors, I give them no chance in the PLM domain. The main reason: the existing PLM systems are so closely tied to the data they manage that switching from one PLM system to a more modern one does not pay off. The data models are so diverse that it is better to stay with the existing environment.

What is clear for modern digital businesses is that if you can start from scratch, or with almost no legacy, you can move forward faster than the rest. But only if supported by strong leadership, an (understandable) vision, and relentless execution.

The impression of evolution

Marc Halpern’s slide presented at PDT 2015 is one of my favorite slides, as it maps business maturity to various characteristics of an organization, including the technologies used.


Slides 7 through 18 zoom in on the terms Coordinated and Connected and the implications they have for data, people, and business. I have written about Coordinated and Connected recently: Coordinated or Connected (2018).

A coordinated approach – delivering the right information at the right moment in the proper context – is what current PLM implementations try to achieve: allowing people to use their own tools/systems as long as they deliver their information (documents/files) at the right moment as part of the lifecycle/delivery process. Very linear and not too complicated to implement, you would expect. However, it is difficult! Here we already see the challenge of just aligning a company to implement a horizontal flow of data. Usability of the PLM backbone and optimized silo thinking are the main inhibitors.

In a connected approach – providing actual information for anyone connected, in any context – the slide on the left shows the mental picture we need to have for a digital enterprise. Information coming from various platforms needs to be shareable and connected in real-time, leading, in particular for PLM, to a switch from document-based deliverables to models and parameters that are connected.

Slide 15 has examples of some models.  A data-driven approach creates different responsibilities as it is not about ownership anymore but about accountability.

The image above gives my PLM-twisted vision of the five core platforms for an enterprise. The number FIVE is interesting, as David Sherburne just published his Five Platforms that Enable Digital Transformation, and in 2016 Gartner identified five domains for the digital platform – more IT-twisted? But remember, the purpose of digital transformation is: FIVE!

From Coordinated to Connected is Digital Transformation

Slides 19 through 27 further elaborate on the fact that for PLM no evolutionary approach is possible when going from a Coordinated technology towards a Connected technology.

For three reasons: different types of data (documents vs. database elements), different people (working in a connected environment requires modern digital skills), and different processes (the standard methods for mechanical-oriented PLM practices do not match the processes needed to deliver systems (hardware & software) with an incremental delivery process).

Due to the incompatibility of the data, more and more companies discover that a single PLM instance cannot support both modes – staying with your existing document-oriented PLM system does not give you the capabilities needed for a model-driven approach. Migrating the data from a traditional PLM environment towards a modern data-driven environment does not bring any value: the majority of the coordinated data is neither complete nor of the right quality to use in a data-driven environment. Note: in a data-driven environment you do not have people interpreting the data – the data should be correct for automation/algorithms.
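A small illustration of this note, with entirely hypothetical records (no real PLM system or schema is assumed): in the coordinated world a human can interpret the free-text attributes below; an algorithm cannot, which is why migrating such data as-is into a data-driven environment brings no value.

```python
# Hypothetical document-era part records, as often found in coordinated
# PLM environments. A human reading "see drawing" knows what to do;
# an algorithm does not.
legacy_records = [
    {"part": "BRKT-01", "material": "Steel", "mass": "2.4 kg"},
    {"part": "BRKT-02", "material": "see drawing", "mass": ""},        # human-only hint
    {"part": "BRKT-03", "material": "ALU / alu6061?", "mass": "TBD"},  # ambiguous
]

def machine_ready(record):
    """A record qualifies for automation only if every value is usable as data."""
    try:
        float(record["mass"].split()[0])  # mass must parse as a number
    except (ValueError, IndexError):
        return False
    # material must be a plain value, not an instruction to a human reader
    return record["material"].isalnum()

ready = [r["part"] for r in legacy_records if machine_ready(r)]
print(ready)  # only the first record survives without human interpretation
```

The point of the sketch: a data-driven environment needs typed, complete, unambiguous values, and a large share of coordinated legacy data fails that test.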

The overlay approach, mentioned several times in various PLM blogs, is an intermediate solution. It provides traceability and visibility between different data sources (PLM, ALM, ERP, SCM, …). However, it does not make the information in these systems more accessible.

So the ultimate conclusion is: you need both approaches, and you need to learn to work in a hybrid environment!

What can various stakeholders do?

For the management of your company, it is crucial to understand the full impact of digital transformation. It is not about a sexy customer website, a service platform, or a Virtual Reality/Augmented Reality case for the shop floor or services. When these capabilities are created disconnected from the source (PLM), they will deliver inconsistencies in the long term. The new digital baby becomes another silo in the organization. Real digital transformation comes from an end-to-end vision and implementation. The result of this end-to-end vision will be the understanding that there is a duality in data, in particular for the PLM domain.

Besides the technicalities, when going through a digital transformation, it is crucial for the management to share their vision in a way that becomes a motivational story, a myth, for all employees. As Yuval Harari, writer of the book Sapiens, suggested, we (Homo Sapiens) need an abstract story, a myth, to align a larger group of people to achieve a common abstract goal. I discussed this topic in my posts: PLM as a myth? (2017) and PLM – measurable or a myth?

Finally, the beauty of new digital businesses is that they are connected and can be monitored in real-time. That implies you can check the results continuously and adjust – scale or fail!

Consultants and strategists in a company should also take responsibility to educate the management, and when advising on less transformational steps, like efficiency improvements: make sure you learn and understand model-based approaches and push for data governance initiatives. This will at least narrow the gap between coordinated and connected environments.

This was about strategy – now about execution:

For PLM vendors and implementers, understanding the incompatibility of data between current PLM practices – coordinated – and future practices – connected – will lead to different business models. Where traditionally a new PLM vendor started with a rip-and-replace of the earlier environment – no added value – now it is about starting a new parallel environment. This implies no more big replacement deals, but rather a long-term, strategic, and parallel journey. For PLM vendors, it is crucial that being able to offer these two modes in parallel allows them to keep their customer base and grow. If they chose coordinated only or connected only, for sure a competitor would serve the other mode in parallel.

For PLM users: an organization should understand that its people are its most valuable resources, realizing these people cannot make a drastic change in their behavior. People will adapt within their capabilities, but do not expect a person who grew up in the traditional ways of working (linear/analogue) to become a successful worker in the new mode (agile/digital). Their value lies in transferring their skills and coaching new employees, but do not let them work in two modes. And when it comes to education: permanent education is crucial and should be scheduled – it is not about one or two training sessions per year. If the perfect training existed, why would students go to school for several years? Why not give them the perfect PowerPoint twice a year?

Conclusions

I believe that after three years of blogging about this theme, I have made my point. Let’s observe and learn from what is happening in the field – I remain curious about and focused on proof points and new insights. This year I hope to share with you new ideas related to digital practices in all industries, of course all associated with the human side of what we once started to call PLM.

Note: Oleg Shilovitsky just published an interesting post this weekend: Why complexity is killing PLM and what are future trajectories and opportunities? Enough food for discussion. One point: the fact that consumers want simplicity does not mean PLM will become simple – working in the context of other information is the challenge – it is human behavior – team players are good at anticipating, big egos are not. To be continued…


This is the moment of the year to switch off from the details. No more talking and writing about digital transformation or model-based approaches. It is time to sit back and relax. Two years ago I shared the PLM Songbook; now it is time to watch one or more movies. Here are my top five favorite PLM movies:

Bruce Almighty

Bruce Nolan, an engineer in Buffalo, N.Y., is discontented with almost everything in the company despite his popularity and the love of his draftswoman Grace. At the end of the worst day of his life, Bruce angrily ridicules and rages against PLM, and PLM responds. PLM appears in human form and, endowing Bruce with divine powers of collaboration, challenges Bruce to take on the big job to see if he can do it any better.

A movie that makes you modest and realize there is more than your small ecosystem.


The good, the bad and the ugly

Blondie (The Good PLM consultant) is a professional who is out trying to earn a few dollars. Angel Eyes (The Bad PLM Vendor) is a PLM salesman who always commits to a task and sees it through, as long as he is paid to do so. And Tuco (The Ugly PLM Implementer) is a wanted outlaw trying to take care of his own hide. Tuco and Blondie share a partnership together making money off Tuco’s bounty, but when Blondie unties the partnership, Tuco tries to hunt down Blondie. When Blondie and Tuco come across a PLM implementation loaded with dead bodies, they soon learn from the only survivor (Bill Carson – the PLM admin) that he and a few other men have buried a stash of value on a file server. Unfortunately, Carson dies, and Tuco only finds out the name of the file server, while Blondie finds out the name on the hard disk. Now the two must keep each other alive in order to find the value. Angel Eyes (who had been looking for Bill Carson) discovers that Tuco and Blondie met with Carson and knows they know the location of the value. All he needs is for the two to ..

A movie that makes you realize that it is a challenging journey to find the value out of PLM. It is not only about execution – but it is also about all the politics of people involved – and there are good, bad and ugly people on a PLM journey.

The Grump

The Grump is a draftsman in Finland from the past, a man who knows that everything used to be so much better in the old days. Pretty much everything that’s been done after 1953 has always managed to ruin The Grump’s day. Our story unfolds as The Grump opens a 3D model on his computer, hurting his brain. He has to spend a weekend in Helsinki to attend model-based therapy. Then the drama unfolds…….

A movie that makes you realize that progress and innovation do not come from grumps. In every environment where you want to change the status quo, grumps will appear. With the exciting Finnish atmosphere, a perfect film for Christmas.

Deliverance

The Cahulawassee River Valley company in Northern Georgia is one of the last analog companies in the state, which will soon change with the imminent implementation of a PLM system, breaking down silos everywhere. As such, four Atlanta city slickers – alpha male Lewis Medlock, generally even-keeled Ed Gentry, slightly condescending Bobby Trippe, and wide-eyed Drew Ballinger – decide to implement PLM in one trip, with only Lewis and Ed having experience in CAD. They know going in that the area is ethnoculturally homogeneous and isolated, but don’t understand the full extent of it until they arrive and see what they believe is the result of generations of inbreeding. Their relatively peaceful trip takes a turn for the worse when, halfway through, they encounter a couple of hillbilly moonshiners. That encounter not only makes the four battle their way out of the PLM project intact and alive but also threatens their relationships as they do.

This movie, from 1972, makes you realize that in the early days of PLM, starting a big-bang implementation journey into an area that is not ready for it can be deadly for your career and friendships. Not suitable for small children!

Diamonds Are Forever or Tron (legacy)

James Bond’s mission is to find out who has been drawing diamonds, which are appearing on blogs. He adopts another identity in the form of Don Farr. He joins up with CIMdata and acts as if he is developing diamonds, but everyone is hungry for these diamonds. He also has to avoid Mr. Brouwer and Mr. Kidd, the dangerous couple who do not leave anyone in their way when it comes to model-based. And Ernst Stavro Blofeld isn’t out of the question. He may have changed his looks, but is he linked with the V-shape? And if he is, can Bond finally defeat his ultimate enemy?

Sam Flynn, the tech-savvy 27-year-old son of Kevin Flynn, looks into his father’s disappearance and finds himself pulled into the same world of virtual twins and augmented reality where his father has been living for 20 years. Along with Kevin’s loyal confidant Quorra, father and son embark on a life-and-death journey across a visually stunning cyber universe that has become far more advanced and exceedingly dangerous. Meanwhile, the malevolent program IoT, who dominates the digital world, plans to invade the real world and will stop at nothing to prevent their escape.

I could not decide about number five. The future is bright with Boeing’s new representation of Systems Engineering, see my post on CIMdata’s PLM Europe roadmap event where Don Farr presented his diamond(s). However, the future is also becoming a mix of real with virtual and here Tron (legacy) will help my readers to understand the beauty of a mixed virtual and real world. You can decide – or send me your favorite PLM movies.

Note: All movie reviews are based on IMDb.com storylines, and I thank the authors of these storylines for their contribution and hope they agree with the PLM-related twist. Click on the image to find the full details and the original review.

Conclusion

2018 has been an exciting year with a lot of buzzwords, combined with the reality that the current PLM approach is incompatible with the future. How can we address this issue further in 2019? First at PI PLMx 2019 in London (be there – the last chance to meet people in the UK while they are still Europeans – and share/discuss plans for the upcoming year).

Wishing you all the best during the break and a happy and prosperous 2019


If you are reading blogs related to PLM, I am sure you have seen a blog post from Stephen Porter (Zero Wait State), for example: The PLM state: the walking dead – PLM projects that never end.

Like Stephen, I am often triggered by an inspiring book, a touching movie, or a particular song, and combined with my PLM-twisted brain, I relate the content to PLM (there is no official name for this abnormality yet).

When driving home last week, I was listening to Phil Collins – In the Air Tonight.

As I was just coming back from a discussion about PLM tools, BOMs, and possible PLM expansion strategies in a company with customers and resellers, my twisted brain was thinking about two PLM-related topics that were in the air tonight (at least for me).

Granularity vs. Integration: Suites vs. Best-in-class PLM

You must have noticed it, and if not, now you are aware: Jim Brown and Chad Jackson started a PLM duel discussion platform at Engineering.com to bring PLM-related topics to the table: TECH4PD. Watch them argue, and I hope that with your feedback and the feedback of the PLM community, it will help you make up your mind.

The topic they discussed in their first session was about two different approaches you can take for PLM: either start from a best-in-class PLM platform or build your PLM support by using dedicated applications and integrating them.


This is, in my opinion, one of the fundamental PLM topics to discuss. And if I had to vote (as Jim and Chad ask you to do), I would first vote for Chad (integration of software) and, after a second thought, vote for Jim (best-in-class PLM). So you see my problem.

If I relate the discussion to my experiences with different companies, I realize that probably both answers are correct. In case you are an OEM, you would likely benefit from a best-in-class PLM platform, as PLM systems aim to cover and integrate all data through the product lifecycle in a single system, single data model, etc. So a good PLM platform would have the lowest cost of ownership in the long term. OEMs are by definition not the smallest companies and in general have the highest need for global coverage.

But not every company is an OEM. Many mid-market companies are specific suppliers, serving different OEMs, and although they also develop products, it is in a different context of market delivery. As their products used by OEMs might become obsolete in the near term, they need to be more flexible and reactive, and the best-in-class companies innovate and are proactive. For that reason, they do not want to invest in a best-in-class PLM system, which somehow brings some rigidness, but prefer to keep optimizing those areas where improvement is needed in their organization, instead of changing the organization.

I believe this question will remain in the air until we get a clear split between these two types of PLM. There is a trend of splitting classic PLM (OEM-oriented) from new upcoming PLM solutions. Until that time, we will be confused by the two approaches. It is a typical PLM disease, and the reason you do not experience the same discussion for ERP is obvious: ERP is much more a linear process that, both for OEMs and mid-market companies, aims to manufacture products or goods at a single location. The differentiation is in global manufacturing: where do you manufacture your products? Here the OEMs might have the bigger challenge. Global manufacturing is a PLM challenge too, which is in the air.

Where is the MBOM ?

This is the most visited topic in my blog, and I am preparing a session with the MBOM as theme, combined with PLM, for the upcoming PLM Innovation US conference at the end of October in Atlanta. I am not going to disclose all the content here, but I will give you some thoughts that are in the air.

Companies historically manage their BOM in ERP, but as a result of globalization, they now need to manage their manufacturing BOM at different locations. Each location has its own ERP and a local (M)BOM. What to do?

This is in the air:

  • How do you keep the relation with the original engineering intent, the product information, and the various local activities?
  • Is the ERP system still the place to build the MBOM?
  • Do you need an EBOM and MBOM in PLM?
  • Can we not have a single BOM?
  • What about search technology?
  • …..
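To illustrate the first questions, here is a minimal sketch with hypothetical part numbers and structures (no real PLM or ERP system, data model, or API is assumed): one engineering definition (EBOM) with plant-specific manufacturing views (MBOMs) that restructure the content differently, while each MBOM line keeps a reference back to the EBOM items it realizes – preserving the link to the engineering intent.

```python
# One engineering definition (EBOM): assembly -> list of engineering items.
EBOM = {
    "PUMP-100": ["HOUSING-10", "IMPELLER-20", "SEAL-30"],
}

# Each plant restructures the same engineering content for its own process.
# Plant A makes most items itself; plant B buys the housing and seal as a
# pre-assembled kit from a local supplier.
MBOM_PLANT_A = {
    "PUMP-100": [
        {"item": "HOUSING-10",  "source": "make", "ebom_ref": "HOUSING-10"},
        {"item": "IMPELLER-20", "source": "make", "ebom_ref": "IMPELLER-20"},
        {"item": "SEAL-30",     "source": "buy",  "ebom_ref": "SEAL-30"},
    ],
}
MBOM_PLANT_B = {
    "PUMP-100": [
        # one purchased kit covering two EBOM items
        {"item": "HSG-SEAL-KIT", "source": "buy",
         "ebom_ref": ["HOUSING-10", "SEAL-30"]},
        {"item": "IMPELLER-20",  "source": "make", "ebom_ref": "IMPELLER-20"},
    ],
}

def covered_ebom_items(mbom, assembly):
    """Collect which EBOM items a plant's MBOM accounts for."""
    covered = set()
    for line in mbom[assembly]:
        refs = line["ebom_ref"]
        covered.update(refs if isinstance(refs, list) else [refs])
    return covered

# Both plants fully cover the engineering definition, although their
# manufacturing structures differ.
for mbom in (MBOM_PLANT_A, MBOM_PLANT_B):
    assert covered_ebom_items(mbom, "PUMP-100") == set(EBOM["PUMP-100"])
print("both local MBOMs trace back to the complete EBOM")
```

The sketch shows why an EBOM/MBOM relation in PLM (rather than only local MBOMs in each ERP) is attractive: the check at the end is only possible because every local MBOM line carries its engineering reference.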

I hope you will participate in both discussions that are in the air, either by commenting on this blog, through Tech4PD, your own blog (Oleg? 😉), or your participation at PLM Innovation US.

Looking forward to discussing with you what you believe is in the air tonight.


Conclusion (as usual): It is a busy time – we are heading towards the end of the year, which for some reason is a deadline for many companies. So no long thought processes this time, just what is in the air.

In my last post, PLM kills Innovation or not, I tried to provoke PLM vendors to respond to my claim that PLM focuses too much on structuring data (and thereby removing freedom), claiming it blocks innovation, as everyone believes innovation requires freedom and flexibility. This statement is often heard from startups, claiming that implementing any type of management would kill their competitive advantage. Still, in the PLM marketing world everyone mentions PLM and Innovation as Siamese twins, but no one explains explicitly why they are connected.

So not too many reactions from vendors, but some interesting comments from others on this post. Andrew Mack mentions that we should not confuse Innovation and Invention, as for native English speakers there is a clear distinction. I agree with him; however, as most of my blog readers are not native English speakers, I will explain the difference in this post.

For me it is clear that PLM supports Innovation in three different manners, which I will explain here in a logical order – see the conclusion for the order of benefit they bring:

Invention Discovery

Invention is the creation of a new idea that might be the golden egg for the future of a company. It is often the result of one or more individuals – not something a systematic approach or system will bring automatically. If you look at how big companies handle invention, you see that often they do not manage it. They look around the world for – or sometimes are approached by – startups that have a concept that fits their portfolio, and they buy the company and the concept.

This is of course a very disconnected way of invention, but on the other hand, the drive of many startups is to work day and night to develop a concept and ultimately sell the company for a good price. Compare it to the big soccer clubs that have only money (currently mainly Russian or Arabic) but no youth development program of their own to raise new talent. So it is a common way for companies to acquire invention (and promote innovation).

But I believe there is also a way companies can stimulate invention: by implementing the modern way of PLM (PLM 2.0 – see my posts on that) and not using PLM as an extended PDM, as I described in PLM – what is the target. When a company has implemented PLM in a PLM 2.0 approach, there is full visibility and connection of all product data, customer demands (through sales) and experiences (through service) for an R&D department to innovate with.

Why does this not happen so much?

Because inside most companies, people do not have an approach or drive to share data throughout the whole product lifecycle. Every department optimizes itself, not taking into account the value and the overall company needs, as they are not measured on that. To support invention, PLM can provide an R&D department and individuals with all related market and customer information in order to create relevant inventions. So PLM helps here in understanding the areas of invention – probably the most unexplored area of PLM.

 

Support selection of the right invention

 

The second area where PLM contributes to innovation is assisting companies in selecting the opportunities that can be their next big product. If you have many opportunities, which one would you select and invest in? As it is usually unaffordable to invest in every opportunity, and knowing that at this stage you are not sure whether a particular opportunity will lead to a profitable new product, you need a process and a tool to select the right ones.

 

Here portfolio management comes in: functionality that allows companies to have an overview of all running initiatives and, through reporting on key performance indicators (KPIs), to select the opportunities in which to invest.
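As a toy illustration of KPI-driven portfolio selection – with invented initiative names, scores and weights, not taken from any portfolio management product – the selection step could be sketched as:

```python
# Hypothetical portfolio selection: score each initiative on weighted
# KPIs and pick the top candidates for investment.

WEIGHTS = {"expected_revenue": 0.5, "strategic_fit": 0.3, "risk": -0.2}

initiatives = [
    {"name": "Opportunity A", "expected_revenue": 8, "strategic_fit": 6, "risk": 7},
    {"name": "Opportunity B", "expected_revenue": 5, "strategic_fit": 9, "risk": 2},
    {"name": "Opportunity C", "expected_revenue": 9, "strategic_fit": 4, "risk": 8},
]

def score(initiative):
    # Weighted sum of KPIs; risk carries a negative weight.
    return sum(weight * initiative[kpi] for kpi, weight in WEIGHTS.items())

# Rank the portfolio, best score first, and select the top two.
ranked = sorted(initiatives, key=score, reverse=True)
selected = [i["name"] for i in ranked[:2]]
print(selected)  # → ['Opportunity B', 'Opportunity A']
```

A real portfolio dashboard does essentially this across many more KPIs and stage gates; the point is that the ranking is explicit and repeatable instead of a gut-feeling decision.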


Support New Product Introduction

Once you have selected an opportunity and, as part of the portfolio management process, feel secure about it, there is the third step: how to bring this opportunity to the market as fast as possible, with the right quality and the right manufacturing definition? Being first on the market gives you market share and premium pricing.

Also, as changes in the early manufacturing stage and later during the go-to-market phase are extremely costly, it is important to bring a new product to the market as fast as possible with the right quality, avoiding changes once the new product is in the market. This is the area where PLM contributes the most: allowing R&D organizations to work on their virtual product definition and perform simulations, design and customer verifications, and to anticipate and resolve compliance and sourcing issues in the early stages of product development. All this assures a reduction in the number of iterations before a new product is ready to 'hit' the market.


 

A famous PLM one-liner is: PLM – doing it right the first time. It refers to the fact that the product introduction process is done only once and with the right quality. It does not mean that iterations to improve or change the product scope are not needed.

 

Improvement cycles are necessary to bring a product to the market. But as they are done in the virtual world, the R&D department has the option to evaluate several alternatives (virtually), work on and improve them until the best option is selected for the market, saving the cost of late design changes or errors to be solved. And even once the product is defined, PLM can help by defining the right generic manufacturing process and making it available to the local manufacturing organizations (where is the MBOM?).

 

Conclusion

PLM does not kill innovation, and although PLM vendor marketing is not very explicit about it, there are three areas where PLM supports Innovation. In a (subjective) order of priority I would say:

· New Product Introduction – bringing the highest revenue advantages for a selected invention

· Invention Discovery – providing R&D with a 360° view of their customers and market landscape, enabling inventions to happen in your company

· Portfolio Management – assisting in selecting the right opportunities to focus on

 


Your thoughts ?

This time a little provocative title, which I hope will trigger some feedback from those who claim PLM brings INNOVATION. In one of my earlier posts, I talked about PLM – what is the target and also acted as an advocate for innovation.

Now, after hearing PLM and INNOVATION mentioned everywhere during the past two weeks as a logical combination, I started to ask myself: is this just marketing?
In this post I will play the devil's advocate.

Two weeks ago I attended the PLM Innovation congress, where first of all these two words were accepted by the audience as obviously linked: PLM INNOVATION. I participated in three sessions where INNOVATION was the topic. Charles Gagnon from Hydro Quebec talked about open innovation, but mentioned Hydro Quebec did not use any tool to manage INNOVATION – so no PLM either. Peter Fassbender talked about the innovative approach of connecting to the outside world using crowdsourcing and social media, but again no mention of PLM. Finally Christian Verstraete held an appeal for INNOVATION, urging everyone to think outside the box (did he mean outside the PLM box? Anyway, no mention of PLM's contribution).

Autodesk's CEO Carl Bass talked about The New Rules of INNOVATION at TEDx, and in the first part of his speech he makes the same statement that I would make related to PLM – see video:

Innovation – the work of the individual

Some quotes from his speech:

“Innovation is fundamentally not a corporate phenomenon. Innovation involves taking risks and involves breaking the rules. And companies aren't particularly good at that. In fact let's say that it is just the opposite: companies are good at making rules and minimizing risk”

and quoting the author of The Innovator´s Dilemma Clayton Christensen:

“The lack of innovation is not the failure of companies but rather the result of prudent and sound management”

Autodesk

A week later Autodesk PLM 360 was launched. Again a PLM system that brings rules and structure, but apparently so far not affecting INNOVATION, as we can see in one of the videos that came with the launch.

What the experts say.

Unfortunately the above video is only a teaser to get to the Autodesk Facebook page where, if you like them, you will be rewarded with access. A modern way of marketing: only if you like us, we will tell you what we like.

Sanjeev Pal believes PLM is a business strategy that helps companies reach their INNOVATION goals. Tom Grant starts with: “INNOVATION is really the word to focus on. At the heart of PLM is INNOVATION”. Later he states that probably one of the problems of PLM is the M (the Management part), which shifts minds in the wrong direction (against user acceptance and involvement), and he prefers to call it enablement instead of management. Somehow the conclusion is that PLM supports INNOVATION by bringing products faster to market. Does this mean PLM is the vehicle for bringing new innovations to the market, instead of being a platform for INNOVATION?

STATS: Autodesk PLM + Innovation: 199.000 hits on Google

Once you have struggled like me to find the roundtable discussion and its content, let's look at the other PLM vendors in alphabetical order:

Aras

Aras INNOVATOR – the word INNOVATOR is already in the name, but when you read more closely what is stated on the Aras website, you see the word INNOVATOR targets mainly themselves (the software / the delivery model) instead of customer-oriented INNOVATION. There the message is more about streamlining and connecting people and businesses (efficiency / collaboration). So not much INNOVATION related to PLM here, is my conclusion.

STATS: Aras + Innovation: 11.200.000  hits on Google

Dassault Systèmes

When looking for Dassault Systèmes and INNOVATION I found an interesting statement on their website.

Dassault Systems launched its Passion for Innovation program in 2005. The program is based on a simple guiding principle: it is often the case that outstanding ideas do not come to fruition due to lack of appropriate resources. At Dassault Systèmes, all employees are free to install CATIA on their workstation. The idea of the program is to provide this opportunity to everyone

Is INNOVATION a result of the CAD tool? Not related to PLM? I am sure there must be a better story – but where is it? There is a lot of talk about innovation, but is it related to PLM?

STATS: Dassault Systèmes + Innovation: 1.340.000 hits on Google

Oracle Agile PLM

Oracle Agile PLM has a clear statement how they support INNOVATION:

Accelerate innovation through ideation management and collaboration, product portfolio management and analytics, data consolidation and cleansing, and a rich enterprise product record

I am not sure from this statement whether it was written by marketing or is a serious attempt to describe how Agile supports INNOVATION. I would love to see a refined statement here that I can understand.

STATS: Oracle PLM  + Innovation: 150.000 hits on Google

PTC

PTC does not give a direct association with INNOVATION. When I searched for PTC and INNOVATION, the first suggestion was: Did you mean HTC INNOVATION?

Also when searching the PTC website, INNOVATION was hard to find. Interestingly, I noticed that the first main tab on the left was Discover our software capabilities. I was expecting that PTC, like most PLM vendors, would start from the business and not from the products.

STATS: PTC + Innovation 3.450.000 hits on Google

SAP PLM

On the SAP PLM website I found a tab called Innovation Management, and here SAP PLM gave a clear explanation of which practices contribute to INNOVATION Management. SAP mentions:

  • Strategy and planning
  • Managing innovative ideas
  • Program and project management
  • Portfolio management

As I expected, SAP does not hint in any direction towards CAD tools, and their focus is mainly on the management side. I would love to learn more about the managing-innovative-ideas part, as this is the challenging part. Ideas and management?

STATS: SAP PLM + Innovation 131.000 hits on Google

Siemens PLM

On the Siemens PLM website you have to search for INNOVATION, and when you do, you are mainly directed to blog articles. The word cloud next to the blog did not show the word INNOVATION in bold, showing it is not a commonly used word. Digging deeper, I found a blog post related to an Innovation Leadership Summit, which again suggests there is a relation between PLM and INNOVATION. The closest match I found was:

The HBR survey found that enterprises rely on PLM and IT to manage all this complexity, including new sustainability and regulatory requirements. PLM solutions  track the ideation process, monitor progress, identify laggard projects, and facilitate collaboration. Leading organizations leverage PLM to improve new product development processes and outcomes

That's all. So my conclusion here is that Siemens PLM also does not naturally connect PLM and Innovation.

STATS: Siemens PLM + Innovation 901.000 hits on Google

Common rumors


Without mentioning names, I hear stories from PLM implementations (or should I call them extended PDM implementations) that have created such a massive lock-in on the current state of the company that changing the processes or innovating is almost mission impossible. Exactly what Clayton Christensen describes in his Innovator's Dilemma.

So, having played the devil's advocate role, I hope I have made my point that there is no real relation between PLM and INNOVATION, despite these two words being mentioned together as if they were linked.

Call for action

Therefore I challenge all vendors and companies that have a proven relation between PLM and INNOVATION to come to this debate and make their statement.

Where are the mythbusters that will crack the statement:

PLM has nothing to do with INNOVATION?

Looking forward to your responses.



I am just back from an exciting PLM Innovation 2012 conference. With a full program and around 250 participants, it was two intensive days of PLM interaction.

What I liked the most is that the majority of the audience was focusing on PLM business related topics. The mood of PLM has changed.

In this post, I will give an impression of the event, how I experienced it without going into the details of each session.

Several interesting sessions ran in parallel, so I could not attend them all, but MarketKey, the organizer of the conference, confirmed that all presentations were filmed and will become available online for participants. So more excitement to come.

First my overall impression: compared to last year's conference there was more focus on PLM business issues and less on PLM IT or architecture issues (or was that my perception?).

DAY 1

Gerard Litjens (CIMdata Director European Operations) opened the conference, as CIMdata co-hosted it. In his overview he started with CIMdata's PLM definition – PLM is a strategic business approach (everyone has his own definition, as Oleg noticed too). Next he presented what CIMdata sees as the hottest topics. No surprises here: extension of PLM into new industries, extending PDM towards PLM, integration of social media, cloud, open source, enterprise integration and compliance.

The next speaker, Thomas Schmidt (Vice President, Head of Operational Excellence and IS – ABB's Power Products Division), challenged the audience with his keynote speech: PLM: Necessary but not sufficient. With this title it seemed the force was against him (thanks Oleg for sharing).

Thomas explained that ABB's challenge is being a global company while at the same time acting as a 'local' company everywhere around the world. In this perspective he placed PLM as part of a bigger framework to support operational excellence and presented some major benefits of a platform approach. I believe the Q&A session was an excellent way to connect Thomas's initial statements to the PLM-focused audience.

Marc Halpern from Gartner gave his vision on PLM. Marc also started with the Gartner definition of PLM, which characterizes PLM as a discipline. Gartner identified the following five major trends: software everywhere in products; usage of social media for product development and innovation; using analytics tools to support the whole product lifecycle – after sales, service; connecting to the customer; and opportunities to deliver existing products through services (media content, transportation).

Next I attended the Autodesk session, a PLM journey using the cloud, where I was eager to learn their approach towards PLM. Autodesk (Mike Lieberman) let Linda Maepa, COO of Electron Vault in the USA, explain the benefits of the Autodesk PLM 360 solution. Electron Vault, a young high-tech company, implemented the solution within two weeks. And here I got disconnected, also when the suggestion was raised that you do not need time to specify the requirements for the system (old-fashioned stuff).

I suddenly fell into a trance and saw a TV advert for a new washing powder with numerous features (program management, new product introduction, …) that washed whiter than all the others, and a happy woman telling it to the world. I believe if Autodesk wants to be taken seriously in the PLM world, it should also work with existing customers and manage the change in these organizations. Usually it already takes more than two weeks to get them aligned and agreed on the requirements. Unfortunately I did not have time during the breaks to meet Autodesk at their booth, as I would love to continue the discussion about reality – my experience and focus is on mid-market companies. Waiting for a next opportunity.

After Autodesk, I presented in my session the main drivers for making the case for PLM. I also started with my favorite PLM definition (a collection of best practices – 2PLM) and explained that PLM starts with the management vision and targets for the future. Is it about efficiency, quality, time to market, knowledge capture, or the more challenging task of creating the platform for innovation?

Next I followed the Energy track, where I listened to Charles Gagnon from Hydro Quebec, who gave an interesting lecture called: Implementing Open Innovation and Co-Development.

At first glance this is a sensitive topic: when you innovate it is all about creating new intellectual property, and there is the fear that when working with partners the IP might leave the company. Charles explained how this process of collaborative innovation was started and monitored. At the end he reported they measured a significant gain in perceived R&D value when working with external partners. And they did not use a PLM system to manage Innovation (to be investigated how they could survive).

After lunch I continued with Jonas Hagner from WinWinD, a young manufacturer of windmills targeted to operate in extreme climate conditions (a niche market). They are implementing PLM and ERP in parallel and did not have to suffer from years of ERP before PLM, and therefore could have a more balanced discussion around part information availability, part numbers and more. Still, I believe they face the challenge of connecting the servicing of the windmills back to their R&D organization in an efficient manner, to complete the full PLM circle.

Karer Consulting, together with Siemens Energy, presented the case of how they designed and started to implement the interface between their PLM system (Teamcenter) and ERP system (SAP). What was disappointing to see was that the interface between Teamcenter and SAP was relatively complex (bi-directional, with engineering activities on both sides): almost 1½ years of development for this interface, and one of the main reasons was that SAP came first and they start the engineering order in SAP.


Apparently, 2 years later Siemens Energy could no longer implement a clear separation between PLM and ERP and will now have to live with this complex interface. In the past I have written several times about this complexity that companies seem to accept for political or historical reasons. A sad story for PLM – Where is the MBOM?


The day finished with a closing keynote from Peter Bilello, explaining what a successful PLM implementation could look like: many wise statements that everyone should follow if you want to come to a successful implementation (and define correctly what success is).

Thanks to Autodesk we had a nice evening reception, discussing and evaluating the first day with peers.

Day 2

Day 2 started for me with an interesting lecture from Peter Fassbender, Head of the Fiat Design Center Latin America, describing how in Brazil the Fiat Mio experiment used modern social media techniques – crowdsourcing, communities and user involvement – to guide the innovation and development of a potential car. A unique experiment demonstrating that this type of project influences the brand reputation positively (if managed correctly), and for me an example of what PLM could bring if R&D were connected to the outside world.

Christian Verstraete, Chief Technologist – Cloud Strategy at HP, gave an inspiring session about the open frontiers of innovation. The speed of business has increased dramatically in the past 30 years (you need to be from an older generation to be aware of this – the definition of response time has changed due to new technologies). Christian pushed everyone to think out of the box and be innovative, which made me wonder how long companies will keep building standard, boring products. Will we keep on innovating at this amazing pace, as we did in the past 30 years?

Graeme Hackland, IT/IS Director of the UK-based Lotus F1 team, presented the challenges an F1 team faces every year due to changing regulations. I visited Lotus F1 last year and was impressed by the fact that over 500 engineers all work around one car per year, optimizing it mainly for aerodynamics, but also assuring it performs throughout the year. Thousands of short interactions and changes to be implemented a.s.a.p. challenge the organization to collaborate in an optimal manner. And of course this is where PLM contributes. All the F1 fans could have continued to dream and listen to Graeme's stories, but Jeremie Labbe from Processia brought us back to earth by explaining how Processia assisted Lotus F1 with a PLM value assessment as a next step.

Meanwhile I had some side discussions on various PLM topics and went back to the sessions to see David Sherburne, Director of Global R&D Effectiveness at Carestream Health, present his case (open source PLM) and his analysis of why an open source PLM model (based on Aras) is very appealing in their situation. Indeed, the perceived business value and significantly lower operational costs for the software are appealing for his organization and will surely influence the other PLM vendors in their pricing models.

Pierfrancesco Manenti from IDC Manufacturing Insights gave a clear presentation indicating the future direction for PLM: managing operational complexity, not product complexity. As you would expect from IDC Manufacturing Insights, it was all well grounded in surveys of the manufacturing industry, clearly indicating that companies still have a lot to do to share and work efficiently around a common product development and operational platform. New technologies (the four IT forces: mobility, cloud, social business and big data analytics) will help them improve.


The closing keynote came from Jason Spyromilio, who was director of the European Southern Observatory's Very Large Telescope (http://www.eso.org), and he gave us insight into designing (and building) the biggest eye on the sky. The precision challenges for such a huge telescope mirror, built in the high mountains of Chile in an earthquake-sensitive area, demonstrate that all participants are required to contribute their IQ to realize such a challenge.

Conclusion: this PLM Innovation 2012 event doubled the 2011 event in all dimensions. Thanks to the sponsors, the organization and the high-quality lectures; I expect next year we could double again – in participants, in content and in innovation. It shows PLM is alive. But coming back to the title of this post: I saw some interesting innovation concepts – now how to enable them with PLM?

Note: looking at the pictures in this post you will notice PLM is everywhere. I published this post on February 29th – a unique day which happens only every 4 years. In May this year my blog will be 4 years old.

Last week I started my final preparation for the PLM Innovation Congress 2012 on February 22nd and 23rd in Munich, where I will speak about Making the Case for PLM. I am looking forward to two intensive days of knowledge sharing and discussion.

The thought came to my mind that when you make the case for PLM, you must also be clear about what you mean by PLM. And here I started to struggle a little. I have my perception of PLM, but I am also aware that everyone has a different perception of the meaning of PLM.

I wrote about it last year, triggered by a question in the CMPIC group (configuration management) on LinkedIn. The question was: Aren't CM and PLM the same thing? There was a firm belief among some of the members that PLM was the IT platform to implement CM.

A few days ago Inge Craninckx posted a question in the PDM PLM CAD network group about the definition of PLM, based on a statement from the PLMIG. In short:

“PDM is the IT platform for PLM.” Or, expressed from the opposite viewpoint: “PLM is the business context in which PDM is implemented.”

The response from Rick Franzosa caught my attention and I extracted the following text:

The reality is that most PLM systems are doing PDM, managing product data via BOM management, vaulting and workflow. In that regard, PDM [read BOM management, vaulting and workflow], IS the IT platform for the, in some ways, unfulfilled promise of PLM.

I fully agree with Rick's statement, and coming back to my introduction about making the case for PLM, we need to differentiate how we implement PLM. We also have to keep in mind that no vendor – so also no PLM vendor – will undersell their product. They are all promising 😉

Two different types of PLM implementation

PLM originally started in 1999 by extending the reach of product data outside the engineering department. However, besides just adding extra functionality to extend coverage of the lifecycle, PLM also created the opportunity to do things differently. And here I believe you can follow two different definitions and directions for PLM.

Let's start with the non-disruptive approach, which I call the extended PDM approach.

Extended PDM

When I worked 6 years ago with SmarTeam on the Express approach, the target was to provide an OOTB (out-of-the-box) generic scenario for mid-market companies. The main messages were around quick implementation and extending CAD data management with BOM and workflow. Several vendors at that time promoted their quick-start packages for the mid-market, all avoiding one word: change.

I was a great believer in this approach, but the first benchmark project that I governed demonstrated that if you want to do it right, you need to change the way people work, and this takes time (it took 2+ years). For the details, see A PLM success story with ROI from 2009.


Cloud-based solutions have now become the packaging for this OOTB approach, enriched with ease of deployment – no IT investment needed (and everyone avoids the word change again).

If you do not want to change too much in your company, the easiest way to make PDM available to the enterprise is to extend this environment with an enterprise PLM layer for BOM management, manufacturing definition, program management, compliance and more.

Ten years ago, big global enterprises started to implement this approach, using local PDM systems mainly for engineering data management and a PLM system for the enterprise. See the picture below:


This approach has now been adopted by the Autodesk PLM solution, and Aras is marketing itself in the same direction. You have a CAD data management environment and, without changing much in that area, you connect the other disciplines and lifecycle stages of the product lifecycle by implementing an additional enterprise layer.

The advantage of this approach is that you get a shared and connected data repository of your product data, which you can extend with common best practices – BOM management (all the variants: EBOM/MBOM/SBOM, …) – but also connect the market opportunities and the customer (portfolio management, systems engineering).
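To illustrate the idea of one shared repository serving several BOM variants, here is a tiny sketch with invented part numbers and a deliberately simplified data layout – a hypothetical illustration, not the data model of any of the systems mentioned. Each structure link is tagged with the views it belongs to, so EBOM, MBOM and SBOM are filters over one connected structure instead of three disconnected BOMs:

```python
# Hypothetical sketch: one shared product structure from which different
# BOM views (EBOM / MBOM / SBOM) are filtered.

LINKS = [
    # (parent, child, qty, views this link belongs to)
    ("BIKE", "FRAME", 1, {"EBOM", "MBOM", "SBOM"}),
    ("BIKE", "WHEEL", 2, {"EBOM", "MBOM"}),
    ("BIKE", "GREASE-KIT", 1, {"MBOM"}),   # consumable: manufacturing only
    ("BIKE", "SPARE-TUBE", 2, {"SBOM"}),   # service part: after-sales only
]

def bom_view(product, view):
    """Return the children of `product` as seen in one BOM view."""
    return [(child, qty) for parent, child, qty, views in LINKS
            if parent == product and view in views]

print(bom_view("BIKE", "EBOM"))  # the engineering definition
print(bom_view("BIKE", "SBOM"))  # what the service organization sees
```

The point of the sketch is only that the views stay connected: a change to a shared item is visible in every view, which is what the enterprise layer promises over separate departmental BOMs.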

The big three – Dassault Systèmes, Siemens PLM and PTC – provide the above functionality as a complete set, either as a single platform or as a portfolio of products (check the difference between marketing and reality).

Oracle and SAP also fight for the enterprise layer from the ERP side, providing their enterprise PLM functionality as an extension of their ERP functionality – also here in two different ways: as a single platform or as a portfolio of products. As their nature is efficient execution, I would position these vendors as the ones that drive for efficiency in a company, assuming all activities can somehow be scheduled and predicted.

My statement is that extended PDM leads to more efficiency and more quality (as you standardize your processes), and for many companies this approach is a relatively easy way to get into PLM (extended PDM). If your company exists because it brings new products quickly to the market, I would start my implementation from the PDM/PLM side.

The other PLM – innovative PLM


Most PLM vendors associate the word PLM in their marketing language with Innovation. In the previous paragraph I avoided the word Innovation on purpose. How do PLM vendors believe they contribute to Innovation?

This is something you do not hear much about. Yes, in marketing terms it works, but in reality? Only a few companies have implemented PLM in a different way, most of the time because they do not carry years of history, numbering systems, or standard procedures to consider or change. They can implement PLM in a different way because they are open to change.

If you want to be innovative, you need to implement PLM in a more disruptive manner, as you need to change the way your organization is triggered – see the diagram below:


The whole organization works around the market, the customer. Understanding the customer and the market needs at every moment in the organization is key to making a change. For me, an indicator of innovative PLM is the way concept development is connected with the after-sales market and the customers. Is there a structured, powerful connection in your company between these people? If not, you are doing extended PDM, not innovative PLM.

Innovative PLM requires a change in business, as I described in my series on PLM 2.0. Personally I am a big believer that this type of PLM is the lifesaver for companies, but I also realize it is the hardest to implement, as you need people who have the vision and power to change the company. And as I described in my PLM 2.0 series, the longer a company exists, the harder it is to make a fundamental change.

Conclusion

There are two main directions possible for PLM: the first and oldest approach, which is an extension of PDM, and the second, a new customer-centric approach driving innovation. It is your choice to make the case for one or the other, based on your business strategy.

Looking forward to an interesting discussion – and see you in Munich, where I will make the case.

[Image: PLM Innovation 2012]


For the past six months I have been involved in several discussions related to the (building) construction industry. If you look at this industry, it seems to be one of the few industries without innovation in its processes.

Someone in the discussion even claimed that if a worker from the Middle Ages came back to this century, he would quickly adapt and understand the way people work. OK, there are some new tools and materials, but the way the building construction industry works has not changed.

And let’s look at productivity. Where productivity has increased in all industries over the past 60 years, I have seen a survey showing that productivity in this industry has not increased and has even decreased a little.

[Image: construction productivity survey chart]

Although the survey ends in 2003, another article caught my attention. Robert Prieto, Senior Vice President of Fluor Corporation, wrote his viewpoint at the end of last year in Engineering News Record: Engineering-Construction Needs a New Model. Reading this article and its comments demonstrates there is a need for innovation in the building construction industry.

Failure costs of up to 15% and delayed deliveries are considered normal business in this industry. If this were applied to mid-market companies in the manufacturing industry, they would have gone bankrupt due to claims and lost profit.

If we look at this industry, the first excuse you hear is that every project is unique and that project execution is done by a group of loosely connected suppliers, not really pushed to stay within the targeted budget. But you might ask yourself: what is the correct budget?

I noticed that in this industry, when a project is estimated, suppliers are asked to deliver their bid and proposed solution based on their own understanding. Usually the lowest bid wins.

All participants are aware that not all requirements are clear, but no one wants to ask questions or invest in more accurate cost estimation. That is not what is expected; it is about winning the bid with the least trouble and investment.

So who is to blame? First of all, the client, who has a short-term vision. By selecting the lowest bids and not pushing for an in-depth analysis of the project delivery and long-term operational costs, the situation will not change.

What if the client were using the basics of PLM – Product Lifecycle Management? For me, PLM means connecting and sharing the concept phase, the delivery phase, the production phase and the maintenance phase.

What I consider strange is the fact that in the engineering and construction industry these four phases are not connected, and that the maintenance phase (operations) is often not taken into account during the concept phase.

And then there is the data handover. After engineering and construction, specific data is handed over to the maintenance organization. The quality of that data, how applicable it is to the maintenance organization and how it supports maintenance are unclear. There is a disconnect and a loss of knowledge, as the handover is based on just the minimum data required.

What if the engineering and construction industry were to use PLM best practices, like:

  • Requirements Management – connecting, implementing and validating all the requirements from each stakeholder, making sure all requirements are considered and negotiated in a structured manner – no excuse for surprises.
  • Data sharing with versions and statuses – instead of a handover, data matures during the lifecycle of the project. It requires the maintenance organization to be involved from the start.
  • Standardized validation and approval processes related to requirements and data – these processes might be considered overhead, but they are the ones that lead to quality, risk and cost management.
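To make these best practices concrete, here is a minimal, illustrative Python sketch of the underlying idea – all class names, status names and the example requirement are my own assumptions, not taken from any specific PLM system. It shows a versioned deliverable whose release is blocked until every linked stakeholder requirement has been validated:

```python
from dataclasses import dataclass, field

# Illustrative lifecycle statuses a shared deliverable passes through
# (assumed names, not from a specific PLM system).
STATUSES = ["In Work", "In Review", "Released"]

@dataclass
class Requirement:
    identifier: str
    text: str
    validated: bool = False  # set by the stakeholder who owns it

@dataclass
class Deliverable:
    """A shared, versioned piece of project data (e.g. a design document)."""
    name: str
    version: int = 1
    status: str = "In Work"
    requirements: list = field(default_factory=list)

    def submit_for_review(self):
        # Data matures in place instead of being handed over at the end.
        self.status = "In Review"

    def approve(self):
        # Standardized validation: release only when every linked
        # requirement has been validated by its stakeholder.
        if not all(r.validated for r in self.requirements):
            raise ValueError("Unvalidated requirements block the release")
        self.status = "Released"

    def revise(self):
        # A new version keeps history instead of overwriting old data.
        self.version += 1
        self.status = "In Work"

# Usage: the maintenance organization's requirement is linked from the start,
# so it cannot be "forgotten" until the handover.
req = Requirement("REQ-001", "Pump room accessible for maintenance")
doc = Deliverable("Pump room layout", requirements=[req])
doc.submit_for_review()
req.validated = True   # the maintenance stakeholder validates
doc.approve()
```

The point of the sketch is the `approve` check: because requirements and data share one lifecycle, a missing validation surfaces during the project, not after the handover.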

Conclusion: I believe connecting the engineering and maintenance phases for engineering and construction companies will lead to higher productivity and quality. For sure the initial engineering costs will be higher, but during the construction and maintenance phases these costs will be recovered, and probably much more – here is the ROI.

As my intention was to write shorter blog posts this year, I will stop at this point and look forward to your comments for further discussion.

YOUR THOUGHTS ??

Sorry for the delay between this post and the previous one. A break with a lot of PLM work on my side and no adverts on your side: win-win. But now I have time to continue the series around PLM 2.0. We are in the middle.

A small recap on the agenda:

First post : What is PLM 2.0? – published Aug 24th
Second post : Challenges in current PLM – published Sept 4th
This post : Change in business – published Oct 3rd
Final post : Why PLM 2.0 – conclusions


In the first post I described the changes in PLM messaging from vendors – PLM 2.0 or similar terms. In the second post I described the current challenges of PLM, which are well known – if you have access to the PLM-related groups on LinkedIn, you will find discussions around the challenges of current PLM. And they set the spirit – good or bad.

Now in this post I will bring up some trends which, in my opinion, must unmistakably lead to a new way of PLM in order to adapt to the future.

Generation Y – a new generation of workers

Generation Y: It is interesting to learn that companies everywhere are complaining or warning that their existing workforce is going to retire with all their knowledge, without a decent follow-up. In parallel they state it is difficult to find new employees with similar skills who will guarantee the future of the company. The new generation of workers, often identified as Generation Y, has different skills and different motivations.

Some interesting generalizations (note I am not a social anthropologist).

The older generations were raised with the concept: Knowledge is Power. You as an individual needed to have in-depth skills to be the right person for a job – a job is your life and for life. As a negative result of this approach, you see that exactly this older generation sometimes ‘sits’ on their knowledge as a kind of job guarantee – they do not like sharing information. “Come to me and I will help you” is their motto until they retire.

Generation Y does not have this job-for-life attitude – they look more for short-term success and fulfillment, and therefore they do not fit so well in the way traditional companies work. They are not the type of knowledge workers previous generations were, but thanks to their skills with modern digital media, they are capable of finding information and combining it into knowledge. They work differently.

The interesting observation from my side is that Generation Y is exactly the type of people PLM requires, as it is all about sharing and combining data. What is blocking their acceptance of current PLM is that the implementation is not architected for their work motivation. Look at:

  • The way information is stored (too structured).
  • The way information is presented (too structured, boring screens).
  • The way information has to be entered into the system (too unfriendly – overkill).

For them, PLM needs to move to a more intuitive way of presenting information, capturing data as if it were something like serious gaming. And the new PLM needs a way to manage structured and unstructured data combined.

Companies that complain they will lose skilled workers in the future should not complain but adapt. They should look forward and solve the problems of the future, which means a different way of doing business and implementing PLM. Do not choose what the dinosaurs did.

New styles of business management

Here I want to come back to my first post – I was intrigued by reading Steve Denning’s posts and their relation to PLM. Through the post Why Amazon Can’t Make a Kindle in the USA, I found the post The Death and Reinvention of Management the best fit with my PLM drive.
Steve describes five fundamental shifts in management that make companies ready for the 21st century.

Take time to read the post (and go more in-depth if you get as enthusiastic as me) – but come back to read the rest of this post.

I summarize/quote the five shifts from Steve here (as I am sure not everyone has done the reading):

1. The company’s goal has to shift to one of delighting clients i.e. a shift from inside-out (“You take what we make”) to outside-in (“We seek to understand your problems and will surprise you by solving them”)

2. The role of the manager has to shift from being a controller to an enabler, so as to liberate the energies and talents of those doing the work and remove impediments that are getting in the way of work.

3. The mode of coordination shifts from hierarchical bureaucracy to dynamic linking, i.e. to a way of dynamically linking self-driven knowledge work to the shifting requirements of delighting clients.

4. There is a shift from value to values; i.e. a shift from a single-minded focus on economic value and maximizing efficiency to instilling the values that will create innovation and growth for the organization over the long term.

5. Communications shift from command to conversation: i.e. a shift from top-down communications comprising predominantly hierarchical directives to communications made up largely of adult-to-adult conversations that solve problems and generate new insights.

Here we see the typical PLM 2.0 targets. I will translate them into our PLM terminology.

Shift #1 – The shift to delighting clients – from which PLM vendor do we hear this statement? Which PLM vendor puts the customer in focus, instead of their “superior” technology?

Shift #2, #3 and #5 are typical PLM 2.0 capabilities which I described in my first post. See below the PLM 2.0 differentiators:

[Image: PLM 2.0 differentiators]

And where do we find shift #4? How do PLM vendors address this change beyond marketing?

My conclusion on this point: both PLM and management require a change to be ready for the 21st century. It is exactly what Generation Y is looking for, and exactly what future consumers are looking for. However, currently classical PLM and classical management dominate the thought process – and they do not like change so much, as it would put past investments and achievements at risk.

The Importance of Social Media

As already described in the two previous trends, social media concepts fit exactly in the shift we see towards the future. They impact the way companies change their marketing and address their customer base. In parallel they affect the way teams collaborate in the product development space – innovation teams are global product development teams.

My thoughts: social media might look like hype, but the basic concepts of social media will be required for future PLM.

Globalization for SMB

The major trend of the past decade is that SMBs (Small and Medium Businesses) no longer serve and fight for a regional existence. Competition and customers come from everywhere, and production is more and more outsourced. The traditional company that is #1 in its region no longer exists. Even SMBs have to consider ways to collaborate globally – again another driver for PLM 2.0.

My thought: traditional SMBs are never the leading companies in new trends; they hang on to their core knowledge and probably have a longer way to go to really adapt to the future. Startup SMBs with no historical hindrance are likely to outperform them.

Innovation, Intellectual Property & War on Patents

In a global market, innovation is the key driver to be successful, combined with the point above: delight the customer. In order to delight the customer you need to innovate, as delight does not come from commodities.

And with innovation I am not only addressing the consumer market; innovation is required in all areas – green products, green production – as the world’s climate and its population force us to change.

The successful products of the future will be those that bring innovation, and when your company owns this Intellectual Property, your near future is going to be profitable.

Therefore the “War for Patents” will be everywhere. We currently see the tablet and smartphone patent wars in the news, but it pops up everywhere, some more visible than others.

A “War for Patents” costs a lot of money (mainly spent on lawyers). Therefore a balance should be found between protecting your IP and innovating faster, so that your patents become less relevant because newer ones exist. In my opinion, the new PLM should be first the engine for innovation and second the system to protect your IP.

Conclusion:

Again too many words for a blog post, but the topic is huge, and I hope you see the need for a different PLM (PLM 2.0): a PLM that is targeted at the change in business all around the world. The monetary crisis, another symptom of the old way of doing business, gives us a chance to change. We need to change organizations and collaboration to remain profitable in the future – don’t be an ostrich.

My thoughts – looking forward to your feedback.
