A week after the PLM Innovation conference in the US, I have had time to write down my impressions. It was the first time this event was organized in the US, after successful events in Europe in past years. For me it was a pleasure to meet some of my PLM friends in person, as most of my activities are in Europe.
With an audience of approximately 300 people, there were a lot of interesting sessions. Some of them ran in parallel, but as all sessions were recorded, I will soon catch up on the sessions I missed.
My overall impression of the event: Loud and Positive, which is perhaps a typical difference between the US and Old Europe.
Here are some impressions from the sessions that caught my attention.
Kevin Fowler, Chief Architect Commercial Airplanes Processes and Tools at The Boeing Company, presented the PLM journey BCA went through. Their evolution path is very similar to the one Siemens and Dassault Systèmes went through (driven by Boeing's challenges).
Impressive was the number of parts that need to be managed per aircraft (up to a billion) and with that all the related information. It was interesting to see that the number of parts for the 787 has decreased strongly.
After PLM Generation 1 (based on Teamcenter) and Generation 2 (based on Dassault Systèmes), Kevin demonstrated that functionality and cost of ownership increased due to growing complexity, while usability evidently decreased.
And this will be a serious point of attention for Generation 3, the PLM system BCA will be selecting for 2015 and beyond: usability has to increase.
And as all the PLM vendors and customers were present, the discussion during the breaks was which PLM vendor would be the preferred next partner. I had a discussion about PLM vision and visibility with one of the SAP partners (DSC Software Inc.). He is convinced that SAP provides one of the best PLM platforms. I am not convinced, as I still see SAP as a company that wants to do everything, starting from ERP. And as long as their management and websites do not reflect a PLM spirit, I remain unconvinced. In 2015 I might be proven wrong in my impression that PLM, usability and SAP are not connected.
Note: browse to this SAP PLM rapid-deployment solution page and view the Step by Step guide. There the heading becomes SAP CRM rapid-deployment solution. A missing link, a marketing slip, or do they not know the difference between PLM and CRM?
Next, Nathan Hartman from Purdue University described his view on the future of PLM, which will be model-based, and he presented how PLM tools could work together, describing a generic architecture and interfaces. This is somewhat the way the big PLM vendors describe their platforms too, only in their case in a more proprietary environment.
Nathan gave an interesting anecdote about data sharing. As an example, he mentioned a 3D model built by one student; he then asked another student to make modifications to it. This was already a challenge, and even working with the same software led to knowledge issues when trying to understand how the model was built. This demonstrates that PLM data sharing is not only about having the right format and application; the underlying knowledge also needs to be exposed.
Monica Schnitger, as a business analyst, presented her thoughts on PLM justification. Where in Munich I presented the Making the case for PLM session, Monica focused on a set of basic questions you need to ask as a company and how you can justify a PLM investment. It is not for the big kids anymore, and you can find her presentation here (with another PLM definition).
I liked the approach of keeping things simple, as sometimes people make PLM too complex (also because it serves their own business). Monica suggested that a company should define its own reasons for why and how to do PLM. Here I have a slightly different approach. Often mid-market companies do not want PLM; they have pains they want to get rid of, or problems they want to solve. Starting from the pain, and with guidance from a consultant, companies will understand which PLM practices they could use and how these fit into a bigger picture, instead of using plasters to fix the pain.
Beth Lange, Chief Scientific Officer at Mary Kay, presented how her organization, operating from the US (Texas), manages a portfolio of skin care products sold around the world by an independent local sales force. To do this successfully and meet all local regulatory requirements, they implemented a PLM system in which a central repository of global information is managed.
The challenge for Mary Kay is that it is originally a company focused on skin care products with an indirect sales force, where the sales person sometimes has no IT skills, so this project was also a big cultural change. Beth explained that the support from Kalypso was indeed crucial to manage the change, something I believe is always crucial in a global PLM project, where the ideal implementation is so different from the current, mainly isolated practices.
As regulatory compliance is an important topic for skin care products, Beth explained that due to the compliance rules for China, where they have to expose their whole IP, the only way to protect their IP was to patent everything, even changes.
Would NPI mean New Patent Introduction in the CPG market?
Ron Watson, Director, PLM COE and IT Architecture at Xylem Inc., presented their global PLM approach. As the company is relatively young (2011) but a collection of businesses from all around the world, they have the challenge of operating as a single company and sharing the synergy.
Ron introduced PDLM (Product Data Lifecycle Management) and explained that the first focus was on getting all data under control and making it the single source for all product data in a digital format, preferably with a minimum of translation needed.
Here you see Xylem has chosen an integrated platform, not best-of-breed applications. Once the product data is under control, the focus can shift to standardizing processes across the company. Other companies that have followed this approach confirm it brings huge benefits.
As it was a PTC case study, Graham Birch, Senior Director of Product Management at PTC, did the closing part, unfortunately by demoing some pieces of the software. A pity, as I believe people are not impressed by seeing some data they recognize on a screen. Only when there is a new paradigm to demonstrate, related to usability, would I be interested.
And as if they had read my mind, Daniel Armour from Joy Global demonstrated the value and attractiveness of 3D visualization tools in their organization. Joy Global is a manufacturer of some of the biggest mining equipment, and he demonstrated how 3D visualization can be used in the sales and marketing process, but also during training and the analysis of work scenarios.
His demonstration showed again that 3D as a communication layer is attractive and appeals to the user (serious gaming in some cases).
As it was an SAP case, I was surprised to hear the words of Brian Soaper, explaining the power of 3D for SAP users and how SAP users will benefit from better understanding, higher usability, etc. It was as if a 3D-CAD/PLM vendor was talking. Was this a dream?
I woke up from this dream when someone from the audience asked Daniel how they keep the visualizations up to date: is there a kind of version management? Daniel mentioned there is currently not, but that you could build a database to perform check-in/check-out of data. Apparently all the 3D we had seen is not connected to the single database SAP always promotes.
Peter Bilello, CIMdata's president, had a closing session with the title: Evaluating the tangible benefits from PLM can prove complex, which is indeed true. Peter's presentation was partly similar to the one he gave earlier this year in Munich. Some people in the audience mentioned that many times it is the same story, and that many of the issues Peter presented are somehow known facts. And this is what I appreciate about CIMdata: PLM does not change per conference or per new IT hype. If you want to understand PLM, you need to stick to the purpose and meaning of PLM. And these known facts are apparently not so known; a majority of PLM projects are executed or led by people who decided to reinvent the wheel, as inventing the wheel seems cheaper than renting one, and this again leads to issues later that every experienced consultant could foresee.
The evening, with a champagne reception on a paddle boat touring the lake and a dinner at the lakeside, concluded this first day.
The combination of presentations, scheduled network meetings and enough networking time made it a successful first day.
The next day I started with a BOM management Think Tank, where the target was to come to some common practices and a shared understanding of BOM management. As the number of participants was large and the time was short, we only had the chance to scratch the surface of the cases brought in.
What was clear to me from this session is that most of the reported challenges were due to the fact that the tools were already in place, and only afterwards did the PLM team (mostly engineering people) have to struggle to turn it into a consistent process. They do not get real help from PLM vendors or implementers, as their focus is on selling more tools and services.
What is missing for these organizations is a PLM for Dummies methodology guide, which is business-centric instead of technology-centric. For sure there are people who have published PLM books, but either they are not found or not relevant. And as nothing comes for free, these companies try to reinvent the wheel again. PLM is serious business.
The first keynote speech of the second day was from Dantar Oosterwal, Partner and President of the Milwaukee Consulting Group, who inspired us with Lean and PLM: Operational Excellence, all related to his experiences at Harley-Davidson.
It was interesting how he described the process of focusing on throughput to get market results. There are various parameters through which you can influence market share, such as a price strategy or increased marketing, but the biggest impact on Harley-Davidson's sales results came from innovation: more model variants meaning more choice for potential customers. By measuring and analyzing the throughput of the organization, an optimal tuning could be found.
Dantar also shared an interesting anecdote about an engineer who had to study the impact of ethanol as a fuel for a certain engine. After a certain time the engineer came back with the answer: yes we can. He answered the question but left no knowledge behind. A similar question about performance was asked of a supplier, who came back with an answer and graphs explaining what the answer was based upon. This answer created knowledge, as it could be reused for similar questions. It is a good example of how companies should focus on collecting knowledge in their PLM environment instead of just answers to a question.
The second keynote speaker came from the world's biggest brand: Christopher Boudard, PLM Director at The Coca-Cola Company. With its multiple brands and global operations, it is a challenge to work towards a single PLM platform. He explained that at this stage they are still busy loading data into the system, where a lot of time is spent on data cleansing, as the system only has value when the data is clean and accurate.
And this requires a lot of motivation from the PLM team to keep executive management involved in and sponsoring a project that takes five years to consolidate the data, and only then, through the right processes, to make sure the data remains correct.
Christopher demonstrated in a passionate manner that leadership is crucial for such a project to be successful and implemented. For me as a European it was interesting to see that the world's biggest brand's PLM Director is a French citizen inspiring the management of such an American company.
Monica Schnitger conducted an interesting session about the state-of-the-state of multi-platform PLM.
If you cannot understand this title, it was a debate between the PLM vendors (Aras, Autodesk, Dassault Systèmes, PTC, SAP and Siemens) about openness, interoperability, cloud and open source.
After Monica's first question about the openness of each vendor's system, it was clear there are no problems to expect in the future: all systems were extremely open, according to the respondents. I somewhat lost my attention for the debate, as I had the feeling I was listening to an election debate. Monica did her best to make it an unbiased discussion; however, when some people want to make a specific point and use every question to jump on it, it becomes an irritation.
Chad Jackson, this time dressed up as the guy who always gets killed in the first five minutes of a Star Trek episode, shared with us the early findings of the 2012 State of PLM survey. Tech4PD followers (and who is not a follower?) understood he had lost the bet of the second episode.
Chad, let me know if this picture needs to be removed, as it could kill your future career.
The preliminary findings Chad shared with us were that manufacturing and service are significantly interested in and consumers of PLM data, but do not consider it their data, to which they also have to contribute. The fact that it is available makes them involved in using the data, yet these departments do not actively participate in PLM. Somehow this confirms the observation that PLM is still considered an engineering tool, not an enterprise-wide platform.
As the initial group of participants (n = 100) is small and not randomly selected from the overall population, the question remains what the state of PLM is in 2012. I assume Chad will come back to that at a later time.
The last plenary sessions, David Karamian from Flextronics and Michael Grieves with A Virtual Perfect Future, had the ungrateful position of being the last two speakers of the event. I will have to review David's presentation again, as it was not easy to digest and to recall a week later what his highlights were. Michael's presentation was easier to digest, and I also believe that with the new upcoming concepts and technology, the virtual perfect future is there.
Looking back on a successful event, where I met many of my PLM peers from across the ocean, I will take the upcoming weeks to review the sessions I missed. The final good news for all PLM mind sharers is that CIMdata and MarketKey announced the coordination of their upcoming events next year: more content and more attendees guaranteed.