Can PLM Software Benefit a Small Company?


In my market research in PLM, PDM, and related fields, and in my consulting work with engineering organizations, including small- to mid-size businesses (SMBs), I frequently find that they tend to think of PLM as a tool ideally suited for large organizations with sizable engineering teams designing complex, highly engineered products.

Looking at the profile and size of engineering companies using PDM software, especially those showcased by mainstream PDM and PLM vendors, one might easily reach the conclusion that these systems are, indeed, designed with the “big guys” in mind. This perception may be reinforced by PLM and ERP vendors that have announced products designed for the SMB market and abandoned them a few years later, when rosy revenue expectations weren’t achieved. Remember, for example, PTC’s ProductPoint and SAP’s Business By Design?

Small engineering teams have come to think of PLM software as unnecessarily complex and a constraint on operational flexibility, not to mention the high cost of the software, the IT overhead, and the pain of keeping the software up to date.

In part, this perception is underscored by enterprise software vendors that use the same approach to design products and licensing terms for large companies and SMBs; they think of SMBs as though they were just like large enterprises, only smaller. It reminds me of medieval paintings that depict babies as miniature adults.

SMBs and small engineering teams tend to cultivate an informal, open, and flexible workplace culture. They frown upon the Byzantine organizational structure of some very large traditional product companies and the inflexibility imposed by the formal PDM tools those companies use. Instead, SMBs exploit their small team size, flexible and nimble culture, and the skills and capabilities of individual engineers to manage product development with a minimal set of data management tools.

However, despite the stodgy and bureaucratic culture—or perception thereof—of some traditional product companies, many of them are undoubtedly successful; and they use PLM tools effectively to conceptualize, design and manufacture innovative and profitable products.

So is it conceivable that some of the practices employed by large enterprises could benefit SMBs? That adopting some measure of product data management discipline might help SMBs become more efficient and resilient?

I am going to explore this topic in a new blog series, Is PLM Software Only for Big Guys?, in which I will discuss processes and best practices employed by large product design and engineering organizations, and how they use PDM to make better product-related decisions. In particular, I am going to discuss those practices that I believe smaller organizations should consider adopting.

The first blog post, Check-in, Check-out, Check-in…Why bother? I know where my CAD Files Are!, lists some of the topics I am working on. Please comment and suggest additional CAD file management and product development topics you think small and medium-size engineering organizations should consider.


Mack GuardDog Connect


Telematics as a Service

Last October, Mack Trucks launched Mack GuardDog Connect, a telematics service based on Mack’s GuardDog onboard diagnostics (OBD) monitoring system. The service has now been in operation for over six months, and I thought it would be a good time to check back with the project team and discuss the business benefits as Mack sees them.

Mack Trucks’ GuardDog Connect is a telematics system that monitors the truck’s OBD fault codes and alerts the driver via the Mack Copilot display on the dash. It then transmits the information to GuardDog Connect’s 24×7 customer care center. If the truck can continue operating, the center advises the driver to have the issue repaired at the next scheduled service interval; if an immediate repair is needed, it schedules the repair at the nearest service location and orders the necessary parts if they are not on hand.
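The triage decision described above can be sketched in a few lines of code. This is purely illustrative, not Mack’s actual implementation: the fault codes, severity tiers, and messages are hypothetical stand-ins for whatever rules the real service applies.

```python
# Illustrative sketch of OBD fault-code triage (hypothetical codes and rules,
# not Mack's actual logic).

CRITICAL_FAULTS = {"P0087", "P0217"}    # e.g., fuel pressure, engine overheat
DEFERRABLE_FAULTS = {"P0442", "P0456"}  # e.g., minor EVAP system leaks

def triage(fault_code: str) -> str:
    """Decide whether a fault needs immediate repair or can wait."""
    if fault_code in CRITICAL_FAULTS:
        return "schedule immediate repair at nearest service location"
    if fault_code in DEFERRABLE_FAULTS:
        return "continue operating; repair at next scheduled service"
    # Anything unrecognized goes to a human agent, mirroring the role of
    # the live customer care center described above.
    return "escalate to customer care agent for review"

print(triage("P0087"))
```

The point of the sketch is the hybrid design: simple automated rules handle the clear-cut cases, and everything ambiguous is routed to the live contact center rather than guessed at.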

Many conversations and predictions about connected cars and telematics turn too quickly to the promises of proactive and predictive maintenance, whereby a maintenance activity is scheduled in advance, before an actual failure has occurred or been observed.

“Predictive diagnostics” is far easier to describe than to implement reliably and economically. The behavior of complex engineered equipment like a truck – both during normal operation and when a subsystem failure occurs – is not static: it changes constantly throughout the truck’s life due to normal wear and tear, operating patterns, modifications, maintenance practices, and numerous other factors. Consequently, any predictive diagnostic system true to its name must take into account the configuration and state of the specific equipment being monitored, and failure information and statistical models have to be updated continually – manually or programmatically – throughout the life of the truck. For this reason, deploying a commercially viable predictive maintenance system is extremely challenging.

By combining real-time OBD information and a live contact center, Mack is able to offer truck owners and operators a pragmatic telematics service. Currently, nearly 5,000 Mack trucks are registered in the program, which costs from $200 to $600 depending on the length of the commitment.

Success Metrics

Mack’s goal for GuardDog Connect was simple: improve trucks’ uptime and reduce business disruption through early notification and proactive repair scheduling. For dealers, GuardDog Connect improves service throughput and boosts customer satisfaction.

Mack reported the following benefits experienced since GuardDog Connect was first put in service:

  • 70% reduction in diagnostic time
  • 25% improvement in overall repair time
  • First Time Fix better than 90%
  • Uptime improvement of 1 day per service event

Although Mack did not measure the potential impact on warranty repair costs, the improved diagnostic accuracy and First Time Fix (FTF) rate achieved by this service are strong indications that in the long run GuardDog Connect will contribute to reduction in warranty expenses.

Mack sees additional value in using information from GuardDog Connect to inform product improvement activities, drive ongoing gains in diagnostic accuracy and service efficiency, and improve the OBD software.

 

PLM Service Providers Ready To Deliver Greater Value


What Do Recent Mergers and Acquisitions of PLM Services Companies Mean for Manufacturing Companies?

Recently we have been witnessing a wave of mergers and acquisitions of PLM services companies. Here are some examples, listed chronologically:

  • In October 2013, Accenture announced the acquisition of the PRION Group – a consulting and systems integrator that specializes in Siemens PLM software.
  • Later that month, Accenture announced plans to acquire PCO Innovation, another PLM consulting group.
  • In April 2014, KPIT Technologies reported the acquisition of I-Cubed, a PLM product and services company specializing in PLM data migration. Only a few months earlier, I-Cubed had acquired Akoya, a should-cost analytics software company.
  • On May 21, 2014, Kalypso, an innovation consulting company, announced it had merged with PLM consulting firm Integware.

I don’t think the increased activity in mergers and acquisitions focusing on PLM services, or the fact that Accenture is suddenly paying attention to PLM, is a mere coincidence. Rather, it is an indication of a gradual change in how enterprises view product lifecycle management, the role of PLM software in the enterprise, and, with those, new opportunities for PLM-related growth.

Should Cost Analytics


How to Figure Out the True Cost of a Manufactured Part?

Manufacturing companies in today’s global economy rely on an intricate network of global and local suppliers. With typically more than 50% of operational cost tied up in their supply chain, manufacturers must closely manage and continually optimize supply chain operations, balancing quality, cost, risk and resilience.

Research shows that in a typical manufacturing company, as many as 30% of purchased parts are not priced optimally: either suppliers are charging excessively for parts that can be sourced elsewhere under more competitive terms, or market competition and aggressively negotiated supplier contracts have resulted in lower quality parts and greater supply chain risks. Furthermore, it is common to find identical parts sourced in small quantities from multiple suppliers, reducing negotiation leverage, bloating inventories and introducing further waste into the supply chain.

In some markets, multiple vendors and strong competition may suffice to drive down prices and ensure high quality and level of service. But in markets where there are only a few suppliers, buyers’ options are limited and optimizing supply chain decisions can be difficult.

While these challenges are well recognized, making effective part sourcing decisions and negotiating optimal pricing aren’t easy, and most manufacturers do not have an objective and consistent means to rationalize supplier relationships.

The ability to determine the true cost of a part in a systematic fashion gives both manufacturers and suppliers critical tools that should be utilized during design, sourcing and bidding activities. Below are some use cases and examples of how “should cost” analytics can be used during key product lifecycle phases.

Supply Chain Optimization

As noted earlier, more than half of the operational spend of large manufacturing organizations is tied up in the supply chain. Reducing the number of suppliers and optimizing contracts for each supplier’s capabilities can help companies reduce supply chain waste, manage inventory costs and improve the overall operational efficiency of their supply chain.

Everest Institute research estimates that companies can achieve 22-28% cost savings by utilizing their existing supplier base instead of adding suppliers and rebidding contracts:

  • 35-40% one-time cost reduction by avoiding setup and on-boarding
  • 20-25% reduction in operations and internal overhead

Akoya, a part-costing data analytics software company, conducted an analysis of cast parts at a large manufacturer of heavy equipment. The analysis of 1,137 cast parts from 39 different suppliers showed that 24% of the parts were priced 25% or more above what they should cost. The analysis revealed that selecting lower-cost suppliers and renegotiating fair prices for those parts would result in annual cost savings of approximately $21M. The figure below shows a typical analysis: clusters of similar parts that are priced above or below the average for that class of parts.

Should Cost Analysis

(Source: Akoya)
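The gist of this kind of cluster analysis can be sketched in a few lines: group parts into similarity classes, compare each part’s price to its class average, and flag parts priced well above it. The part data and the 25% threshold below are illustrative, not Akoya’s actual method or data.

```python
# Minimal sketch of part-price outlier detection (made-up part data;
# not Akoya's actual algorithm).
from collections import defaultdict

parts = [
    # (part_id, similarity_class, unit_price)
    ("C-101", "small_casting", 14.00),
    ("C-102", "small_casting", 13.50),
    ("C-103", "small_casting", 21.00),  # priced well above its peers
    ("C-201", "large_casting", 88.00),
    ("C-202", "large_casting", 90.00),
]

def flag_overpriced(parts, threshold=1.25):
    """Return IDs of parts priced more than `threshold` x their class average."""
    by_class = defaultdict(list)
    for pid, cls, price in parts:
        by_class[cls].append(price)
    averages = {cls: sum(p) / len(p) for cls, p in by_class.items()}
    return [pid for pid, cls, price in parts
            if price > threshold * averages[cls]]

print(flag_overpriced(parts))
```

Real implementations derive the similarity classes from part geometry and features rather than a hand-assigned label, but the flagging logic is essentially this comparison at scale.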


Bidding and Contracting

Shifting the discussion from buyers to suppliers: many suppliers do not have a systematic and reliable method to estimate the cost to produce a product. All too often they resort to a ballpark cost estimate and add a lump-sum percentage for overhead. In competitive situations, these suppliers may quote a high price and lose the bid, or, potentially worse, quote a price low enough to win the contract but erode the profitability of the deal.

A data-driven analytic approach to manufacturing cost estimation reduces the time and effort needed to respond to requests for cost proposals, provides an accurate appraisal of actual cost and profit margins, and supports the evaluation of design and manufacturing alternatives, volume pricing, and the like.

And the same approach benefits those who evaluate supplier responses: it identifies outlier price quotes – whether too high or too low – and helps in selecting the best suppliers to do business with.

Product Design

Studies demonstrating that most of a product’s manufacturing cost is determined during the early design phases have been around for decades. Yet they are generally ignored until a cost-takeout campaign is initiated, at which point the manufacturer has already incurred significant losses and the ability to optimize cost decisions is very limited.

Accurate cost information can be beneficial in a number of design engineering activities, such as:

  • Input for manufacturing cost analyses, weighing alternative sources and manufacturing methods before the design is frozen.
  • ECO management: assessment of cost ramifications of a design change or switching to a different supplier.
  • Cost reduction / cost take-out campaigns.

What is 3D Part Cost Analytics?

The first question that comes to mind, then, is how to determine the true manufacturing cost of a part, especially if that exact part has never been manufactured before.

Advanced 3D cost analytics is based on a part’s 3D CAD model. By analyzing the key features of a design (dimensions, tolerances, weight, etc.) and of the manufacturing processes (machining, drilling, heat treating, etc.), and drawing on a detailed database of manufacturing processes, industry standards, and associated costs, analytic software can estimate the target cost of making a part based on the market price of similar parts.

Activity-based costing is an alternative method for estimating part manufacturing cost. It identifies the activities involved in manufacturing the part, such as casting, stamping, forging, drilling and finishing, and uses standardized labor, machinery and overhead costs to calculate the actual manufacturing cost of that part.
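An activity-based cost estimate is, at its core, a sum over the part’s “bill of activities” priced at standardized rates. The sketch below illustrates that arithmetic; the activities, hours, rates, and overhead factor are invented for illustration, not industry data.

```python
# Minimal activity-based costing sketch (illustrative rates, not real data).

STANDARD_RATES = {
    # activity -> (standard hours per part, fully loaded rate in $/hour)
    "casting":   (0.5, 40.0),
    "drilling":  (0.2, 55.0),
    "finishing": (0.3, 35.0),
}

def activity_based_cost(bill_of_activities, overhead_factor=1.15):
    """Sum standardized activity costs, then apply an overhead factor."""
    direct = sum(STANDARD_RATES[activity][0] * STANDARD_RATES[activity][1]
                 for activity in bill_of_activities)
    return round(direct * overhead_factor, 2)

print(activity_based_cost(["casting", "drilling", "finishing"]))
```

In practice the hard part is not this arithmetic but maintaining credible standard hours and rates per activity, which is where the databases mentioned above come in.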

Market prices are derived from a broad range of sources. These include a company’s supply base, supplier catalogs, comparisons of supplier responses to bid requests, and company specific design rules and “should cost” target guidelines.

Both methodologies have value and can be used to complement each other. Whereas 3D part analytics focuses on a “bill of features” to identify like parts, activity based costing uses a “bill of activities” to do the same.

Recommendations

Instead of periodic but infrequent cost-takeout and supplier rationalization campaigns, manufacturing companies should employ “should cost” analysis as an ongoing best practice. Using a structured approach and analytic tools, manufacturers should be able to introduce cost and supply chain considerations earlier in product design, negotiate fair prices with their suppliers, and achieve greater efficiency and risk resilience in their supply chain.

“Should cost” models are not meant to be completely accurate, nor should they be used as the only decision criterion in selecting a supplier. Their purpose is to identify areas of cost optimization opportunity and to help identify and assess alternatives for cost savings and supply chain optimization.

Not all manufacturing costs are controllable. A “should cost” analysis helps identify the areas of cost that can be improved; over-specification of tolerances, for example, is a major driver of cost. The analysis can also identify existing designs and inventory parts that can meet the design specifications – possibly restated – at lower cost.

Obviously, an optimal design and an efficient supply chain aren’t only about driving suppliers’ costs down. In fact, over-leaning the supply chain by focusing on lowest-cost suppliers, pressuring supplier profits, and implementing a very lean just-in-time inventory strategy will likely introduce unnecessary risks and result in a fragile supply chain.

Suppliers can be a great source of cost reduction innovation. This is a significant source of cost savings, and one that traditional procurement organizations typically overlook.

Why Now?

This topic isn’t new. You can find blog discussions dating back several years that followed the usual hype cycle of analysts and bloggers: they start in a flurry and then die quickly to free up blog space for the new hot topic du jour. But two recent acquisitions might bring “should cost” analytics and other PLM activities that were relegated to a back seat back into the conversation. In March, Akoya, a “should cost” analytics software company, was acquired by I-Cubed. Subsequently, I-Cubed’s PLM business was acquired by KPIT, an India-based global IT consulting and product engineering company.

 

On PLM (In)Compatibility


Long-term Product Data Retention

I have been doing some work lately around issues of long-term product data retention: how long companies retain product data, the laws and policies – existing and upcoming – that regulate these practices, and whether the frequent releases of CAD and PLM software impact the ability of product data owners, and potentially regulators, to access legacy data.

One question, of course, is the backward compatibility of PLM, PDM and CAD software tools. Some of the comments on company websites and blogs are surprising and even entertaining (unless, of course, it’s your data that does not load after a major upgrade). Here are some verbatim examples, with software tool and vendor names removed.

  • “I understand your frustration, but [CAD Tool] is not built in a way that would make backwards compatibility practical.” – CAD expert.
  • “[PLM Vendor] supports use of software 3 versions back… I do understand the request to support backward compatibility of files but this is not something that has been supported in the history of [CAD Tool].” – Sr. Subject Matter Expert at [CAD Software Vendor].
  • “I just want to make sure I have this right. I spent a bunch of money upgrading to 2012 and now I can’t share drawings with my consultants, many of whom use older versions of [CAD Software]?” – CAD user.
  • “Anyone who has ever attempted to move from one version of an enterprise software application to the “new improved” version knows that upgrades can be painful and costly. The critical elements of any enterprise software upgrade are data compatibility, ease of deployment, and compelling business benefits from the upgrade. Data compatibility is something that cannot be an afterthought. It has to be designed into the new version of the software. Ease of deployment of complex enterprise software requires robust and flexible state-of the-art architecture. It requires hardware and networking considerations. It requires that the software solution has captured the customer business process it is trying to transform. And above all it demands a unique long term partnership with the software vendor.” – From a PLM vendor’s blog.

You may recall the wave of PLM system changeovers made by major automakers back in 2010-2011, as reported by Automotive IT:

Upheaval in the PLM market: Daimler and Chrysler are separating from former partner Dassault Systemes and are doing development work with Siemens PLM software. On the other hand, BMW has settled on Dassault in the E/E domain. In turn, Hyundai has chosen PTC’s Windchill as its collaboration software, making it the backbone of its PLM strategy. The PLM market is on the move in a big way.

While such dramatic changes are less likely to happen again, there is a lingering pain of periodically upgrading PLM and CAD software, and of exchanging CAD models across a fragmented design collaboration chain. Users feel new releases happen too frequently, and often they are not sure about the value the new software provides. They want to know: how long, how disruptive, how much will it cost, and what’s the value? From an ongoing survey I am running, it appears that companies often skip a version or two (sometimes more) before they undertake an upgrade.

And not all companies bother to upgrade the data of older products and all prior versions of current products. The effort does not seem worthwhile.

The potential risk in not upgrading all data during every single upgrade is, of course, that sooner or later some CAD models won’t load into the PLM system. And it’s not necessarily the fault of the PLM software itself. As I am getting deeper into the intricacies of CAD data structure and interdependencies, and the workflow and data management practices of design and engineering organizations, I am discovering how easy it is to create data errors and structural inconsistencies that don’t manifest themselves for a while, until one day the model you just checked out will not check in, or the newly released PLM software refuses to load a model that you thought was perfectly fine. Discussing these issues in detail is outside the scope of this blog entry, but you can get an appreciation for the types of problems often found in CAD models in the highlights of data analysis conducted during a project to consolidate four PDM systems:

  • Approximately 20,000 missing dependencies. Data owners were able to locate only about 75%; the rest had to be recreated manually.
  • Approximately 25,000 duplicates. File/model names had to be rationalized and renamed manually.
  • Approximately 1,000 missing files. Unrecoverable.
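The kinds of problems listed above can be surfaced before a migration with straightforward integrity checks over the PDM vault’s metadata. The sketch below is a simplified stand-in, with invented model and file names, for what real consolidation tooling does at far larger scale.

```python
# Simplified pre-migration integrity audit over CAD vault metadata
# (invented model/file names; real PDM metadata is far richer).
from collections import Counter

vault = {
    # model name -> names of models it references (its dependencies)
    "asm_pump":     ["part_housing", "part_shaft", "part_seal"],
    "part_housing": [],
    "part_shaft":   [],
    # "part_seal" is referenced but absent from the vault: a missing dependency
}
file_names = ["part_housing.prt", "part_shaft.prt", "part_housing.prt"]

def audit(vault, file_names):
    """Report referenced-but-absent models and duplicate file names."""
    missing = sorted(
        {dep for deps in vault.values() for dep in deps} - vault.keys())
    duplicates = sorted(
        name for name, count in Counter(file_names).items() if count > 1)
    return {"missing_dependencies": missing, "duplicate_files": duplicates}

print(audit(vault, file_names))
```

Running checks like these routinely, rather than only at migration time, is one way to keep the error counts above from accumulating silently for years.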

A Question of Compliance?

One critical aspect of maintaining access to historical product data and, therefore, of software backward compatibility, is the need to comply with relevant guidelines and regulations. The survey I am conducting shows that while some companies are aware of these requirements and, in fact, expect more to come in the future, other companies are unsure if and how they are subject to data retention and access guidelines. Below are a few examples of regulatory guidelines to consider. While their relevance and impact on your business may vary, these guidelines evolve quickly, and it’s safe to assume that requirements to track product data throughout its lifecycle, for as long as the product is in use, will only continue to expand.

EU Directive 2011/65/EU

Under RoHS, manufacturers or their “authorized representative” must submit technical documentation (to substantiate compliance) upon request of a member state enforcement agency, and retain such documentation for 10 years after a covered product is placed on the market.

FAA AC 20-179

Data retained by you must be made available to the FAA when requested. The FAA may use the project data retained by you for any official purposes such as production inspections, technical oversight of designees, design reviews, continued operational safety oversight, or any other reasons deemed necessary by the FAA.

You are also responsible for providing the FAA with all data in a format that is readable by the FAA. In the case of legacy data, you must be able to retrieve the data (or transfer it to other media that will be retrievable). When the data is transferred to another media, you must ensure the FAA has the means to access all previously submitted data as well as new data submitted to the FAA.

DoD 5015.2

RMAs [Record Management Application] shall provide the capability to access information from their superseded repositories and databases. This capability shall support at least 1 previously verified version of backward compatibility.

Ongoing Research

Product companies should realize they need a product data retention strategy that ensures perpetual access, software compatibility and compliance not only while the product is in volume production, but possibly for as long as it is being used. This strategy requires alignment with your PLM/PDM and CAD software vendors’ release planning, in order to prepare for data migration and verification when needed. And while you are at it, it might be beneficial to revisit your CAD data management workflow to reduce the occurrence and severity of data errors.
