Cloud Computing

Leonardo da Vinci (1452-1519)

IoT-Infused Innovation


The Innovator’s Myopia

Many product organizations suffer from acute myopia. Once a product is sold or installed in the field, they lose sight of its performance, how users interact with it, and how well it supports the brand.

Of course, organizations do get some feedback from customers and field operations from time to time. But this information usually comes in the form of bad news: customer complaints, excessive warranty claims, and costly product replacements and repairs.

Upon careful observation, we should realize that organizational myopia doesn’t set in during product deployment. It usually starts much earlier, when product marketing defines the market needs and functional requirements for a promising new product.

Products are frequently defined and designed based on inaccurate, out-of-date, and biased perceptions of customer needs and the competitive landscape. Product organizations are overly optimistic about customers’ enthusiasm to cope with yet another “disruptive” technology. And product designers often lack sufficient understanding of existing workflows and process-integration requirements.

No wonder most new products fail.

Dunes, Oceano (Edward Weston, 1936)

Leading by Example: Technology Companies Help Save Scarce Water Resources


This cup of steaming coffee you are sipping while reading this article—how much water do you think was used to make it? You probably guessed it took more than just 8 ounces of water, but I doubt you got even close to the actual number. According to Trucost, a company that provides sustainability data, it takes some 135 liters of water to make one cup of coffee (since we just switched to metric units, an 8-oz cup holds about a quarter of a liter). Most of this water is used to grow the coffee beans and soak them during processing.

And it’s not only food processing that uses vast quantities of water. By Trucost’s calculations, about 3,900 liters of water are consumed during the manufacturing of a single t-shirt. And nearly 13 tons of water are required to manufacture a single smartphone, with nearly half of it due to pollution and cleanup during manufacturing and assembly.
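To put these figures side by side, here is a minimal Python sketch of the ratios implied by the Trucost estimates quoted above (the 0.24-liter cup volume and the conversion of 13 tons to roughly 13,000 liters are my assumptions):

```python
# Water-footprint figures quoted above (Trucost estimates).
COFFEE_CUP_L = 135      # liters of water to produce one cup of coffee
TSHIRT_L = 3_900        # liters of water to produce one t-shirt
SMARTPHONE_L = 13_000   # ~13 metric tons of water per smartphone (1 t of water ~ 1,000 L)
CUP_VOLUME_L = 0.24     # assumed volume of the 8-oz cup itself

# The hidden water dwarfs the water actually in the cup.
print(f"Coffee: ~{COFFEE_CUP_L / CUP_VOLUME_L:.0f}x the cup's own volume")
print(f"T-shirt: the water of ~{TSHIRT_L / COFFEE_CUP_L:.0f} cups of coffee")
print(f"Smartphone: ~{SMARTPHONE_L / TSHIRT_L:.1f}x a t-shirt's footprint")
```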

The Persistence of Memory (Salvador Dalí, 1931)

The Return of the Real-Time Enterprise


Real Time?

The term real time was originally used to imply a predictable and guaranteed response time to computer-generated or observed events. For example, a real-time process control system is architected to respond to readings from sensors and switches within a predefined latency in order to keep a process going, respond to alarms, and so forth. In other words, real time doesn’t necessarily mean “extremely fast”; it merely means “fast enough” for the purpose of the process it controls. Of course, in industrial applications that may mean within a few milliseconds, but the point is that real-time systems are optimized for timing predictability, whether measured in milliseconds or minutes.

Somehow, over the last couple of decades, real time has come to mean “very fast.” Until recently, we didn’t think of Internet connectivity and cloud-based apps as capable of very fast response times. We certainly know from everyday experience that their response time is inconsistent and definitely not predictable.

But the improved throughput of wired and wireless IP networks and the abundance of Internet resources are improving both the speed and the response-time predictability of cloud-based applications.
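The distinction between “fast on average” and “predictably fast enough” can be made concrete with a small sketch. The deadline, the latency distributions, and the function name below are illustrative assumptions, not measurements:

```python
import random
import statistics

DEADLINE_MS = 50.0  # hypothetical latency budget for a control loop

def meets_realtime_budget(latencies_ms, deadline_ms=DEADLINE_MS):
    """Real time here means every observed response lands within the
    deadline -- timing predictability, not raw average speed."""
    return max(latencies_ms) <= deadline_ms

# Simulated latencies: a fast-on-average but jittery cloud service
# versus a slower but strictly bounded real-time controller.
random.seed(1)
cloud = [random.expovariate(1 / 10) for _ in range(1000)]   # mean ~10 ms, long tail
controller = [random.uniform(30, 45) for _ in range(1000)]  # bounded 30-45 ms

print("cloud:      mean %5.1f ms, worst %6.1f ms, real-time: %s" %
      (statistics.mean(cloud), max(cloud), meets_realtime_budget(cloud)))
print("controller: mean %5.1f ms, worst %6.1f ms, real-time: %s" %
      (statistics.mean(controller), max(controller), meets_realtime_budget(controller)))
```

The controller is slower on average, yet it is the one that qualifies as real time, because its worst case stays inside the budget.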


Product Innovation Congress


Product Innovation Congress 2014, San Diego

Another successful Product Innovation (PI) Congress was held last week in San Diego. Don’t let the slow trickle of tweets from the event mislead you: the organizers put together a full agenda that kept the delegates engaged throughout the two-day conference. Plus, I don’t think engineers are much into tweeting anyway.

Instead of a detailed chronology of the event, which you can get from reviewing the agenda, I chose to highlight some key points and use them to offer commentary and observations about the state of our industry.

PLM Market Activity

There appears to be much activity in selecting, replacing and upgrading PLM software. Some attendees were first-time PLM buyers, but a surprising number of companies expressed dissatisfaction with their existing solution and were seeking a “better” PLM system. I did not conduct a structured survey, but anecdotally it appears that a good number of those in search of a PLM replacement are users of ENOVIA SmarTeam and ENOVIA MatrixOne.

My observations:

The search for a “better” PLM system will continue to drive activity and put pressure on PLM vendors to deliver greater value through enhanced functionality, lower cost, faster deployment, and new delivery and ownership models. The move of reluctant PLM vendors such as Oracle Agile to offer a cloud delivery model is but one recent example, and I expect other PLM vendors are in the process of following suit. This dynamic keeps the door open for vendors such as Aras, which continues to challenge the hegemony of the incumbents.

That being said, buyers should realize that PLM software itself isn’t a substitute or remedy for flawed and suboptimal product development processes. For each dissatisfied PLM user company you will find many others that are highly successful and able to reap the full potential of the very same PLM software. It isn’t the software. It’s you. Don’t blame the vendor.

PLM Implementation

Most of the conference presentations by PLM practitioners from product companies were interesting, but more often than not the insights and recommendations offered by the speakers were what one might consider general Project Management 101, not even PLM 101.

My observations:

The growing complexity of product development processes and the commensurate expectations from PLM software are challenging product organizations. The recent interest in incorporating the embedded control software development process under the PLM umbrella further complicates the issue.

We need to elevate our view of PLM from product data management (PDM) and business-process automation software to a portfolio of processes, best practices and tools that create rich context for optimizing complex, multidisciplinary product-related decisions.

Data Migration

PLM and CAD data migration is typically an unpleasant part of engineering software replacement and major upgrades. Judging by the number of delegates interested in the topic, the issue of data interoperability and backward compatibility continues to plague the space.

My observations:

The ability to retain and access product data is critical. However, product organizations are challenged to maintain data compatibility and interoperability across engineering and business tools, with negative consequences for many business-critical activities, from data retention for compliance to design reuse.

Application Lifecycle Management (ALM)

Chairing the Application Lifecycle Management (ALM) track, I spent much time in ALM discussions. While the density of embedded software in a broad range of products continues to increase across most industries, software development tools and practices have not kept up with the need to meet increasingly stringent time, budget and quality goals.

A panel discussion with Ford, Hologic, BigLever and IBM (Rational) seemed to gravitate towards the realization that engineering methods and practices, coupled with organizational culture, keep ALM as a separate engineering discipline with its dedicated task specific tools. The panelists also explored, albeit briefly, the notion of a federated software development process environment and ALM functionality delivered as a collection of services.

My observations:

We are likely to continue the PLM vs. ALM debate, partly because of the highly visible position PTC is taking with its Integrity software. But I do not expect to see any significant change in the way organizations use contemporary ALM tools to coordinate and synchronize the software, hardware and mechanical development cycles.

My current research interest is in using product line engineering (PLE) as an alternate way to decompose product architecture and realize a federated software development process environment that maximizes the utility of task-specific software development tools.

Internet of Things (IoT)

As at many conferences today, there was the obligatory Internet of Things (IoT) keynote address. I suspect I am in the minority here, but I found the “The Silent Intelligence: The Internet of Things” presentation plagued by all-too-common trivial use cases, generalizations, inaccuracies, and insufficient reality checks.

On the other hand, the review of the Masdar City project: “Smart Cities in Advancing Global Renewable Developments” was fascinating.


See you next year in Boston.


EMC: Large Enterprises Reduce Investments in Public Clouds


By 2016, Only 12% of Workloads Will Run on Public Cloud Infrastructure

This was the surprising perspective offered by Adrian McDonald, EMEA President at EMC, addressing the audience at EMC Forum 2013 that took place on November 4 in Tel Aviv, Israel. McDonald maintains that public cloud architecture is not the right solution for large enterprises, and, in fact, CIOs are reporting that when considering the investment in security, compliance and business continuity, public cloud infrastructure is more expensive than the alternatives.

According to McDonald, market research conducted by EMC shows that large organizations are gradually reducing application deployment on public clouds. EMC forecasts that by 2016, 12% of the workload will run on a public cloud and 12% will be using a private-virtual cloud, but 76% of the workload will require internally managed infrastructure.

McDonald estimates that organizations can cut as much as 38% of annual IT expenses through virtualization and cloud deployment. But he cautions that this is an aggressive goal that requires both IT organizations and cloud infrastructure and services providers to offer new ways to deliver flexible and agile solutions and services, such as supporting customer self-provisioning.
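As a sanity check on the figures reported here, a short sketch: the workload shares and the 38% savings ceiling come from the article, while the $10M annual budget is a hypothetical illustration of my own.

```python
# EMC's 2016 workload forecast as quoted above.
workload_share = {
    "public cloud": 0.12,
    "private-virtual cloud": 0.12,
    "internally managed": 0.76,
}
# The three shares together account for all workload.
assert abs(sum(workload_share.values()) - 1.0) < 1e-9

# McDonald's "as much as 38%" savings applied to a hypothetical budget.
annual_it_budget = 10_000_000  # hypothetical: $10M per year (my assumption)
max_savings = 0.38 * annual_it_budget
print(f"Potential annual savings: ${max_savings:,.0f}")
```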

As reported by Ran Miron