The perfect storm

If you have wondered why we are experiencing all the hype about the so-called Internet of Things (IoT) right now, and suspect it is "old wine in new bottles", here are a few pointers on why the hype never materialized 10-15 years ago, when pervasive or ubiquitous computing was hot in research.

The storm consists of tech mergers, new business models, lifelong product improvement and analytics

The opportunity of IoT-driven business rests to a large extent on three tendencies: cost, capabilities and convenience. Although the cost of sensors, bandwidth and processing has fallen continuously, making ever more advanced functionality possible, the history of technology is littered with examples of product introductions that did not live up to the expectations set by the announced performance (IBM's OS/2) or were ahead of their time (Apple Newton).
Among the lessons learnt: search for a profitable solution to a real problem or desire, rather than developing tech gadgets that disregard the customer. The following paragraphs describe the factors that combine into the perfect storm: a technology trend blurring previous distinctions between product categories; the risk of missing new and unorthodox market entrants; new possibilities for lifelong product improvement that keep a product relevant; and insight driven by data.

Technology trend: device merger

In 1991, most of the companies delivering the products in this Radio Shack advertisement (see picture above) did not anticipate that they would eventually become competitors. Yet soon after, the mobile network absorbed the phone answering machine. The tape/CD player was replaced by the iPod, which in turn succumbed to network streaming services, and apparently the convenience of having a still camera and video recorder in your pocket at all times is a stronger driver than a high-quality SLR camera or camcorder.
The iPhone 7 has more than 1,000 times as many transistors as the original Pentium PC, while requiring correspondingly less energy per computation. Heat dissipation per bit of information calculated or transmitted is reduced with every new generation of silicon integration, which also contributes to the diminishing cost of the components and sensors required to collect and compute data.

Business trend: carving out “the secret sauce”

The easy access to almost unlimited computational power poses a threat to traditional companies with a dedicated purpose and a tangible, machined solution. Newcomers may build their offering on the abundance of computing, virtually ignoring the limitations of the physical world. Hence the existing business owners risk becoming an add-on feature integrated into somebody else's broader solution, or a mere supplier of goods for an ecosystem where the value is extracted from their products.
New market entrants do not necessarily see the existing business vertical as an integrated unit, the way the current range of well-established companies does. Newcomers experiment, simplify and carve out what has value, or they redefine the service into what customers really want but never got.
We call this trend "carving out the secret sauce", and you can read more about it in our blog post here.

R&D trend: A product’s life starts when it is unwrapped and powered on for the first time

Traditional product development through a stage-gate model delivers a defined feature list at a predictable quality on a certain date. Following product launch, the R&D team hands the maintenance task over to an engineering team, and the development phase is considered complete.
Industrial components are traditionally tailor-made, with memory capacity and processor power sized to fit the purpose exactly. This slim-fit design can be traced back to a time when, in dedicated electronics development, every bit and byte represented a considerable expense. In smaller production series, however, fully customised solutions are often too expensive compared with mass-produced standard COTS (commercial-off-the-shelf) platforms, which offer versatility and a full tool chain for fast development and deployment.
Read more about the R&D trend of continuous updates and improvement in our blog post here.

Data trend: Knowledge from multiple devices – machine learning

Initial data collection does not necessarily require deep insight into computing and cloud functionality. Accumulated data eventually becomes an asset, and its importance stretches beyond the immediate business opportunity in the making. Standardised data-collecting units are commercially available, and experimental connectivity to the network can be arranged with an industrial cloud provider, so an expensive data warehouse is no longer a necessity. Analytics tools are also becoming more user-friendly for visualising the findings from data stored in the cloud.
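To illustrate how little infrastructure an initial experiment needs, the sketch below packages a single sensor reading as JSON, ready to be posted to whatever ingestion endpoint a cloud provider exposes. The device name and field names are illustrative assumptions, not any specific provider's schema.

```python
import json

def make_payload(device_id: str, sensor: str, value: float, timestamp: int) -> str:
    # Package one sensor reading for a cloud ingestion endpoint.
    # The field names (and the "pump-01" device below) are illustrative
    # assumptions, not a specific provider's API.
    return json.dumps({
        "device_id": device_id,
        "sensor": sensor,
        "value": value,
        "timestamp": timestamp,  # e.g. int(time.time()) on the device
    }, sort_keys=True)

payload = make_payload("pump-01", "vibration", 0.42, 1700000000)
# The resulting string can be POSTed with any HTTP client, e.g. urllib.request.
```

From there, growing into a managed ingestion pipeline is an incremental step rather than an up-front data-warehouse project.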
A potentially valuable proposition is to search for patterns in the behaviour data of a large population of devices, and then share the benefit of that insight with everyone using the service. Manufacturers that systematically collect run-time data across multiple installations can discover that apparently uncorrelated events predict faults that appear in turn. Advanced statistical algorithms applied across massive data sets can reveal dependencies that ordinary monitoring and reporting do not identify.
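A minimal sketch of that idea, under assumed hypothetical fleet telemetry (a vibration channel that rises ahead of faults, and an ambient-temperature channel that does not): flag the sensor channels whose readings correlate strongly with logged fault events.

```python
import statistics

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length series.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    if sx == 0 or sy == 0:
        return 0.0  # constant channel: no correlation to report
    return cov / (sx * sy)

def fault_predictors(fleet_telemetry, faults, threshold=0.8):
    # Flag sensor channels whose readings correlate with fault events.
    # fleet_telemetry: {channel name -> list of readings},
    # faults: 0/1 fault indicator per observation. Illustrative data model.
    flagged = {}
    for sensor, readings in fleet_telemetry.items():
        r = pearson(readings, faults)
        if abs(r) >= threshold:
            flagged[sensor] = round(r, 2)
    return flagged

readings = {
    "vibration": [0.1, 0.2, 0.4, 0.7, 0.9],
    "ambient_temp": [21, 23, 20, 22, 21],
}
faults = [0, 0, 0, 1, 1]
print(fault_predictors(readings, faults))  # → {'vibration': 0.92}
```

A production system would of course use far richer models than a single correlation threshold, but the principle is the same: insight learned across the whole fleet is fed back to every individual installation.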

More information:

CEO, Jakob Appel, jakob.appel@glaze.dk, +45 26 17 18 58