The Internet of Things is dead. Or so declared one of my colleagues recently. While I could dismiss the comment as flippant, it points to an underlying cynicism about technology that is nicely captured by Gartner’s well-known Hype Cycle. As technologists we often focus on the cool technologies themselves, and then grow frustrated, or dismissive, when the business rationale or surrounding ecosystem isn’t yet mature enough to make deploying those technologies viable, or when deployment simply proves too complex or takes longer than expected.

Whether you believe that IoT is just hype, that big data stumbled out of the gate, that Industry 4.0 will never happen, or that AI is just a fad, the recurring theme that weaves each of these narratives together is the relentless drive towards digital transformation seen across industries. One could accuse digital transformation itself of being a hyped buzzword, though its longevity as a business theme points to underlying needs that are yet to be fully realized. At its core, digital transformation represents a rethinking of how enterprises use (digital) technology to radically change performance.

In recent conversations with customers, digital transformation isn’t usually addressed head-on under that banner. More often, the topic arises tangentially due to some other problem we’re exploring, though there are three recurring themes that point to the need to change the way that embedded systems are developed.

  1. Fixed function to flexible systems

Many embedded or control systems are designed in a monolithic manner: custom hardware is outfitted with a tailored operating system, likely some complex middleware, and one or more applications that perform a specific set of tasks. The whole is packaged and sold as a single product, and an upgrade means replacing the entire unit with a newer generation that has undergone a similar design cycle. Not only is this a cumbersome approach, requiring re-development and re-testing of many non-differentiating components each design cycle, but it is also inflexible when it comes to deploying new features or fixing broken ones (including security updates). Contrast this with the modern approach to enterprise or cloud software development, where applications (or, increasingly, micro-services) are developed independently of how or where they will be deployed, accelerating innovation and time-to-value.

  2. Automated devices to autonomous systems

Many embedded systems are designed to automate specific tasks. In industrial systems, for example, a Programmable Logic Controller (PLC) is used to automate manufacturing processes such as chemical reactions, assembly lines or robotic devices. Generally, these devices perform with a high degree of accuracy, repeatability and reliability, though they need to be individually programmed to do so and often have little scope for performing outside of their initial design parameters. However, in order to drive productivity increases and impact larger business outcomes, learning systems will increasingly be used spanning a range of control devices at the cell, plant, or system level. Similar system-level approaches are emerging in autonomous driving applications, where information from multiple subsystems needs to be merged and processed in some central unit running machine learning algorithms for object classification, path-finding and actuation.

Learning systems will also have a big impact on the type of computing workloads that need to be run on edge devices. Traditionally, embedded system design has begun with custom hardware, possibly encompassing customized silicon processors, on which software is layered – a “bottom-up” approach. For machine learning implementations, the process is turned on its head: a defined problem statement determines the best type of learning algorithm to use (an object classification problem, for example, may require a different approach from voice recognition), which in turn determines the hardware platform best suited to run the learning framework efficiently. This may involve selecting CPUs with specific instruction sets or accelerators, or using GPUs or FPGAs alongside traditional processors. In these environments, the software often defines the required hardware platform.

  3. Software-defined everything

The advent of autonomous systems will require a shift in system design focus from individual, resource-constrained, bespoke devices, to more flexible and programmable environments that can be changed or optimized more globally. This shift will not only impact the engineering approach to architecting intelligent systems, but also the supply chain which has long been established in various industries around the production of specific, functional “black boxes”, such as Electronic Control Units (ECUs) in automotive, or Distributed Control Systems (DCS) in industrial applications.

Similarly, the skill set required to build these systems will evolve to become much more software-centric. Companies that have defined their differentiation and captured their value by designing and selling hardware will likely find that they need to develop a rich software competency. This will involve defining a software blueprint, and possibly the tools, APIs and SDKs with which their ecosystem will deliver additional value-add components on top of an underlying computing platform. The responsibility for integrating middleware or applications from a number of suppliers could shift from the supply chain to the equipment manufacturers themselves, bringing with it a change in support or liability models.

Modernizing the development paradigm: the IT journey
Enterprise IT systems have undergone a radical transformation in the last couple of decades. At the beginning of my career, I recall using not only mainframe computers, but a plethora of microcomputers, each with their own flavor of operating system. Look under the hood, and you’d find that these computers were powered by unique, and sometimes custom, processor architectures. As desktop PCs and servers emerged, Intel Architecture became the ubiquitous silicon architecture for enterprise IT systems, driving standardization of hardware, development tools, and a vibrant software ecosystem.

Next, we saw the transformative power of virtualization, which led to consolidation of applications and a drive to higher hardware utilization rates, squeezing yet more efficiency into the IT landscape. While the motivation was initially driven by optimization of local computing resources, decoupling software from the underlying hardware allowed centralization of computing resources and paved the way for cloud computing.

Today, cloud computing has removed the dependency between hardware and software: applications or individual functions can be written quickly and efficiently, while retaining fine-grained control of the underlying computing, storage and networking resources. This decoupling allows developers to quickly develop, deploy and update applications at enormous scale, without worrying about purchasing or managing any hardware at all.

While IT developers can quickly build and deploy hyperscale applications, build upon the knowledge of others through rich application frameworks, modern development languages and tools, and use infrastructure that is managed by someone else, embedded developers mostly do not have this luxury. Instead, their development model leaves them struggling to keep pace with rapidly changing silicon architectures and unable to use many of the advances in software development and deployment methodology that their IT counterparts enjoy. As a result, they struggle with rapid innovation, the affordability of their systems, and product obsolescence. They are not enjoying the advances brought about by the IT journey.

To change this, one must recognize that embedded systems are often very different from IT systems. System performance and reliability, costs, resource and timing constraints, intolerance of failures or downtime, and safety needs all place very specific requirements on how systems are built and deployed. However, by recognizing and addressing these requirements, I believe we can start leveraging the advances seen in the IT domain and unlock more efficiency, innovation and affordability in the way embedded systems are built.