We are anticipating an avalanche of programmable, networked devices as the rest of this decade unfolds. In this so-called "Internet of Things," it is common to associate a mobile app with each device, allowing it to be controlled or monitored through the public Internet. In the early history of the Internet, this was considered a stunt. In particular, at an Interop show in the 1990s, a remotely controlled toaster was exhibited on the show floor and we all laughed. Not so funny now.
In the rush to produce devices of this kind, manufacturers are cutting corners, grabbing open source software without much consideration for safety and security. They are thinking in terms of "one device, one app" without giving adequate thought to the interoperability of ensembles of devices from multiple makers. There are also gaps in coping with the configuration of many devices, recognizing the risks of network-based attacks, ensuring ease of use, and providing the ability to update software from a valid source.
It is vital that engineers and programmers recognize their responsibilities in this space. Millions will rely on these devices to perform safely, to resist abusive attack or incorporation into botnets, and to function even when the Internet isn't accessible. It is therefore irresponsible not to make every effort to assure that this reliance is not misplaced. Just as the famous Underwriters Laboratories has tested products in the past, a new version of that function is needed to increase the likelihood that consumers can trust these devices for safety, security, and privacy protection.
Some of this equipment will be installed in homes and manufacturing plants with the expectation of operation for periods that could be measured in decades. There is no doubt that the associated software will need to be maintained during that time, leaving purchasers to wonder whether the associated companies will still be around to service the systems during the lifetime of the product. Operating systems are updated with some frequency, and support for older versions is deliberately abandoned for understandable business reasons. Somehow these support issues will have to be addressed in the lifecycle planning for these products' manufacture and sale.
Software developers will need new tools to help them avoid exploitable bugs or, at least, to discover them before products are released into the wild. Programmers and systems engineers will need to feel empowered by ethical considerations to resist the release of products that do not meet standards of safety, reliability, privacy, and resilience. Indeed, standards need to be developed to address these issues. In some extreme cases, failure to address them may be considered flagrantly irresponsible and lead to penalties, assuming legislation supports this interpretation of responsibility.
Returning to the interoperability theme, it seems inevitable that the uses of these devices, especially in manufacturing plants, office buildings, and residences as well as smart cities, will produce pressure for communication standards at all layers of the architecture. The ability to manage and configure devices at scale will be significantly facilitated by adopting common standards that improve the operations and security of the resulting system. There are arguments for diversity to avoid common failure modes, and that notion should not be entirely discounted, but maintaining too many variations leads to insecurity and complexity that will not contribute to reliability.
We are entering an era in which software will make decisions for us that we once made for ourselves, whether we are thinking about self-driving cars, robotic manufacturing systems, or smart houses and cities. Increased reliance on the proper functioning of software should also increase demands for responsible engineering, lest we create a fragile future no one wants.