GrammaTalk

How Software Quality Threatens the Success of the Internet of Things

The Nest is a highly innovative, electronic smart home thermostat made by Nest Labs, a subsidiary of Alphabet (formerly Google). It is by all accounts a well-designed device: it replaces a traditional home thermostat and improves on it enormously by learning the habits of its users and adapting its settings to conserve energy. Its motion sensors detect when no one is home. It connects to the cloud so you can check in and make adjustments from the comfort of your sofa, or from the other side of the world.

Many users gush about how much they love the Nest. It’s a great example of how a well-designed Internet of Things (IoT) device can dramatically improve its users’ quality of life.

Unfortunately, it’s also a great example of how faulty software has the power to undermine the success of IoT devices. Last month, two problems were reported with the Nest.

The first problem reared its ugly head in early January, just as temperatures were plunging throughout the northeastern US. Many devices simply stopped working, prompting hordes of angry customers to vent furiously on social media (see Nest Thermostat Glitch Leaves Users in the Cold by Nick Bilton). The problem was traced to a bug introduced in a December software update that did not surface until a couple of weeks later. A workaround was found for some, but others were forced to physically disconnect the device and charge it with a USB cable before plugging it back in. The solution was particularly inconvenient (if not downright problematic) for those away from home.

The second issue was less immediately disruptive, but it illustrated a serious risk to which all IoT devices are potentially vulnerable: privacy. The issue was uncovered by a team of researchers at Princeton, who presented a paper at PrivacyCon 2016 (see The Internet of Unpatched Things by Sarthak Grover and Nick Feamster). They had purchased a number of home-automation devices, including the Nest, in order to study whether they had privacy problems. The conclusions were shocking: simply by listening to the network traffic, they found that most of the devices leaked user information, and sometimes even user activity and behavior. One of the worst offenders was a Sharx Security IP camera, which sent images to a cloud server without any attempt at encryption. The Nest thermostat was identified as having good security properties, especially compared to its peers, but it nonetheless leaked a small amount of information.

To their credit, the makers of the Nest thermostat plugged the leak very rapidly. Unfortunately for them, the message the Princeton study sent to consumers is that IoT devices are extremely susceptible to privacy issues and likely to be buggy, insecure, and untrustworthy. How do we know which devices can be trusted?

This illustrates a very serious problem facing the IoT industry, at least for consumer devices. The rules of the game for software development have changed, especially in contrast with desktop and web applications: even small defects can have huge consequences, and it is no longer feasible to rely on a rapid fix-and-patch cycle to address problems. Device reliability – and corporate liability – is now at the mercy of the software the device runs. The pressure is on software developers to ensure that the software they deliver is as close to bug-free as possible.

In the case of the first problem with the Nest thermostat, it isn’t clear what underlying software weakness caused the device to fail. A crashing bug like a null-pointer dereference or a buffer overrun could force such a device into a power-hungry crash-reboot cycle that drains the battery rapidly, but there are dozens of other classes of bugs that could lead to the same outcome. If it was a generic defect, an out-of-the-box configuration of a static analysis tool such as CodeSonar would likely have found it. If it was a domain-specific defect, then some additional configuration or a custom check could have found it. Either way, the key point is that the manufacturer had no test case that detected the bug.
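To make the point concrete, here is a minimal hypothetical sketch in C (not actual Nest code; all names are invented for illustration) of how such a defect hides from testing. The dereference below sits on an error-handling path that an ordinary test suite may never exercise, but a static analyzer considers that path regardless:

/* Hypothetical sketch -- not actual Nest firmware. The dereference below is
 * only reachable when the sensor read fails, a path easy to miss in testing. */
#include <stdio.h>

struct battery_status {
    int level_pct;
};

/* Returns NULL when the hardware read fails (e.g., mid-firmware-update). */
static struct battery_status *read_battery(void)
{
    /* ... hardware access elided; assume failure yields NULL ... */
    return NULL;
}

static void log_battery(void)
{
    struct battery_status *bat = read_battery();
    /* BUG: no NULL check. A static analyzer flags this dereference on the
     * failure path; a test suite that never simulates sensor failure won't. */
    printf("battery at %d%%\n", bat->level_pct);
}

int main(void)
{
    log_battery();
    return 0;
}

The fix is a one-line NULL check, but only if someone notices the failure path in the first place, and that is precisely what static analysis is good at.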

The strength of static analysis is that it can explore all possible ways in which software can execute, so it can (and regularly does) find bugs that are missed by conventional dynamic testing.

The security vulnerabilities found in the IoT devices tested by the Princeton group were all violations of a key principle of secure programming: sensitive information should be protected while in transit. It is hard to avoid the conclusion that some of those manufacturers simply weren’t concerned about their users’ privacy, and I doubt I could persuade them to use static analysis to find and fix those problems.

But for manufacturers who do care about privacy, static analysis can help a programmer understand how information flows through a program. CodeSonar’s hazardous information flow analysis (also known as taint analysis) can identify flows that result in sensitive information being leaked inadvertently.
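As a rough illustration of what taint analysis tracks (the function names here are invented for the example; this is not a real device API or actual CodeSonar output), consider a sensitive “source” whose data reaches a network “sink” without passing through encryption:

/* Hypothetical sketch of a source-to-sink taint flow. All names are
 * illustrative; this is not a real device API. */
#include <stdio.h>

/* Source: sensitive user data (reveals that nobody is home). */
static void get_occupancy_state(char *buf, size_t len)
{
    snprintf(buf, len, "occupancy=away");
}

/* Sink: stand-in for a cleartext network write such as send(). */
static void cleartext_send(const char *data)
{
    printf("sending in the clear: %s\n", data);
}

int main(void)
{
    char msg[64];
    get_occupancy_state(msg, sizeof msg);   /* data becomes tainted here */
    /* LEAK: tainted data reaches the network sink with no encryption on
     * the path. Taint analysis reports this source-to-sink flow; the fix
     * is to encrypt (e.g., send over TLS) before the data leaves the device. */
    cleartext_send(msg);
    return 0;
}

A taint analyzer marks the data as tainted at the source and reports any path on which it reaches the sink without an intervening sanitizer, in this case an encryption step.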

Tools that find bugs automatically are your friend. Yes, there is a cost to acquire and deploy them, but the investment is more than worth it. Protecting user privacy in the IoT era keeps your devices trustworthy, and protects the reputation of your brand.
