October 11, 2016
In 2015, security researchers Charlie Miller and Chris Valasek demonstrated that it was possible to remotely hack into a vehicle. They were able to gain full control of a Jeep Cherokee even with the driver inside. This is a terrifying prospect, and one that raises questions about where the digital world is heading when a hacker can literally take control of a vehicle away from the person driving it.
It is not always apparent that cars are connected to the internet, but this Jeep, along with many other vehicles, had internet connectivity as part of its entertainment center. That makes it a smart device; it is part of the Internet of Things (IoT). Once the hackers were inside the networked controls of the Jeep, they were able to usurp much of the vehicle’s functionality, from entertainment and climate control to the braking system.
Security for computer systems—and all IoT devices are computers—is a fundamental concern for businesses, consumers, and governments. The reality, however, is that security is not being considered in the design phase of most IoT devices, leaving them vulnerable to remote hacking. The need for certified security by design is a vital issue for the future of the IoT.
“Security by design is, from the start, how you as a systems engineer, architect, or designer are making sure your system will do the ‘right’ things,” says Professor Shiu-Kai Chin, a core faculty member of Syracuse University’s online Master of Science in Cybersecurity. “Certified means ‘prove it’—i.e., someone other than the designers or operators can verify the integrity, security, and trustworthiness of systems and operations.”
The lack of security by design in everyday devices is the result of how modern computing evolved. According to Professor Chin, it only took a generation-and-a-half to “forget how to design in security.” The solution has as much to do with culture as it does with technology.
Lessons from the Evolution of Computer Security
A good starting point is the 1970s, when computers became easier to use. During this time, the military and government were the primary users, so the need for security was paramount and built into the design process. Microcomputers appeared in the late 1970s and were succeeded by the personal computers of the 1980s. These PCs lacked connectivity to other computers, so there was little need for security. The 1990s brought connectivity over the internet, reviving the need for security.
However, the commoditization of computers, which has steadily increased with smartphones and tablets, led to an overarching need for mass, rapid, and cheap production. In this rush to sell products, the attention to security by design fell by the wayside—and the result is the almost daily occurrence of large-scale computer hacks.
Post-millennium, consumers have become more aware of the need for security, but the effects of the PC revolution and the loss of security by design reverberate in modern computing. Anti-malware and security products are more powerful, versatile, and accessible than ever, but they are still insufficient because they are not an intrinsic part of the design process.
The current attitude to security is especially evident in Internet of Things devices. Many internet-enabled products are relatively new to the market, and their mainstream adoption rate is only just beginning to pick up pace. As with the Jeep Cherokee, integral security measures are usually an afterthought at best and are, more often, completely ignored.
The matter is not helped by how much more complex and wide-reaching computers have become. Security is one aspect among many, and market forces often push it down the priority list. For a business designing a new product, the call for security can be easily lost among the clamor to add features and the need to make a profitable, attention-grabbing product.
A secure Internet of Things will require insistence on what Jerome Saltzer and Michael Schroeder wrote more than 40 years ago: “Every access to every object must be checked for authority … A foolproof method of identifying the source of every request must be devised.”1
Rethinking Design for IoT
The Internet of Things is as big a step forward today as the PC revolution was in the 1980s. However, inherent internet connectivity and a lack of user control invite remote hacking through unforeseen vulnerabilities. Security by design can be brought back into the manufacturing process, but it will require two things: that security is designed into new products and that this security is verified. The result would be certified security by design.
“Boiled down to their essence, both security by design and certified security by design amount to accountability in the digital age,” says Professor Chin, who proposed the idea in his paper titled “Certified Security by Design for the Internet of Things.”
The proposal combines two requirements into a single set of “rules of engagement” to standardize how security can be incorporated into the design process, and how it can be verified or certified at every step. During the certified security by design (CSBD) process, each step is reported and checked against the policies of the security design. To assure the veracity of the reports, they are examined by a third party, in the manner of auditors, and given documented approval.
Professor Chin likens the process to how we require verification from third parties in other parts of our lives: “If I deliver a hand-written note to you saying ‘here’s my proof,’ that’s equivalent to a CFO saying, ‘Trust me, all is OK with the business.’ We don’t accept this when evaluating the integrity of operations for publicly owned businesses. Our business culture requires independent auditors to give their opinions.”
For the audit to be successful and consistent, CSBD employs a mathematical model of logic, providing a clear set of parameters to determine whether any action a device takes is properly authorized.
Every action can be traced back to whoever implemented it and who gave authority for the implementation. It highlights who performed any given action, that action’s effects, and whether or not that action was taken with proper permission. This provides a level of transparency that shows who is responsible for every action, with no obfuscation—ensuring that the policy is followed consistently and scrupulously.
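This kind of reasoning can be made concrete with a small sketch. The snippet below is a hypothetical, heavily simplified illustration of an access-control check in the spirit of such logics, where a request is honored only if a principal both made a statement and has the authority (“controls”) to make it; all names and the data model are illustrative, not Professor Chin’s actual notation.

```python
# Hypothetical sketch of an access-control logic check: a request holds
# only if the policy says the principal CONTROLS the statement AND the
# principal actually SAID it. Names are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Statement:
    action: str  # e.g., "set_temperature"
    target: str  # e.g., "thermostat_42"

@dataclass(frozen=True)
class Principal:
    name: str

def is_authorized(principal, statement, policy, requests):
    """Grant only when both conditions hold, so every action is
    traceable to an authorized principal who requested it."""
    controls = (principal, statement) in policy    # authority exists
    says = (principal, statement) in requests      # request was made
    return controls and says

owner = Principal("owner")
stmt = Statement("set_temperature", "thermostat_42")
policy = {(owner, stmt)}    # the policy grants the owner authority
requests = {(owner, stmt)}  # the owner has issued the request

print(is_authorized(owner, stmt, policy, requests))                  # True
print(is_authorized(Principal("stranger"), stmt, policy, requests))  # False
```

Because both the policy entry and the request are explicit records, an auditor can trace any granted action back to who requested it and who authorized it.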
Of course, even the certification or verification steps need to be assured. To avoid human error, an “interactive theorem prover,” such as HOL4, can be used. This approach will reduce human error to input errors, handle large and complicated formulas, and most importantly, as Professor Chin wrote, provide “the ability of certifiers, third parties, and people other than the designers to reproduce and check verification results easily and quickly.”
Certified Security by Design in Action
Once the aims and methods of CSBD have been defined, they must be put into practice. Professor Chin’s research on the subject uses a thermostat as a case study—an appropriate example because it is a familiar, easily understood device.
Originally, the thermostat was a purely manual, single-function device, but the Internet of Things has allowed the modern home to be equipped with smart thermostats—network-enabled devices with a wide feature list. Remote hacking of such devices poses great concerns for privacy; a 2014 Black Hat presentation demonstrated how a Nest smart thermostat could be taken over.
A hypothetical design process to implement CSBD for a smart thermostat would involve three stages: an analysis phase, a planning phase, and an implementation phase. Through all of these stages, the historical function and use of an existing device is used to inform the design decisions for the new one. It starts with establishing the concept of operations, also known as CONOPS.
As defined by the Institute of Electrical and Electronics Engineers, a CONOPS expresses the “characteristics for a proposed system from a user’s perspective. A CONOPS also describes the user organization, mission, and objectives from an integrated systems point of view.” Using CONOPS as a starting point, the certified security by design process has three stages:
1. Analysis Phase
To start, designers consider the intended use of the product and its “operating assumptions”—or how users will interact with it. With a thermostat, there is a sharp contrast between the CONOPS of a traditional thermostat and an IoT thermostat.
A traditional thermostat is a manually operated device that controls temperature. Some may have monitoring functionality built in, but it is very much a non-networked “dumb” device. An IoT thermostat retains this functionality, but adds the option of remote control via a server, meaning anyone with permission can access and control it. This can include utility companies that use this access to operate electricity grids more efficiently. It is important that outside users gain access to the thermostat only if they have been authorized to do so.
2. Planning Phase
Juxtaposing the CONOPS of traditional and smart thermostats highlights the areas where new security controls need to be designed. With a traditional thermostat, it’s simple: The owner uses the thermostat’s manual controls to change the temperature. An IoT thermostat is more complicated:
- The owner uses the thermostat’s manual controls to change the temperature.
- The owner uses the thermostat’s network functionality—such as a control app—to change the temperature remotely.
- The utility company uses the thermostat’s network functionality, with the owner’s authorization, to change the temperature remotely.
3. Implementation Phase
Under a certified security by design process, designers would take what they learned in the analysis and planning stages and require the following security assurances:
- Ensure the server can identify the owner, usually through a username and password.
- Ensure the thermostat verifies that the instructions are coming from the correct server.
- Ensure the server verifies the identity of the utility company.
- Ensure the server verifies that the utility company has the correct and relevant permissions to access the server.
- Ensure the server verifies that the utility company’s permissions were correctly received from the owner of the thermostat.
- Ensure that no other entity is given access to the thermostat.
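The server-side portion of the checklist above can be sketched in a few lines. This is a minimal, hypothetical illustration only—credential handling, the thermostat’s verification of the server, and delegation of permissions would be far more involved in practice, and every name below is an invented placeholder rather than anything from Professor Chin’s paper.

```python
# Hypothetical sketch of the server-side assurances: identify the owner,
# verify the utility company, and confirm the owner delegated access.
# All names and the flat-dictionary "database" are illustrative only.

OWNERS = {"alice": "s3cret"}                     # owner credentials
UTILITIES = {"acme_power": "utility_key"}        # utility credentials
DELEGATIONS = {("acme_power", "thermostat_42")}  # owner-granted permissions

def authorize(requester, credential, device, is_utility=False):
    """Return True only for an authenticated owner, or for an
    authenticated utility holding a delegation for this device."""
    if is_utility:
        # Server verifies the utility company's identity...
        if UTILITIES.get(requester) != credential:
            return False
        # ...and that its permission was received from the owner.
        return (requester, device) in DELEGATIONS
    # Server identifies the owner directly (e.g., username/password).
    return OWNERS.get(requester) == credential
    # Any requester matching neither path is refused access.

print(authorize("alice", "s3cret", "thermostat_42"))                  # True
print(authorize("acme_power", "utility_key", "thermostat_42", True))  # True
print(authorize("mallory", "guess", "thermostat_42"))                 # False
```

Note that the default-deny structure mirrors the final assurance: any entity that is neither the authenticated owner nor a delegated utility is simply refused.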
The result of this process is that manufacturers and users understand how their smart devices work—it won’t guarantee flawless operation, but, according to Professor Chin, the rigorous process is highly beneficial.
“When things don’t work as expected, people with deep insight know where to look first and what to ignore,” he says. “In a world where success or failure is determined by who makes the best and most appropriate response or adaptation to surprise or changing circumstances, knowing what to pay attention to is a strategic advantage.”
The Way Forward
It will be a challenge to adopt this level of commitment in the design process, but the need for certified security by design goes beyond protecting personal devices.
The military increasingly relies on commoditized off-the-shelf consumer components. It has been suggested that an Israeli air attack on a Syrian nuclear installation succeeded because a “kill switch” had been built into components used by Syrian defenses, allowing the Israelis simply to turn off the Syrian radar system. In addition, critical infrastructure, such as the electricity supply grid, uses both hardware and software designed during the period when security was given scant regard. A series of large-scale electricity outages in Ukraine in December 2015 is believed to have been caused by Russian hackers.
Such examples will only increase unless security is baked into the design process. Computers have become such a fundamental part of our lives that we can no longer afford the current “let people hack it, then patch it” approach to security. Roger Schell, the “father” of the Trusted Computer System Evaluation Criteria, said in 1979, “It is not easy to make a computer system secure, but neither is it impossible. The greatest error is to ignore the problem.” So far, the problem has been largely ignored; but certified security by design offers a positive way forward.
“Simply put, engineering and computer science support society,” says Professor Chin. “Our job includes making sure that our products are safe, secure, and operate with integrity. Anything less is a dereliction of duty.”
If you are interested in learning more about our M.S. in Cybersecurity, M.S. in Computer Science, or M.S. in Computer Engineering online programs, please request information here.
1. Saltzer, Jerome, and Schroeder, Michael. “The Protection of Information in Computer Systems,” Proceedings of the IEEE, Volume 63, No. 9, September 1975.