One of the greatest advantages of drones—for gathering intelligence, patrolling borders, doing weather research, or killing terrorists—is that they can be piloted by people who are on the ground and far away. They can do dangerous, difficult, tedious tasks without requiring the risk of human lives. For their critics, there is a flip side to this: Drones risk making it too easy to kill without perceived consequences, or spy, or monitor every instant of everyone’s lives.
Now there’s something new to worry about. If we can control our drones at a distance, what’s to ensure that someone else won’t do it, too? How easy would it be for someone to hijack a drone and, Svengali-like, get it to do what they wanted instead of carrying out its mission?
Not as hard as one might hope. That’s what a team led by Todd Humphreys, an assistant professor at the University of Texas at Austin and head of its Radionavigation Laboratory, proved last month. In a test at White Sands Missile Range, Humphreys and his team of graduate students took over the navigation system of a drone from a hilltop a kilometer away, in front of an audience of officials from the Department of Homeland Security, the U.S. Air Force, the Federal Aviation Administration, and others. The device they built to do this consisted of about $1,000 worth of off-the-shelf parts.
The technique they used is called GPS spoofing. Drones rely on positional data in everything they do, and Humphreys’s device worked by feeding the drones false coordinates. When pointed in a drone’s direction, the spoofing box is able to extrapolate the drone’s GPS coordinates. Then it sends out a signal aligned in such a way that the drone’s GPS receiver starts relying on the box instead of the signal coming from actual GPS satellites. That false positional information can then be altered at will. “It’s like in a car: If you tell someone they’re further along than they are and they have to drive by your instructions with their eyes closed, they’re going to turn into the wrong driveway,” Humphreys says.
At the White Sands demonstration, the team convinced the drone that it was rising off the ground too fast; in response it dropped like a stone, thinking it was compensating when in fact it was about to plow into the ground. (A backup human pilot intervened a few feet before an actual crash.) “You take over its sense of its own position and the autopilot does the rest,” Humphreys says.
Humphreys emphasizes that what he and his team did isn’t easy—it took them four years to develop and fine-tune the necessary software. “It’s not within the capability of your average Anonymous hacker,” he says. However, it is perhaps not beyond the capability of state actors or sophisticated terrorist groups—Iran proudly claimed it brought down a U.S. drone last December by spoofing its navigation system.
The problem, Humphreys argues, is that, unlike the military GPS system, the civilian one isn’t encrypted. Not only that, there’s nothing in the signals beamed down from GPS satellites that allows a receiver to tell whether a signal is real or fake. It’s as if dollar bills were just scraps of notepad paper with “one dollar” written on them in ballpoint pen—detecting counterfeits would be impossible.
What should we do? When Humphreys testifies before Congress this month, he’ll recommend altering the signals GPS satellites send so that they have a sort of digital watermark. That would allow receivers to distinguish fake signals from real ones, as watermarks in paper currency do. But while this is technologically straightforward, Humphreys says, politically it is not. Because of the mismatch of responsibilities and funding at the federal level, “My understanding is that we have to persuade the House Oversight Committee to persuade the Department of Homeland Security to fund the Department of Defense to make changes in the signals.” Even in a best-case scenario, that puts any changes several years in the future.
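The watermark idea boils down to attaching an unforgeable tag to each navigation message so receivers can reject anything that doesn't verify. A minimal sketch, using a shared-secret HMAC purely for illustration: a real broadcast system couldn't put a shared secret in every receiver and would need asymmetric signatures or a delayed-key-release scheme instead. The message format and key below are invented.

```python
import hmac
import hashlib

SECRET = b"demo-key"  # placeholder; stands in for real key infrastructure

def sign_fix(lat, lon, alt, t):
    """Tag a (hypothetical) position/time message with an HMAC 'watermark'."""
    msg = f"{lat:.6f},{lon:.6f},{alt:.1f},{t}".encode()
    tag = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return msg, tag

def verify_fix(msg, tag):
    """Accept the message only if the tag checks out."""
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

msg, tag = sign_fix(32.38, -106.48, 1200.0, 1341500000)
print(verify_fix(msg, tag))        # authentic message accepted
print(verify_fix(b"spoofed", tag)) # forged message rejected
```

A spoofing box without the signing key can still broadcast coordinates, but it can't produce a valid tag, which is the property Humphreys wants the satellite signals themselves to have.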
Meanwhile, he’d like to see requirements that drones above a certain size be required to have spoof-proof GPS receivers. Even without a digital watermark, there are subtle ways that false GPS signals give themselves away. “It’s not perfect, but even the military encryption isn’t perfect,” he says.
Humphreys has been sounding the alarm about the dangers of GPS spoofing for years. It’s not just a problem for drones: A growing number of technologies rely on GPS data, from programs that trade stocks to air-traffic monitoring. Last December, he performed an attack for the Department of Energy on the sensors that monitor electricity in real time across smart electricity grids.
But his drone-spoofing has drawn far more attention to the cause, he says. There’s something about things falling out of the sky that gets people’s attention. The FAA has promised to open the U.S. skies to civilian drones by 2015 and has estimated that by 2020 there could be 30,000 of them aloft. That’s a lot of potential flying zombies.