Technology's Creeping Threats to Privacy


By Heather Green

The U.S., for better and worse, is open to novel technology and quick to adopt it. One downside of our fascination with the new is that we're often so busy adapting to change that we don't realize how much technology is whittling away at our expectation of privacy. Sure, we're shocked when we learn technology can track every move we make on the Internet. But that we can still be surprised by such a revelation shows just how antiquated our notion of offline privacy really is.

We still feel fairly comfortable with the idea that our privacy is protected when we're not online, because it seems impossible to monitor a person's actions all the time. It's a false sense of security. Bit by bit, our privacy is being eaten away as society becomes more digital -- through highway toll-collection services like E-ZPass, subway cards, and law-enforcement security programs.

Our increasingly networked world compounds the problem. As more information is collected, it can be easily stored and aggregated. That leads to a "creep factor": information collected or technology accepted for one use can easily be exploited in another way down the line. Information from a highway toll-payment system, for instance, could be interesting to car insurance companies.

COLLECTING KEYSTROKES. A couple of recent incidents provide food for thought. A federal judge in Newark, N.J., is considering defense motions to dismiss evidence collected by the FBI using supersecret technology that records computer keystrokes.

The defendant is hardly your typical privacy zealot. Nicodemo Scarfo Jr., the son of a jailed Mob boss, is accused of loansharking and illegal gambling. The FBI received a search warrant to plant technology on Scarfo's computer to capture a password it needed to decrypt a file collected in a previous search of Scarfo's business offices. Scarfo's lawyer and privacy advocates contend that without an understanding of the technology, it's hard to determine whether the FBI overstepped limits on wiretapping or violated Fourth Amendment rights.

The disturbing part here is that the FBI doesn't want to give away any information about the technology it planted on Scarfo's computer. In its response to the defense motions, the FBI invoked national-security issues, saying the technology, called a key logger system, is so highly classified that fewer than 30 people in the 27,000-person bureau actually know how it works. On Aug. 7, U.S. District Judge Nicholas Politan gave the FBI until Aug. 31 to provide a description of the key logger's operation. The parties in the case are under court order not to talk about the motions.

DANGEROUS PRECEDENT. Now, given the crimes Scarfo is accused of, some people would say the ends justify the means. That's the wrong approach. If the government devised technology that is essentially a wiretap -- even though it doesn't fit the traditional definition of a wiretap -- then acceptance of this kind of surveillance sets a dangerous precedent and opens the door to more egregious privacy violations in the future.

Why should the new technology be allowed to circumvent wiretap limits? In its response to the defense's motions, the government said the key logger isn't a wiretap because it didn't intercept any transmission of information from Scarfo's computer via a modem to someone on a network. However, if the key logger captured information as it was being sent to the modem, it could be considered a wiretap.

Why the concern? Warrants to use wiretaps are much more difficult to obtain and require more stringent filters on information collected. If the FBI decides not to give further details even to the judge, it will be hard to know just how invasive the technology is and when it is being used in the future. It would be difficult to track new innovations of this kind and impossible to know when they're being used in cases less egregious than alleged loansharking and illegal gambling.

FACES IN THE CROWD. A development farther south could also lead to creeping privacy invasion. On Aug. 2, the Tampa (Fla.) city council voted against a motion to end its contract with Visionics, a company that provides face-recognition software. Tampa first used similar technology in January during the Super Bowl, when it deployed cameras and sophisticated software to scan faces in the crowds and compare them with a database of digital mug shots of criminals. The city later installed the cameras in its nightlife district.

Such technologies are being adopted in the name of public safety. The Tampa police are using the face-recognition technology to track criminals and runaways. If a match isn't found within Tampa's database, the image is erased. But the use could easily be expanded.

Already, some motor vehicle departments and credit-card companies digitize photos, which could easily be linked to face-recognition systems. The DMVs in Colorado and Washington, D.C., are installing or considering using face-scanning systems to prevent identity theft. If deployment of the technology becomes more widespread, it could simply be used to track innocent citizens who aren't under suspicion.

Instead of being erased, the photos taken of crowds could be stored, without citizens even being aware they were being watched. What's to stop commercial companies from acquiring these public records and selling them to private detectives or divorce lawyers?

TIME FOR DEBATE. We don't know what limits to put on these technologies. We haven't begun to grasp how much technological developments over the past 10 years have subtly changed what we consider an acceptable trade-off between safety and privacy. But if we look back just a decade, the changes are clear.

Ten years ago, Scarfo would have kept his alleged gambling and loanshark accounts in a ledger in a file cabinet. The police would have obtained a search warrant and found only that information. Most likely, Scarfo or someone else would have been in the offices to witness the search and seizure. Now, the court is faced with trying to figure out the legality of technology the FBI refuses to describe and decide on the future implications of allowing the use of such secret snooping innovations.

When Tampa first used the face-recognition technology during the Super Bowl, the attendees weren't notified they were under constant, centralized surveillance. A decade ago, you would never even have imagined such a thing. But you could easily argue that the technology isn't that far removed from the stoplight cameras now prevalent everywhere.

It's time for a debate about what the limits of this kind of surveillance should be. We need to focus on what is and isn't acceptable when it comes to face-recognition cameras. We also need to reassess our understanding of search-and-seizure laws in a digital society. So much of this technology starts out as a limited public-safety implementation. Without limits established now, it's impossible to believe creep won't happen.

Green covers the Internet for BusinessWeek in New York

