Freedom to Tinker

Research and expert commentary on digital technologies in public life
  • August 22, 2014

    Airport Scanners: How Privacy Risk Leads to Security Risk

    Debates about privacy and security tend to assume that the two are in opposition, so that improving privacy tends to degrade security, and vice versa. But often the two go hand in hand so that privacy enhances security. A good example comes from the airport scanner study I wrote about yesterday. One of the failure […]

  • August 21, 2014

    Researchers Show Flaws in Airport Scanner

    Today at the Usenix Security Symposium a group of researchers from UC San Diego and the University of Michigan will present a paper demonstrating flaws in a full-body scanning machine that was used at many U.S. airports. In this post I’ll summarize their findings and discuss the security and policy implications. (The researchers offer a […]

  • August 13, 2014

    The End of a Brief Era: Recent Appellate Decisions in “Copyright Troll” Litigation

    The onslaught of “copyright troll” litigation began only a few years ago, with lawsuits implicating hundreds or even thousands of “John Doe” defendants, who were identified by IP addresses with timestamps corresponding to alleged uses of BitTorrent services to share and download video content without authorization. Recently, federal appellate opinions confirmed a growing consensus in district […]

  • August 8, 2014

    Princeton likely to rescind grade deflation policy

    A Princeton faculty committee recommended yesterday that the university rescind its ten-year-old grading guideline that advises faculty to assign grades in the A range to at most 35% of students. The committee issued a report explaining its rationale. The recommendation will probably be accepted and implemented. It’s a good report, and I agree with its […]

  • August 7, 2014

    Criminal Copyright Sanctions as a U.S. Export

    The copyright industries’ mantra that “digital is different” has driven an aggressive, global expansion in criminal sanctions for copyright infringement over the last two decades. Historically speaking, criminal penalties for copyright infringement under U.S. law date from the turn of the 20th century, which means that for over a hundred years (from 1790 to 1897), […]

  • August 7, 2014

    The hidden perils of cookie syncing

    [Steven Englehardt is a first-year Ph.D. student in the computer security group at Princeton. In this post he talks about the implications of a recent study that we published in collaboration with researchers at KU Leuven, Belgium. — Arvind Narayanan] Online tracking is becoming more sophisticated and thus increasingly difficult to block. Modern browsers expose many surfaces that enable users […]

  • July 31, 2014

    Why were CERT researchers attacking Tor?

    Yesterday the Tor Project issued an advisory describing a large-scale identification attack on Tor hidden services. The attack started on January 30 and ended when Tor ejected the attackers on July 4. It appears that this attack was the subject of a Black Hat talk that was canceled abruptly. These attacks raise serious questions about […]

  • July 30, 2014

    Are We Rushing to Judgment Against the Hidden Power of Algorithms?

    Several recent news stories have highlighted the ways that online social platforms can subtly shape our lives. First came the news that Facebook has “manipulated” users’ emotions by tweaking the balance of happy and sad posts that it shows to some users. Then, this week, the popular online dating service OKCupid announced that it had […]

  • July 16, 2014

    A Scanner Darkly: Protecting User Privacy from Perceptual Applications

    “A Scanner Darkly”, a dystopian 1977 Philip K. Dick novel (adapted to a 2006 film), describes a society with pervasive audio and video surveillance. Our paper “A Scanner Darkly”, which appeared in last year’s IEEE Symposium on Security and Privacy (Oakland) and has just received the 2014 PET Award for Outstanding Research in Privacy Enhancing Technologies, takes a closer look at the soon-to-come world where ubiquitous surveillance is performed not by the drug police but by everyday devices with high-bandwidth sensors.

    The age of perceptual computing is upon us. Mobile phones and laptops, programmable robots such as iRobot Create, gaming devices such as Microsoft Kinect, and augmented reality displays such as Google Glass are built around cameras and microphones, enabling software apps to “see” their physical environment. Some of these apps are mundane – motion detectors, enhanced video-chat programs, ball-chaser apps for robotic dogs – yet others are Star Trek stuff: natural user interfaces that react to gestures and sounds, sophisticated face recognizers, even room-wide, ambient, context-aware systems.

    The Starship Enterprise was not running untrusted apps, though. Modern perceptual computing platforms do. Mobile and robot operating systems, Kinect, and Google Glass all encourage independent developers to create software for their respective app stores. What could possibly go wrong? The security and privacy risks of a malicious or buggy app with unlimited camera and microphone access are obvious. And yes, some of these devices are capable of moving around on their own: think of a third-party app turning a robotic pet into a roving spy camera.

    First, consider overcollection of data. Many perceptual apps (e.g., augmented-reality browsers) transmit the entire camera feed to the server for image recognition. It’s nice of them to be cognizant of users’ electric bills, but streaming high-def visuals of users’ rooms to a random app provider may have unpleasant privacy implications. What types of objects do they recognize? How? What if the camera accidentally captures a face, a computer screen, a drug label, a credit card, a license plate? What happens to the images afterwards — are they kept somewhere intentionally or accidentally?

    Second, continuous aggregation of information (for example, a security app surveilling the same room for hours at a time) raises entirely new categories of privacy risks. Even high-level, abstract information about visual scenes is risky when aggregated over time. Consider a “skeleton recognizer” app that detects the presence of a person but cannot see faces, individual items, etc. Even such a restricted app can infer that there are two individuals in the room, observe their movement and proximity patterns, etc. [*]

    Darkly is our first attempt to map out the road towards a new field of systems research: privacy-preserving perceptual computing. Darkly is a multi-layered, domain-specific (this is unusual!) privacy protection system that leverages the structure of perceptual software to insert protection at the platform level. Our key observation is that virtually all applications access perceptual sensors through the platform’s abstract API. This is not surprising. It is cumbersome for a developer of a computer vision application to code algorithms such as motion detection when operating on raw pixels; much easier to employ vision libraries such as OpenCV. Similarly, Microsoft’s Kinect SDK provides library functions for detecting the outline of a human body to make it easier to write gesture-controlled applications.

    These APIs are exactly where Darkly intercepts the app’s sensor accesses and inserts multiple privacy protection layers. Darkly runs as part of the platform and thus at a higher privilege level than the untrusted apps (a rough analogy is system call interposition in operating systems). This makes privacy protection transparent and requires no changes to the apps’ code. Our Darkly prototype is integrated with OpenCV, a popular computer vision library that runs on many mobile and robot platforms.

    The first layer of protection in Darkly is access control. Darkly replaces pointers to raw pixel data with opaque references that have the same format but cannot be dereferenced by applications. OpenCV functions dereference them internally and thus operate on raw pixels without any loss of fidelity. It turns out that most of our benchmark OpenCV applications still work correctly without any modifications because they never access raw pixels; they just pass pixel pointers back and forth to OpenCV library functions (this is a testament to the richness of functionality supported by OpenCV). Darkly also provides trusted GUI and storage APIs that allow an app – for example, a remotely operated security cam – to display captured images to the user, operate on user input, and store images without being able to read them.
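
    To make the opaque-reference pattern concrete, here is a minimal C++ sketch. It is not Darkly’s actual code or API; the names (OpaqueImage, TrustedLayer, capture, blur, show) are hypothetical. It only illustrates the idea of handing the app a handle it cannot dereference, while the trusted layer maps the handle to real pixels and calls OpenCV internally.

        // Illustrative sketch only: hypothetical names, not Darkly's real interface.
        #include <opencv2/opencv.hpp>
        #include <unordered_map>

        struct OpaqueImage { int id; };               // all the untrusted app ever sees

        class TrustedLayer {                          // runs at a higher privilege level than apps
            std::unordered_map<int, cv::Mat> table_;  // opaque handle -> real pixel data
            int next_ = 0;
            OpaqueImage store(cv::Mat m) { table_[next_] = m; return OpaqueImage{next_++}; }
        public:
            OpaqueImage capture() {                   // e.g., grab a camera frame (placeholder here)
                return store(cv::Mat(480, 640, CV_8UC3, cv::Scalar(0, 0, 0)));
            }
            // A library call the app may invoke; dereferencing happens inside the trusted layer.
            OpaqueImage blur(OpaqueImage in) {
                cv::Mat out;
                cv::GaussianBlur(table_.at(in.id), out, cv::Size(5, 5), 0);
                return store(out);
            }
            void show(OpaqueImage in) {               // trusted GUI API: display without read access
                cv::imshow("app output", table_.at(in.id));
                cv::waitKey(1);
            }
        };

        int main() {
            TrustedLayer darkly;
            OpaqueImage img = darkly.blur(darkly.capture());
            darkly.show(img);   // the app can display or pass the handle on, but never reads pixels
            return 0;
        }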

    Some apps are only interested in certain image features: for example, a security cam may need object contours to detect movement, while a QR code scanner needs the black-and-white matrix. For these apps, algorithmic transforms remove individual details, leaving only simple shapes. As mentioned above, this may not prevent inferential privacy breaches!
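
    As a rough illustration of this kind of transform (not the transform Darkly actually ships), the following snippet uses standard OpenCV calls to release only object contours from a frame, discarding the color and texture that could reveal faces, screens, or labels.

        // Rough illustration of a contour-only transform; not Darkly's actual implementation.
        #include <opencv2/opencv.hpp>
        #include <vector>

        // Return an image containing only object outlines from the input frame.
        cv::Mat contours_only(const cv::Mat& frame) {
            cv::Mat gray, edges;
            cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
            cv::Canny(gray, edges, 50, 150);          // edge map drops color and texture

            std::vector<std::vector<cv::Point>> contours;
            cv::findContours(edges, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

            cv::Mat out = cv::Mat::zeros(frame.size(), CV_8UC1);
            cv::drawContours(out, contours, -1, cv::Scalar(255));  // simple shapes only
            return out;   // enough for motion detection, far less than the raw feed
        }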

    Finally, some apps – for example, eigenface-based face recognizers – do need access to raw pixels. For such apps, Darkly provides a special ibc language and a runtime sandbox for privacy-preserving image processing. ibc is based on GNU bc and is an almost pure computation language, with no access to system calls, network, system time, etc. Consequently, it is easy to sandbox, yet can be used to implement many image processing algorithms. ibc programs are allowed to return a single 32-bit value from the sandbox, reducing the risk of accidental data overcollection.
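
    The narrow output channel is the key design point. Below is a hedged C++ sketch of that idea alone: a hypothetical run_sandboxed helper (not ibc or its real runtime) accepts an app-supplied pure computation over the pixels and releases only a single 32-bit integer back to the app.

        // Sketch of the single-32-bit-return idea; hypothetical interface, not the real ibc runtime.
        #include <opencv2/opencv.hpp>
        #include <cstdint>
        #include <functional>

        // An app-supplied, side-effect-free computation over raw pixels.
        using PixelProgram = std::function<int32_t(const cv::Mat&)>;

        int32_t run_sandboxed(const cv::Mat& frame, const PixelProgram& program) {
            // A real system would interpret the program in isolation (Darkly uses its ibc runtime);
            // here we only model the narrow output channel: exactly one int32 escapes.
            return program(frame);
        }

        int main() {
            cv::Mat frame(480, 640, CV_8UC1, cv::Scalar(0));
            // Example app computation: count pixels brighter than a threshold.
            int32_t bright = run_sandboxed(frame, [](const cv::Mat& img) {
                return static_cast<int32_t>(cv::countNonZero(img > 200));
            });
            (void)bright;   // the app learns one number, never the pixels themselves
            return 0;
        }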

    The final layer of protection in Darkly is user audit. Darkly visually shows the user what the app “sees,” now and over time. For algorithmic transforms, the privacy dial lets the user set the degree of transformation on a scale from 0 to 11, thus changing the amount of information released to the app.

    There’s more in the paper: quantifying the trade-offs between privacy and utility, a few ideas on privacy-preserving transforms, explanations of why generic privacy technologies are unlikely to solve the problem, and so on. We plan to release the source code of Darkly soon.

    We hope that our work on Darkly will motivate the designers of perceptual computing platforms to start providing built-in privacy protection mechanisms as an integral part of their APIs, SDKs, and OSes. From a research perspective, we also hope that Darkly will generate interest in the nascent field of privacy-preserving perceptual computing. There are numerous interesting research problems that need to be solved here: designing new privacy transforms, metrics for measuring their effectiveness, visualization of other sensor data (e.g., microphone) for user audit, etc. Please get in touch with us if you are interested in any of these problems.

    [*] While Darkly addresses many types of privacy problems associated with perceptual applications, this type of inference isn’t one of them.


  • July 11, 2014

    “Loopholes for Circumventing the Constitution”, the NSA Statement, and Our Response

    CBS News and a host of other outlets have covered my new paper with Sharon Goldberg, Loopholes for Circumventing the Constitution: Warrantless Bulk Surveillance on Americans by Collecting Network Traffic Abroad. We’ll present the paper on July 18 at HotPETS [slides, pdf], right after a keynote by Bill Binney (the NSA whistleblower), and at TPRC in September. Meanwhile, the NSA has responded to our paper in a clever way that avoids addressing what our paper is actually about.

    In the paper, we reveal known and new legal and technical loopholes that enable internet traffic shaping by intelligence authorities to circumvent constitutional safeguards for Americans. The paper is in some ways a classic exercise in threat modeling, but what’s rather new is our combination of descriptive legal analysis with methods from computer science. Thus, we’re able to identify interdependent legal and technical loopholes, mostly in internet routing. We’ll definitely be pursuing similar projects in the future and hope we get other folks to adopt such multidisciplinary methods too.

    As to the media coverage, the CBS News piece contains some outstanding reporting and an official NSA statement that seeks – but fails – to debunk our analysis:

    However, an NSA spokesperson denied that either EO 12333 or USSID 18 “authorizes targeting of U.S. persons for electronic surveillance by routing their communications outside of the U.S.,” in an emailed statement to CBS News.

    “Absent limited exception (for example, in an emergency), the Foreign Intelligence Surveillance Act requires that we get a court order to target any U.S. person anywhere in the world for electronic surveillance. In order to get such an order, we have to establish, to the satisfaction of a federal judge, probable cause to believe that the U.S. person is an agent of a foreign power,” the spokesperson said.

    The NSA statement sidetracks our analysis by re-framing the issue to construct a legal situation that conveniently evades the main argument of our paper. Notice how the NSA concentrates on the legality of targeting U.S. persons, while we argue that these loopholes exist when i) surveillance is conducted abroad and ii) the authorities do not “intentionally target a U.S. person.” The NSA statement, however, only talks about situations in which U.S. persons are “targeted” in the legal sense.

    As we describe at length in our paper, there are several situations in which authorities don’t intentionally target a U.S. person according to the legal definition, but the internet traffic of many Americans can in fact be affected. The best evidence of that point came a few days after we released our paper, in a Washington Post piece that sources original NSA documents on presumed foreignness – confirming exactly what we outline in our paper. Concrete examples include untargeted bulk surveillance (for instance based on non-personal “selectors” or search terms) and the fact that data collected abroad may be presumed foreign. Another clear-cut example is conducting surveillance for a particular policy objective, such as “cybersecurity”.

    In addition, data on Americans may be retained and further processed when it was “incidentally” or “inadvertently” collected through surveillance that did not have the goal of “targeting a U.S. person” in the legal sense. Quoting the recent Washington Post piece:

    Nine of 10 account holders found in a large cache of intercepted conversations, which former NSA contractor Edward Snowden provided in full to The Post, were not the intended surveillance targets but were caught in a net the agency had cast for somebody else.

    This issue has already received a lot of attention over the last few months, but this high percentage is new: the personal information of all these account holders may be collected and retained, even though the surveillance operation was not intentionally targeting a U.S. person according to the legal definition. As so often happens in law, the legal language on the books may obscure what is really going on on the ground.

    Another point to emphasize is that those “limited exceptions (for example, an emergency)” from the NSA statement are outlined in USSID 18 section 4.1, and in fact span four heavily redacted pages. It’s quite impossible to tell what lies beneath those redactions – beginning on page 11 of our paper, we make a start and highlight which passages are particularly important to declassify or include in FOIA requests. In any event, it’s quite a stretch to brand four full pages of exceptions – which add up to dozens of actual situations – as “limited”.

    Bruce Schneier’s blog post is also worth reading. The expert discussion below his post really captures what blogging is all about.

    Our paper is still a work in progress. In addition to adding recently disclosed information (such as Greenwald’s book and the Washington Post piece), we’ll spend more time analyzing the solutions at hand – from technical, policy, and legal perspectives. The Guardian reports that the U.S. Government’s Privacy and Civil Liberties Oversight Board (PCLOB) will decide on July 23rd whether it will review EO 12333; hopefully the PCLOB will take note of our work so far. In any event, your comments, here or by email, are more than appreciated.

