Accountability and Identifiability

This material is based on collaborative research supported by the NSF Trustworthy Computing program through grants CNS-1016875 and CNS-1018557. (Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.)


The World Wide Web and other networked information systems provide enormous benefits by enabling access to unprecedented amounts of information. However, these systems also create significant problems: sensitive personal data are disclosed, confidential corporate data are stolen, copyrights are infringed, and databases owned by one government organization are accessed by members of another in violation of government policy. Such incidents occur so frequently that an incident must now be truly outrageous to be considered newsworthy. This project takes the view that, when security violations occur, it should be possible to punish the violators in some fashion.

Although "accountability" is widely agreed to be important and desirable, there has been little theoretical work on the subject; indeed, there does not even seem to be a standard definition of "accountability," and researchers in different areas use the term to mean different things. This project addresses these definitional issues, the relationship between accountability and other goals (such as user privacy), and the requirements (such as identifiability of violators and violations) for accountability in real-world systems. This clarification of the important notion of accountability will help propel a next generation of network-mediated interaction and services that users understand and trust.

The project's technical approach to accountability as an essential component of trustworthiness involves two intertwined research thrusts. The first thrust focuses on definitions and foundational theory. Intuitively, accountability is present in any system in which actions are governed by well-defined rules and violations of those rules are punished. Project goals are to identify ambiguities and gaps in this intuitive notion, to provide formal definitions that capture important accountability desiderata, and to explicate the relationships of accountability to well-studied notions such as identifiability, authentication, authorization, privacy, and anonymity. The second thrust focuses on analysis, design, and abstraction. The project studies fundamental accountability and identifiability requirements in real-world systems, both technological and social. One goal is to use the resulting understanding of the extent to which accountability is truly at odds with privacy and other desirable system properties to design new protocols with provable accountability properties. Building on that understanding and on insights gained in designing such protocols, the project also addresses fundamental tradeoffs and impossibility results concerning accountability and identifiability in various settings.


Personnel

Sarah Cortes, Debayan Gupta, Lila Ghemri, Jim Hendler, Aaron Johnson, Michael Mitzenmacher, Aurojit Panda, Jennifer Rexford, Michael Schapira, Aaron Segal, Gil Segev, Scott Shenker, Paul Syverson, Danny Weitzner, Georgios Zervas


Publications

  1. Aaron D. Jaggard, Aaron Johnson, Sarah Cortes, Paul Syverson, and Joan Feigenbaum, "20,000 In League Under the Sea: Anonymous Communication, Trust, MLATs, and Undersea Cables"
  2. Aaron D. Jaggard, Aaron Johnson, Paul Syverson, and Joan Feigenbaum, "Representing Network Trust and Using It to Improve Anonymous Communication"
  3. Joan Feigenbaum, Aaron D. Jaggard, and Rebecca N. Wright, "Open vs. Closed Systems for Accountability"
  4. Aaron D. Jaggard and Rebecca N. Wright, "Strange Bedfellows: How and When to Work with Your Enemy"
  5. Joan Feigenbaum, Aaron D. Jaggard, and Michael Schapira, "Approximate Privacy: Foundations and Quantification"
  6. Joan Feigenbaum, Aaron D. Jaggard, and Rebecca N. Wright, "Accountability as an Interface between Cybersecurity and Social Science"
  7. Debayan Gupta, Aaron Segal, Aurojit Panda, Gil Segev, Michael Schapira, Joan Feigenbaum, Jennifer Rexford, and Scott Shenker, "A New Approach to Interdomain Routing Based on Secure Multi-Party Computation"
  8. Joan Feigenbaum, Michael Mitzenmacher, and Georgios Zervas, "An Economic Analysis of User-Privacy Options in Ad-Supported Services"
  9. Joan Feigenbaum, Aaron D. Jaggard, Rebecca N. Wright, and Hongda Xiao, "Systematizing 'Accountability' in Computer Science"
  10. Joan Feigenbaum, Aaron D. Jaggard, and Rebecca N. Wright, "Towards a Formal Model of Accountability"
  11. Joan Feigenbaum, James Hendler, Aaron D. Jaggard, Daniel Weitzner, and Rebecca N. Wright, "Accountability and Deterrence in Online Life (Extended Abstract)"
  12. Joan Feigenbaum, "Accountability as a Driver of Innovative Privacy Solutions", in Privacy and Innovation Symposium Thought Pieces, Yale Law School Information Society Project, October 2010.


Talks

  1. "Accountability, Deterrence, and Identifiability," presented at the DIMACS/BIC/A4Cloud/CSA International Workshop on Trustworthiness, Accountability and Forensics in the Cloud (TAFC), 6 June 2013.
  2. "Privacy and Accountability in the Information Society," PAIS'12 keynote talk by Rebecca Wright, 30 March 2012.
  3. "Toward a Clearer Understanding of Accountability," presented at the NY Area Security and Privacy Day, 10 December 2010.


Posters

  1. "Accountability and Identifiability," presented at the SaTC PI meeting, 27-29 November 2012.


Sunday, June 16, 2013 at 00:03