DIMACS Workshop on Complexity and Cryptography: Status of Impagliazzo's Worlds

June 3 - 5, 2009
Princeton University

Organizers:
Boaz Barak, Princeton University, boaz at CS.Princeton.edu
Salil Vadhan, Harvard University, salil at eecs.harvard.edu
Presented under the auspices of the Special Focus on Hardness of Approximation and the Special Focus on Communication Security and Information Privacy.

Abstracts:

Dimitris Achlioptas

Title: Algorithmic Phase Transitions in Constraint Satisfaction Problems

For many Constraint Satisfaction Problems, we now have a good understanding of the largest constraint density for which solutions exist. At the same time, though, all known polynomial-time algorithms for these problems stop finding solutions at much smaller densities. We study this phenomenon by examining how the set of solutions evolves as constraints are added. We prove, in a precise mathematical sense, that for each problem studied, the barrier faced by algorithms corresponds to a phase transition in that problem's solution-space geometry. Roughly speaking, at some problem-specific critical density, the set of solutions shatters, going from a single giant ball to exponentially many well-separated tiny pieces. All known polynomial-time algorithms work in the ball regime, but stop as soon as the shattering occurs. Besides giving a geometric view of the solution space of random CSPs, our results provide novel constructions of one-way functions.

Joint work with Amin Coja-Oghlan.
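
As a concrete illustration of the gap described above, the following small experiment (ours, not from the talk) generates random 3-SAT instances at various clause densities and runs a simple random-walk local search; in such experiments the walk tends to stop finding satisfying assignments well below the satisfiability threshold (roughly 4.27 clauses per variable for 3-SAT), consistent with the shattering picture.

    import random

    def random_3sat(n, alpha):
        # m = alpha * n random 3-clauses over n variables
        m = int(alpha * n)
        return [tuple(random.choice([v + 1, -(v + 1)])
                      for v in random.sample(range(n), 3))
                for _ in range(m)]

    def random_walk(clauses, n, flips=5000):
        # Papadimitriou-style walk: flip a variable from a random
        # unsatisfied clause until none remain or the budget runs out.
        assign = [random.random() < 0.5 for _ in range(n)]
        def sat(lit):
            return assign[abs(lit) - 1] == (lit > 0)
        for _ in range(flips):
            unsat = [c for c in clauses if not any(sat(l) for l in c)]
            if not unsat:
                return True
            v = abs(random.choice(random.choice(unsat))) - 1
            assign[v] = not assign[v]
        return False

    n = 100
    for alpha in (2.0, 3.0, 3.6, 4.2):
        wins = sum(random_walk(random_3sat(n, alpha), n) for _ in range(5))
        print(f"alpha={alpha}: solved {wins}/5 instances")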


Benny Applebaum

Title: Cryptography from Average-Case Hardness

We construct a public key cryptosystem based on the assumption that it is hard to solve n^{1.41} random 3-sparse equations modulo 2 in n variables, where an n^{-0.2} fraction of the equations are noisy.
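
A minimal sketch of how an instance of the assumed-hard problem can be sampled, using the parameters from the abstract (the variable names and toy size are ours):

    import random

    n = 1000                      # number of variables (toy size, ours)
    m = int(n ** 1.41)            # number of equations
    noise_rate = n ** -0.2        # fraction of noisy equations

    x = [random.randrange(2) for _ in range(n)]   # planted solution
    instance = []
    for _ in range(m):
        idx = random.sample(range(n), 3)          # 3-sparse equation mod 2
        rhs = x[idx[0]] ^ x[idx[1]] ^ x[idx[2]]
        if random.random() < noise_rate:
            rhs ^= 1                              # flip: a noisy equation
        instance.append((idx, rhs))
    # The assumption: given `instance` alone, recovering x (or even
    # distinguishing the instance from uniformly random equations) is hard.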

This assumption seems different from previous assumptions used to construct public key systems. In particular, to our knowledge, it is the first public key system of the "noisy equations" type with noise magnitude larger than 1/sqrt(n). We also show that our assumption is not refuted by certain natural algorithms such as Lasserre SDPs, low-degree polynomials, and AC0 circuits, and, using Feige's 3XOR principle, that the refutation variant of this assumption with the same parameters is implied by the hardness of refuting random 3CNFs with O(n^{1.41}) clauses.

We also construct a public key cryptosystem based on the hardness of the above problem with fewer equations (as few as c*n for a large constant c), at the expense of making an additional assumption about a planted dense subgraph problem in random 3-uniform hypergraphs.

Joint work with Boaz Barak and Avi Wigderson.


Craig Gentry

Title: A Homomorphic Public Key Cryptosystem

We propose a fully homomorphic encryption scheme, i.e., a scheme that allows one to evaluate circuits over encrypted data without being able to decrypt. Our solution comes in three steps. First, we provide a general result: to construct an encryption scheme that permits evaluation of arbitrary circuits, it suffices to construct an encryption scheme that can evaluate (slightly augmented versions of) its own decryption circuit; we call a scheme that can evaluate its (augmented) decryption circuit bootstrappable.

Next, we describe a public key encryption scheme using ideal lattices that is almost bootstrappable. Lattice-based cryptosystems typically have decryption algorithms with low circuit complexity, often dominated by an inner product computation that is in NC1. Also, ideal lattices provide both additive and multiplicative homomorphisms (modulo a public-key ideal in a polynomial ring that is represented as a lattice), as needed to evaluate general circuits.
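
To see the two homomorphisms and the role of noise in one self-contained toy, here is a deliberately simplified "noisy" symmetric scheme over the integers (ours, purely illustrative; it is not the ideal-lattice scheme of the talk): ciphertexts carry a small even noise term, and each homomorphic operation grows the noise, which is exactly what limits the evaluable depth. Bootstrapping, as in the first step above, is what resets this accumulated noise.

    import random

    P = 10007                        # secret odd modulus (toy parameter)

    def enc(m):
        q = random.randrange(1, 2**20)
        r = random.randrange(-1, 2)  # small noise in {-1, 0, 1}
        return P * q + 2 * r + (m & 1)

    def dec(c):
        r = c % P
        if r > P // 2:               # centered reduction mod P
            r -= P
        return r % 2                 # correct while |2*noise + m| < P/2

    a, b = enc(1), enc(0)
    assert dec(a + b) == (1 ^ 0)     # addition acts as XOR; noise adds
    assert dec(a * b) == (1 & 0)     # multiplication acts as AND; noise grows fast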

Unfortunately, our initial scheme is not quite bootstrappable: the depth that the scheme can correctly evaluate is logarithmic in the lattice dimension, just like the depth of its decryption circuit, but the latter is greater than the former. In the final step, we show how to modify the scheme to reduce the depth of the decryption circuit, and thereby obtain a bootstrappable encryption scheme, without reducing the depth that the scheme can evaluate. Abstractly, we accomplish this by enabling the encrypter to start the decryption process, leaving less work for the decrypter, much as the server leaves less work for the decrypter in a server-aided cryptosystem.


Iftach Haitner

Title: Inaccessible Entropy

We put forth a new computational notion of entropy, which measures the (in)feasibility of sampling high-entropy strings that are consistent with a given protocol. Specifically, we say that the i'th round of a protocol (A, B) has _accessible entropy_ at most k if no polynomial-time strategy A^* can generate messages for A such that its message in the i'th round has entropy greater than k when conditioned both on the prior messages of the protocol and on the prior coin tosses of A^*. We say that the protocol has _inaccessible entropy_ if the total accessible entropy (summed over the rounds) is noticeably smaller than the real entropy of A's messages, conditioned only on prior messages (but not the coin tosses of A). As applications of this notion, we

* Give a much simpler and more efficient construction of statistically
hiding commitment schemes from arbitrary one-way functions.

* Prove that constant-round statistically hiding commitments are
necessary for constructing constant-round zero-knowledge proof systems
for NP that remain secure under parallel composition (assuming the
existence of one-way functions).
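
For concreteness, the round-by-round quantity sketched above can be written as follows (notation ours; the formal definition also restricts A^* to messages consistent with an honest execution):

    % Accessible entropy of round i: the most entropy any efficient
    % cheating strategy A^* can put into its i'th message, given the
    % transcript so far and A^*'s own prior coins.
    \mathrm{accH}_i = \max_{\mathsf{A}^*\ \mathrm{PPT}}
        H\!\left(M_i \mid M_1, \dots, M_{i-1},\, R_{\mathsf{A}^*}\right)

    % Inaccessible entropy: the total accessible entropy falls noticeably
    % short of the real entropy of A's messages.
    \sum_i \mathrm{accH}_i \;\ll\; \sum_i H\!\left(M_i \mid M_1, \dots, M_{i-1}\right)
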

Russell Impagliazzo

Title: Five Worlds of Problems

The survey "A personal view of average-case complexity" introduced five possible worlds as metaphors for the outcomes of fundamental questions in complexity concerning whether and to what extent intractable problems exist. In this talk, I will try to put forward some concrete open problems addressing these issues.

I will emphasize those problems that I believe are within reach, in that there is no fundamental reason they should be difficult, but whose solutions would cast light on which of the five worlds is the case and what these worlds look like.


Yuval Ishai

Title: Efficiency vs. assumptions in secure computation

The talk will survey the state of the art in the complexity of secure computation, highlight connections with the topics of the workshop, and discuss the role of nonstandard assumptions in some of the current frontiers.


Moni Naor

Title: How Fair Can a Coin Toss Be?

Coin-flipping protocols allow mutually distrustful parties to generate a common unbiased random bit, guaranteeing that even if one of the parties is malicious, it cannot significantly bias the output of the honest party.

A classical result of Cleve (STOC 1986) shows that without simultaneous broadcast, for any two-party coin-flipping protocol with r rounds, there exists an efficient adversary that can bias the output of the honest party by Omega(1/r). However, the best previously known protocol (based on Blum's protocol) only guarantees bias O(1/sqrt(r)), and Cleve and Impagliazzo showed that this is optimal in the fail-stop model (where the adversary is limited to early stopping but has unbounded computational power).
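
For reference, here is a minimal sketch of one round of a Blum-style coin flip (ours; the commitment is a hash-based stand-in rather than the OWF-based commitment the results above refer to). The bias arises because a malicious committer can abort after seeing b:

    import os, hashlib, secrets

    def commit(bit):
        nonce = os.urandom(16)                    # hiding randomness
        return hashlib.sha256(nonce + bytes([bit])).hexdigest(), nonce

    def verify(com, bit, nonce):
        return com == hashlib.sha256(nonce + bytes([bit])).hexdigest()

    a = secrets.randbits(1)                       # party A's bit
    com, nonce = commit(a)                        # A commits to a
    b = secrets.randbits(1)                       # B replies in the clear
    assert verify(com, a, nonce)                  # A opens the commitment
    result = a ^ b                                # the common coin
    # A malicious A, having seen b, can refuse to open; repeating for r
    # rounds and taking majority gives the O(1/sqrt(r)) bias mentioned above.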

We establish the optimal tradeoff between the round complexity and the maximal bias of two-party coin-flipping protocols. Under standard assumptions, we show that Cleve's lower bound is tight: we construct an r-round protocol with bias O(1/r). We make use of recent progress by Gordon, Hazay, Katz and Lindell regarding fair protocols (STOC 2008).

Unlike Blum's coin-flipping protocol, which requires only the existence of one-way functions, our protocol (like the protocol of Gordon et al.) requires "cryptomania"-style assumptions. One of the main interesting questions is whether this is essential.

Joint work with Tal Moran and Gil Segev.


Chris Peikert

Title: Public-Key Cryptosystems from the Worst-Case Shortest Vector Problem

We construct public-key cryptosystems that are secure assuming the worst-case hardness of approximating the minimum distance on n-dimensional lattices to within small poly(n) factors. Prior cryptosystems with worst-case connections were based either on the shortest vector problem for a *special class* of lattices (Ajtai and Dwork, STOC 1997; Regev, J. ACM 2004), or on the conjectured hardness of lattice problems for *quantum* algorithms (Regev, STOC 2005).

Our main technical innovation is a reduction from variants of the shortest vector problem to corresponding versions of the "learning with errors" problem; previously, only a quantum reduction of this kind was known. As an additional contribution, we construct a natural *chosen-ciphertext-secure* cryptosystem having a much simpler description and tighter underlying worst-case approximation factor than prior schemes.
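
For readers unfamiliar with the target problem, here is what a "learning with errors" instance looks like (toy parameters ours; real instantiations use larger moduli and discrete Gaussian errors):

    import random

    n, m, q = 8, 16, 97          # dimension, samples, modulus (toy sizes)
    s = [random.randrange(q) for _ in range(n)]           # secret vector
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 0, 1]) for _ in range(m)]  # small errors
    b = [(sum(a_i * s_i for a_i, s_i in zip(row, s)) + e_i) % q
         for row, e_i in zip(A, e)]
    # Search-LWE: recover s from (A, b).
    # Decision-LWE: distinguish (A, b) from (A, uniform over Z_q^m).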


Amit Sahai

Title: A New Protocol Compiler

One of the most fundamental goals in cryptography is to design protocols that remain secure when adversarial participants can engage in arbitrary malicious behavior. In 1986, Goldreich, Micali, and Wigderson presented a powerful paradigm for designing such protocols: their approach reduced the task of designing secure protocols to designing protocols that only guarantee security against "honest-but-curious" participants. By making use of zero-knowledge proofs, the GMW paradigm enforces honest behavior without compromising secrecy. Over the past two decades, this approach has been the dominant paradigm for cryptographic protocol design.

In this talk, we present a new general paradigm / protocol compiler for secure protocol design. Our approach also reduces the task of designing secure protocols to designing protocols that only guarantee security against honest-but-curious participants. However, our approach avoids the use of zero-knowledge proofs, and instead makes use of multi-party protocols in a much simpler setting, one where the majority of participants are completely honest (such multi-party protocols can exist without requiring any computational assumptions). Our paradigm yields protocols that rely on Oblivious Transfer (OT) as a building block. This offers a number of advantages in generality and efficiency.
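
To make the OT building block concrete, here is a classic mini-example (ours, not the compiler itself): given an ideal 1-out-of-2 OT, two parties can obtain additive shares of the AND of their input bits, the basic step from which larger secure computations are assembled.

    import secrets

    def ideal_OT(m0, m1, choice):
        # stand-in for an ideal 1-out-of-2 oblivious-transfer functionality
        return m1 if choice else m0

    x = secrets.randbits(1)      # sender's input bit
    y = secrets.randbits(1)      # receiver's choice bit
    r = secrets.randbits(1)      # sender's fresh random mask
    # Sender offers r XOR (x AND 0) and r XOR (x AND 1); receiver picks with y.
    t = ideal_OT(r ^ (x & 0), r ^ (x & 1), y)
    assert (r ^ t) == (x & y)    # r (sender) and t (receiver) share x AND y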

In contrast to the GMW paradigm, by avoiding the use of zero-knowledge proofs, our paradigm is able to treat all of its building blocks as "black boxes". This allows us to improve over previous results in the area of secure computation. In particular, we obtain:

* Conceptually simpler and more efficient ways for basing
  unconditionally secure cryptography on OT.

* More efficient protocols for generating a large number of OTs using
  a small number of OTs.

* Secure and efficient protocols which only make a black-box use of
  cryptographic primitives or underlying algebraic structures in
  settings where no such protocols were known before.

This talk is based primarily on joint works with Yuval Ishai (Technion and UCLA) and Manoj Prabhakaran (UIUC).

