DIMACS Workshop on Security Analysis of Protocols

June 7 - 9, 2004
DIMACS Center, CoRE Building, Rutgers University, Piscataway, NJ

Organizers:
John Mitchell, Stanford, mitchell@cs.stanford.edu
Ran Canetti, IBM Watson, canetti@watson.ibm.com
Presented under the auspices of the Special Focus on Communication Security and Information Privacy.

Abstracts:


Gergei Bana, University of Pennsylvania

Title: Computational and Information-Theoretic Soundness and Completeness of the Expanded Logics of Formal Encryption

We consider expansions of the Abadi-Rogaway logic of indistinguishability of formal cryptographic expressions. The language of the Abadi-Rogaway logic uses a box as formal notation for indecipherable strings. In the expanded formalism we adopt, different kinds of boxes are allowed, which - loosely speaking - correspond to different kinds of indecipherable strings. We establish soundness and completeness for a variety of interpretations. Two such interpretations are discussed in detail: a purely probabilistic one that interprets formal expressions via the One-Time Pad, and a computational-complexity-based one in so-called type 2 (which-key revealing) cryptosystems. We present a new, general technique for proving completeness. The work has been carried out in collaboration with P. Adao and A. Scedrov.
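
As a small illustration of the box idea (the expressions are ours, in notation loosely following the Abadi-Rogaway pattern construction):

    M = (\{0\}_{K_1},\ \{1\}_{K_1},\ K_2), \qquad N = (\{1\}_{K_1},\ \{0\}_{K_1},\ K_2).

Since K_1 cannot be recovered from either expression, both reduce to the same pattern (\Box, \Box, K_2), so M and N are formally equivalent. In the expanded logics the boxes additionally carry indices (for example, the plaintext length or the encrypting key), reflecting the information the cryptosystem is allowed to leak.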

In the case of the One-Time Pad, a natural expansion of the Abadi-Rogaway logic involves boxes indexed by the length of the encrypted message (and the matching length of the encrypting key). On the one hand, we define the formal equivalence of expressions with the help of such boxes. On the other hand, we declare the One-Time-Pad realizations of two formal expressions equivalent iff their probability distributions are identical. We show both soundness and completeness for this interpretation.
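
For concreteness, the underlying One-Time-Pad fact (standard, stated here in our own notation) is that for a fixed length \ell and a uniformly random key K \in \{0,1\}^\ell,

    \Pr[\, m \oplus K = c \,] = 2^{-\ell} \qquad \text{for every } m, c \in \{0,1\}^\ell,

so the ciphertext distribution is uniform and independent of the plaintext m. Any two expressions that differ only in the contents of their length-\ell boxes therefore induce identical distributions, which is exactly what the length-indexed boxes are meant to capture.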

In the case of type 2 cryptosystems, we first show soundness of the Abadi-Rogaway interpretation for an expanded logic that includes a number of distinct boxes, in this case one for each key. We also show completeness of such an expanded logic for computationally confusion-free, which-key revealing cryptosystems. By "which-key revealing", we mean that there exists a probabilistic polynomial-time adversary that can distinguish two pairs of oracles, one always encrypting with the same key and the other encrypting with different keys. We also assume that decryption with the wrong key is computationally distinguishable from correct decryption; this is what we call computational confusion-freeness.

Our technique for proving completeness differs from the method that Micciancio and Warinschi used in the case of type 0 cryptosystems (which hide all information, including the length of the message and the key used to encrypt). Given two formal expressions and the assumption that their (computational or information-theoretic) interpretations are equivalent, we analyze the formal expressions to find the matching keys and ciphertexts within the probabilistic samples from the equivalent interpretations.


Iliano Cervesato, NRL

Title: Fine-Grained MSR Specifications for Quantitative Security Analysis

The traditional Dolev-Yao model of security limits attacks to ``computationally feasible'' operations. We depart from this model by assigning a cost to protocol actions, both of the Dolev-Yao kind and non-traditional forms such as computationally hard operations, guessing, principal subversion, and failure. This quantitative approach enables evaluating protocol resilience to various forms of denial of service, guessing attacks, and resource limitation. While the methodology is general, we demonstrate it through a low-level variant of the MSR specification language.
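
As a toy illustration of the cost-based view (the action names and cost values below are hypothetical and are not MSR syntax), a denial-of-service imbalance shows up as a gap between the attacker's and the responder's total costs:

    # Toy cost model: compare the attacker's cost of mounting a flooding
    # attack with the responder's cost of serving it (hypothetical values).
    COSTS = {
        "send_message": 1,        # cheap Dolev-Yao action
        "fresh_nonce": 1,
        "verify_signature": 50,   # computationally hard operation
        "guess_secret": 10_000,   # guessing modeled as very expensive
    }

    def cost(actions):
        """Total cost of a sequence of protocol actions."""
        return sum(COSTS[a] for a in actions)

    # An attacker replaying cheap messages forces the responder into
    # expensive verifications: a denial-of-service imbalance.
    attacker = ["fresh_nonce", "send_message"]
    responder = ["verify_signature"]
    print(cost(attacker), cost(responder))   # 2 vs. 50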


Juan Garay, Bell Labs - Lucent Technologies

Title: A Framework for Fair Multi-Party Computation

We study the problem of constructing secure multi-party computation (MPC) protocols that are *completely fair* --- meaning that either all the parties learn the output of the function, or nobody does --- even when a majority of the parties are corrupted. We first propose a framework for fair multi-party computation (FMPC), within which we formulate a definition of secure and fair protocols. The FMPC framework is a variation of the Universal Composability framework, but with modifications so that the ideal process is (intuitively) fair. The definition follows the standard simulation paradigm, but is modified to allow the protocol to depend on the running time of the adversary. In this way, we avoid a well-known impossibility result for fair MPC with corrupted majority; in particular, our definition admits constructions that tolerate up to (n-1) corruptions, where n is the total number of parties. Next, we define a ``commit-prove-fair-open'' functionality and construct an efficient protocol that realizes it, using a new variant of a cryptographic primitive known as ``time-lines.'' With this functionality, we show that some of the existing secure MPC protocols can be easily transformed into fair protocols while preserving their security.
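
To give a flavor of the mechanism behind time-lines, here is a toy sketch of the repeated-squaring idea in the spirit of time-lock puzzles; the actual time-line variant used in the paper is more structured, and the parameters below are far too small for real use:

    # Whoever knows phi(N) can jump ahead on the "time-line" g, g^2, g^4, ...
    # with a single modular exponentiation; anyone else must perform T
    # sequential squarings, which is what ties opening to elapsed time.
    p, q = 10007, 10009                 # toy primes; real use needs large ones
    N, phi = p * q, (p - 1) * (q - 1)
    g, T = 3, 1000                      # base and "time" parameter

    fast = pow(g, pow(2, T, phi), N)    # trapdoor path: exponent reduced mod phi(N)

    slow = g                            # forced-opening path: T sequential squarings
    for _ in range(T):
        slow = (slow * slow) % N

    assert fast == slow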

Joint work with Phil MacKenzie (Bell Labs) and Ke Yang (CMU).


Joshua Guttman, MITRE
F. Javier Thayer

Title: Authentication Tests: Analyzing and Designing Cryptographic Protocols

In this talk, we use the strand space formalism to study cryptographic protocols. We present a widely applicable method, which we call the authentication test method, to determine exactly what authentication and secrecy goals a protocol achieves. We will also explain how to use the same ideas as a heuristic to create new (demonstrably correct) protocols, developing a new electronic commerce protocol as an example.
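
For a flavor of the method (our paraphrase, using the familiar Needham-Schroeder-Lowe exchange as the example): the initiator A emits a fresh nonce N_a only inside the ciphertext of

    A \to B : \{N_a, A\}_{K_B}

and later receives N_a outside that ciphertext, in

    B \to A : \{N_a, N_b, B\}_{K_A}.

Since N_a originated uniquely under B's public key and, assuming B's private key is uncompromised, only B could have extracted it, the outgoing authentication test concludes that a regular strand of B performed the transformation; this is the authentication guarantee A obtains, read off directly from the message structure.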


Jonathan Herzog, MIT

Title: Universally Composable Symbolic Analysis of Cryptographic Protocols

We demonstrate how Dolev-Yao style symbolic analysis can be used to assert the security of cryptographic protocols within the universally composable (UC) security framework. Consequently, our methods enable security analysis that is cryptographically sound with strong composability properties, and at the same time completely formal.

More specifically, we show how to translate a protocol in the UC framework to a corresponding symbolic protocol, while guaranteeing that if the symbolic protocol satisfies a certain symbolic condition then the original protocol is UC-secure. We concentrate on encryption-based mutual authentication and key exchange protocols. For the first task, our results can be regarded as lifting the Micciancio-Warinschi (TCC 2004) analysis to the UC framework. For key exchange, we demonstrate that the traditional Dolev-Yao style symbolic condition is insufficient, and formulate an adequate symbolic condition.

Unlike prior work, we concentrate on analyzing single-instance protocols, relying on the UC theorem for obtaining security of the multi-instance case. This considerably simplifies our analysis and is also conducive to more modular analysis of higher-level protocols.

Joint work with Ran Canetti.


Ralf Kuesters, University of Kiel

Title: Sequential Process Calculus and Machine Models for Simulation-based Security

We present a sequential probabilistic polynomial-time process calculus (SPPC) that serves as a basis for comparing related work on simulation-based security. Using representations of communicating Turing machines and I/O automata in SPPC, we are able to compare three related simulation-based security notions: universal composability, black-box simulatability, and process observational equivalence (or strong simulatability). We prove some expected relationships between these notions and observe some surprising differences that would not be apparent without detailed analysis.

Joint work with Anupam Datta, John Mitchell, and Ajith Ramanathan


Yehuda Lindell, IBM Research

Title: Tutorial - Secure Composition of Multiparty Protocols

In modern network settings, secure protocols are run concurrently (or "composed") with other arbitrary protocols. The interaction of different protocols with each other can be exploited by malicious parties to mount successful attacks on protocols that are secure when considered in isolation. In order to ensure security in modern network settings like the Internet, these "new" adversarial threats must be explicitly considered. In this tutorial, we survey what is known regarding the feasibility of obtaining security in this general adversarial setting. We consider a number of different models and present both positive and negative results that provide a rather comprehensive picture of feasibility. We will also discuss research directions and goals for the future.


Nancy Lynch, MIT

Title: Modeling and Analyzing Security Protocols Using I/O Automata

This talk describes formal models and proofs for the composition of two security protocols: a simple shared-key communication protocol and the Diffie-Hellman key distribution protocol. The basis of this work is the I/O automata framework, an interacting state-machine framework originally developed for distributed algorithms. Proofs are based on the standard I/O automata techniques: invariant assertions, forward simulation relations, and compositional reasoning. Arguments about component interactions are clearly separated from arguments about the underlying cryptosystems. Proofs have been checked, by Sheyner and Wing, using an interactive theorem-prover.
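
For readers less familiar with the style, here is a minimal sketch of invariant-based reasoning over a toy state machine in the spirit of I/O automata; the automaton, its actions, and the invariant are our own illustration, not the models from the talk:

    # A toy session in the I/O-automaton style: an output action 'send'
    # places an abstractly encrypted message on the network, an input
    # action 'eavesdrop' lets the adversary record it, and the invariant
    # asserts that the bare plaintext never reaches the adversary.
    class ToySession:
        def __init__(self, plaintext, key):
            self.plaintext, self.key = plaintext, key
            self.network, self.adversary = [], set()

        def send(self):                      # output action of the sender
            self.network.append(("enc", self.key, self.plaintext))

        def eavesdrop(self):                 # input action of the adversary
            self.adversary.update(self.network)

        def invariant(self):                 # secrecy, as an invariant assertion
            return self.plaintext not in self.adversary

    s = ToySession("secret", "K_AB")
    s.send(); s.eavesdrop()
    assert s.invariant()   # invariant-assertion reasoning, in miniature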

The I/O automata framework seems to be capable of modeling many security protocols and properties, in particular, those involving secrecy and authentication. However, certain other protocols and properties, especially those involving indistinguishability, appear to require a probabilistic version of I/O automata. I believe that applying probabilistic I/O automata to such protocols is quite feasible, and is a good direction for future research.


Dominic Mayers, Caltech

Title: Documented Ideal Protocols: A Flexible Notion of Universal Composability for Simple Protocols and no Trusted Setup

The universally composable security definition of Canetti is generalized to use a new kind of ideal protocol, the documented ideal protocol. A documented ideal protocol uses ideal channels and an incorruptible party just as an ordinary ideal protocol does. As in the standard case, the definition says that we can replace the protocol by its associated (documented) ideal protocol and a simulator. As in the standard case, the composability of the higher-level application protocol that previously called the original protocol can now be proven more easily, because it now calls a (documented) ideal protocol. The main difference is that the simulator in our approach can execute special operations that must be specified in the documented ideal protocol and taken into account when we consider the security of the higher-level application protocol.

The goal is to include in the universally composable framework a much larger class of protocols, including simple protocols that do not realise any standard ideal protocol. In particular, this technique allows us to define a documented ideal protocol for a malleable bit commitment. If the application protocol requires non-malleability, the use of the documentation will automatically determine that the malleable bit commitment cannot be used. There is not even a need to know the concept of malleability. Two simple bit commitment protocols are proven universally composable with respect to this new definition. To illustrate the framework, these composable bit commitments are used in some simple application protocols.


Silvio Micali, MIT

Title: Collusion-Free Protocols

Traditionally, secure protocols merely minimize the damage that malicious colluding players can inflict, but do not prevent the collusion itself.

We put forward a new notion of secure protocols, Collusion-Free Protocols, in which malicious parties are prevented from colluding during run time, and show how to implement it under general complexity assumptions.

A key feature of our protocols is that they make steganography PROVABLY impossible.

Joint Work with Matt Lepinski and Abhi Shelat


Daniele Micciancio, UCSD

Title: Tutorial: Towards cryptographically sound formal analysis

In the last few years, there has been a growing interest in bridging the gap between symbolic and computational approaches to the modeling and analysis of security protocols, i.e., coming up with symbolic analysis methods that make it possible to prove computational security properties as typically considered by the cryptographic community. In this tutorial we will present a brief overview of the area, followed by a detailed case study exemplifying how symbolic methods can be profitably used by cryptographers to prove computational security properties. The example will also be used to pinpoint limitations of current cryptographic techniques, illustrate how to cope with these limitations within the symbolic setting, and describe research directions and goals for the future in the areas of cryptography, formal methods, and the connections between the two.


Marinella Petrocchi, IIT-CNR, Italy

Joint work with Maurice ter Beek, ISTI-CNR, Italy and Gabriele Lenzini, University of Twente, the Netherlands

Title: A Framework for Security Analysis with Team Automata

In this talk we will present a framework based on team automata that can be effectively used for formal security analysis. To this end, we will first define an insecure communication scenario for team automata, which is general enough to encompass various communication protocols. Second, we will reformulate the Generalized Non-Deducibility on Compositions schema---originally introduced in the context of process algebras---in terms of team automata. Based on the resulting framework, we will subsequently develop a compositional analysis strategy that can be used for the verification of security properties in a variety of communication protocols.


Olivier Pereira, UCL

Title: Dolev-Yao-type Abstraction of Modular Exponentiation - the Cliques Case Study

Models for cryptographic protocol analysis based on the so-called ``Dolev-Yao abstraction'' usually consider a message algebra built from nonces, user identifiers, and keys, with these atomic elements composed through encryption and concatenation.

Such a message algebra, however, does not allow one to reason about protocols whose security is based on other cryptographic primitives. We consider a message algebra based on a Dolev-Yao abstraction of modular exponentiation: atomic elements are elements of a group G, and these elements can be exponentiated with nonces and keys, or concatenated.
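
As a concrete reminder of what this abstraction models, here is a toy numeric sketch of a Cliques/GDH-style shared key g^{r_1 r_2 ... r_n}; the real protocols use structured upflow and broadcast rounds and proper group parameters, so the numbers below are only illustrative:

    from math import prod

    # Each member holds a secret exponent r_i; the shared group key is
    # g^(r_1 * r_2 * ... * r_n) mod p.  In the final broadcast each member
    # receives g raised to the product of the *other* exponents and
    # completes the key with its own exponent.
    p, g = 2_147_483_647, 5          # toy prime modulus and base
    exponents = [123, 457, 789]      # the members' secret exponents r_i

    group_key = pow(g, prod(exponents), p)
    for r in exponents:
        received = pow(g, prod(exponents) // r, p)   # product of the others
        assert pow(received, r, p) == group_key      # each member derives the key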

Systematic reasoning with this very simple algebra, applied to the case study of the Cliques authenticated group key agreement protocols, allowed us to show that these protocols do not achieve the expected security properties and, more generally, that it is impossible to define a scalable Cliques-type authenticated group key agreement protocol guaranteeing implicit key authentication.

As far as we know, this is the first such generic insecurity result reported in the literature on authentication protocols, and we think it is a good example of the benefits of considering protocols at a high (Dolev-Yao-type) level of abstraction.


Manoj Prabhakaran and Amit Sahai, Princeton University

Title: New Notions of Security: Universal Composability without Trusted Setup

We propose a modification to the framework of Universally Composable (UC) security [Canetti'01], which enables us to give secure protocols for tasks for which no secure protocol is possible in the original UC framework (except with trusted setup).

Our new notion involves comparing the protocol executions with an ideal execution involving ideal functionalities (just as in UC security), but allows the environment and the adversary access to some super-polynomial computational power. We argue for the meaningfulness of the new notion, which in particular subsumes many of the traditional notions of security.

We generalize the Universal Composition theorem of [Canetti'01] to the new setting. Then, under new computational assumptions, we realize secure multiparty computation (for static adversaries) in the new framework, without a common reference string or any other setup assumptions. This is known to be impossible in the original UC framework.


Andre Scedrov, University of Pennsylvania

Title: A probabilistic polynomial-time calculus for the analysis of cryptographic protocols

We describe properties of a process calculus that has been developed for the purpose of analyzing security protocols. The process calculus is a restricted form of CCS, with bounded replication and probabilistic polynomial-time expressions allowed in messages and boolean tests. We develop properties of a form of asymptotic protocol equivalence that allows security to be specified using observational equivalence, a standard relation from programming language theory that involves quantifying over possible environments that might interact with the protocol. We relate process equivalence to cryptographic concepts such as computational indistinguishability. Using a form of probabilistic bisimulation we develop an equational proof system for reasoning about process equivalence. This proof system is sufficiently powerful to derive the semantic security of ElGamal encryption from the Decision Diffie-Hellman (DDH) assumption. The proof system can also derive the converse: if ElGamal is secure, then DDH holds. While these are not new cryptographic results, these example proofs show the power of probabilistic bisimulation and equational reasoning for protocol security.
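
To recall the statements being related (standard definitions, in our own notation): an ElGamal encryption of a group element m under public key y = g^x is

    E_y(m) = (g^r,\ y^r \cdot m), \qquad r \text{ uniformly random},

and the DDH assumption asserts the computational indistinguishability

    (g^a,\ g^b,\ g^{ab}) \approx_c (g^a,\ g^b,\ g^c) \qquad \text{for independent uniform } a, b, c.

Under DDH, the masking factor y^r = g^{xr} in a ciphertext can be replaced by an independent random group element g^c, after which the second component is uniform and independent of m, so encryptions of any two messages are indistinguishable; conversely, a DDH distinguisher can be turned into an adversary against ElGamal encryptions, which is the other direction of the equivalence the equational proof system derives.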

The work has been carried out in collaboration with J. Mitchell, A. Ramanathan, and V. Teague.


Vitaly Shmatikov, SRI

Title: Tutorial: Constraint-based methods: Adding computational properties to symbolic models

I will introduce the symbolic constraint solving method for cryptographic protocols, and describe recent decidability results for formal protocol analysis in the presence of XOR, Abelian group operator, and modular exponentiation from an arbitrary base. For a finite number of sessions, these results enable fully automated, sound and complete formal analysis of protocols that employ primitives such as Diffie-Hellman exponentiation and modular multiplication. The constraint solving method takes into account relevant algebraic properties of cryptographic primitives, without requiring an a priori bound on the size of terms created by the attacker.
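
As a tiny example (our own, not from the tutorial) of why such algebraic properties must be modeled: in a free term algebra an attacker who sees k XOR n together with n can derive nothing further, but the self-cancellation law x XOR x = 0 makes k immediately derivable:

    # Self-cancellation of XOR: observing (k XOR n) together with n reveals k,
    # a derivation that is invisible if XOR is treated as a free,
    # property-less symbol in the term algebra.
    k, n = 0b10110110, 0b01101001
    observed = (k ^ n, n)
    recovered = observed[0] ^ observed[1]     # (k ^ n) ^ n == k
    assert recovered == k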


Sabrina Tarento, INRIA

Title: A Machine-Checked Formalization of the Generic Model and the Random Oracle Model

Most approaches to the formal analysis of cryptographic protocols make the perfect cryptography assumption, i.e., the hypothesis that there is no way to obtain knowledge about the plaintext pertaining to a ciphertext without knowing the key. Ideally, one would prefer to rely on a weaker hypothesis about the computational cost of gaining information about the plaintext pertaining to a ciphertext without knowing the key. Such a view is permitted by the Generic Model and the Random Oracle Model, which provide non-standard computational models in which one may reason about the computational cost of breaking a cryptographic scheme. Using the proof assistant Coq, we provide a machine-checked account of the Generic Model and the Random Oracle Model.


Angelo Troina, Alessandro Aldini, and Roberto Gorrieri, University of Pisa

Title: Message Equivalence and Imperfect Cryptography in a Formal Model

We present a formal view of cryptography that overcomes the usual assumptions of formal models for reasoning about the security of computer systems, i.e., perfect cryptography and the Dolev-Yao adversary model. Polynomial-time adversaries, based on a computational view of cryptography, have additional capabilities that the classical Dolev-Yao adversary model does not include. In our framework, equivalence among formal cryptographic expressions is parameterized by a computational adversary that may exploit weaknesses of the cryptosystem to cryptanalyze ciphertext with a certain probability of success. We show that if the cryptosystem is robust enough, then the two adversary models turn out to be equivalent. As an application of our approach, we show how to establish a secrecy property against the computational adversary.


Thomas Wilke, Kiel University

Title: Automata-based analysis of recursive cryptographic protocols

Cryptographic protocols can be divided into (1) protocols where the protocol steps are simple from a computational point of view and can thus be modeled by simple means, for instance, single rewrite rules---we call these protocols non-looping---and (2) protocols, such as group protocols, where the protocol steps are complex and typically involve an iterative or recursive computation---we call them recursive. While much is known about the decidability of security for non-looping protocols, little is known for recursive protocols. We show decidability of security (w.r.t. the standard Dolev-Yao intruder) for a core class of recursive protocols, together with a complementary undecidability result. The key ingredient of our protocol model is a class of specifically designed tree transducers which work over infinite signatures and can generate new constants (which allows us to mimic key generation). The decidability result is based on an automata-theoretic construction which involves a new notion of regularity, designed to work well with the infinite signatures we use.

