Serge Vaudenay


FFT-Hash-II is not yet Collision-free
In Advances in Cryptology CRYPTO'92, Santa Barbara, California, U.S.A., Lecture Notes in Computer Science No. 740, pp. 587-593, Springer-Verlag, 1993.
In this paper, we show that the FFT-Hash function proposed by Schnorr is not collision-free. Finding a collision requires about 2^24 computations of the basic function of FFT. This can be done in a few hours on a SUN4 workstation. In fact, the function is at most as strong as a one-way hash function that returns a 48-bit value. Thus, we can invert the proposed FFT hash function with 2^48 basic computations. Some simple improvements of the FFT hash function are also proposed in an attempt to get rid of the weaknesses of FFT.
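As a sanity check on these figures (a generic birthday-bound estimate, not an argument taken from the paper): a hash function whose effective output is only 48 bits long yields collisions after about sqrt(2^48) = 2^24 evaluations by the birthday paradox, while finding a preimage by brute force costs about 2^48 evaluations, which matches the two complexities quoted above.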


One-Time Identification with Low Memory
In Proceedings of EUROCODE'92, Udine, Italy, CISM Courses and Lectures No. 339, pp. 217-228, Springer-Verlag, 1993.
In this paper, a practical cryptosystem based on an authentication tree is described. It is an extension of Ralph Merkle's results. The security of the system relies on assumptions about a one-way function. This cryptosystem allows one to carry out interactive proofs of knowledge using keys only once. Thus, it can be used to prove one's identity or to sign a message. Moreover, it can be used to build an electronic cash system. A very efficient implementation is possible on a low-cost smart card. This cryptosystem is thus an alternative to zero-knowledge algorithms.
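To make the underlying idea concrete, here is a minimal sketch of a Merkle authentication tree, in which many one-time keys are authenticated against a single public root; the hash function, tree size and function names below are illustrative assumptions, not the construction actually used in the paper.

    import hashlib

    def H(data):
        # Illustrative choice of hash function (any one-way hash would do here).
        return hashlib.sha256(data).digest()

    def build_tree(leaves):
        # levels[0] holds the hashed leaves; levels[-1][0] is the public root.
        levels = [[H(leaf) for leaf in leaves]]
        while len(levels[-1]) > 1:
            prev = levels[-1]
            levels.append([H(prev[i] + prev[i + 1]) for i in range(0, len(prev), 2)])
        return levels

    def auth_path(levels, index):
        # Sibling hashes needed to recompute the root from leaf number `index`.
        path = []
        for level in levels[:-1]:
            path.append(level[index ^ 1])
            index //= 2
        return path

    def verify(leaf, index, path, root):
        node = H(leaf)
        for sibling in path:
            node = H(node + sibling) if index % 2 == 0 else H(sibling + node)
            index //= 2
        return node == root

    # Authenticate one of 8 one-time keys against the single published root.
    keys = [b"one-time key %d" % i for i in range(8)]
    levels = build_tree(keys)
    root = levels[-1][0]
    assert verify(keys[5], 5, auth_path(levels, 5), root)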


Attacks on the Birational Permutation Signature Schemes
Joint work with Don Coppersmith and Jacques Stern
In Advances in Cryptology CRYPTO'93, Santa Barbara, California, U.S.A., Lecture Notes in Computer Science No. 773, pp. 435-443, Springer-Verlag, 1994.
(A journal version of this paper is also available.)
Shamir presents a family of cryptographic signature schemes based on birational permutations of the integers modulo a large integer N of unknown factorization. These schemes are attractive because of the low computational requirements, both for signature generation and signature verification. However, the two schemes presented in Shamir's paper are weak. We show here how to break the first scheme, by first reducing it algebraically to the earlier Ong-Schnorr-Shamir signature scheme, and then applying the Pollard solution to that scheme. We then show some attacks on the second scheme. These attacks give ideas which can be applied to schemes in this general family.
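For context (a standard description of the scheme mentioned above, not taken from this paper): an Ong-Schnorr-Shamir signature on a message m is a pair (x, y) satisfying a single quadratic congruence of the form x^2 + k y^2 = m (mod N) for a public k, and the Pollard-Schnorr algorithm finds such pairs without knowing the factorization of N; this is why an algebraic reduction to that scheme is fatal for the first birational permutation scheme.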


Links between differential and linear cryptanalysis
Joint work with Florent Chabaud
In Advances in Cryptology EUROCRYPT'94, Perugia, Italy, Lecture Notes in Computer Science No. 950, pp. 356-365, Springer-Verlag, 1995.
Linear cryptanalysis, introduced last year by Matsui, will most certainly open up the way to new attack methods which may be made more efficient when compared or combined with differential cryptanalysis. This paper exhibits new relations between linear and differential cryptanalysis and presents new classes of functions which are optimally resistant to these attacks. In particular, we prove that linear-resistant functions, which generally present bent properties, are differential-resistant as well and thus present perfect nonlinear properties.
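As an informal reminder of the two notions being linked (stated in generic terms, not in the paper's exact formalism): resistance to differential cryptanalysis is measured by how small the probabilities Pr[f(x XOR a) = f(x) XOR b], over random x and the worst choice of a != 0 and b, can be made, and perfect nonlinear functions achieve the minimum; resistance to linear cryptanalysis is measured by how small the biases |Pr[a.x = b.f(x)] - 1/2| can be made over all masks a and b != 0, and bent components achieve the minimum. The result above says that optimality in the linear sense implies optimality in the differential sense.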


Parallel FFT-hashing
Joint work with Claus Schnorr
In Fast Software Encryption, Cambridge Security Workshop, United Kingdom, Lecture Notes in Computer Science No. 809, pp. 149-156, Springer-Verlag, 1994.
We propose two families of scalable hash functions for collision-resistant hashing that are highly parallel and based on the generalized fast Fourier transform (FFT). FFT-hashing is based on multipermutations, a basic cryptographic primitive for the perfect generation of diffusion and confusion which generalizes the boxes of the classic FFT. The slower FFT-hash functions iterate a compression function. For the faster FFT-hash functions, all rounds are alike, with the same number of message words entering each round.


Black box cryptanalysis of hash networks based on multipermutations
Joint work with Claus Schnorr
In Advances in Cryptology EUROCRYPT'94, Perugia, Italy, Lecture Notes in Computer Science No. 950, pp. 47-57, Springer-Verlag, 1995.
Black box cryptanalysis applies to hash algorithms consisting of many small boxes, connected by a known graph structure, so that the boxes can be evaluated forwards and backwards by given oracles. We study attacks that work for any choice of the black boxes, i.e., we scrutinize the given graph structure. For example, we analyze the graph of the fast Fourier transform (FFT). We present optimal black box inversions of FFT-compression functions and black box constructions of collisions. This determines the minimal depth of FFT-compression networks for collision-resistant hashing. We propose the concept of multipermutation, which is a pair of orthogonal Latin squares, as a new cryptographic primitive that generalizes the boxes of the FFT. Our examples of multipermutations are based on the operations circular rotation, bitwise xor, addition and multiplication.
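The following toy check makes the "pair of orthogonal Latin squares" definition concrete for a 2-input, 2-output box over Z_n; the particular box (x, y) -> (x + y, x + 2y) mod n is an illustrative example of mine, not one of the paper's constructions.

    def is_multipermutation(f, n):
        # Each output coordinate must be a Latin square: a permutation of Z_n
        # in each input when the other input is fixed...
        for x in range(n):
            if len({f(x, y)[0] for y in range(n)}) != n: return False
            if len({f(x, y)[1] for y in range(n)}) != n: return False
        for y in range(n):
            if len({f(x, y)[0] for x in range(n)}) != n: return False
            if len({f(x, y)[1] for x in range(n)}) != n: return False
        # ...and the two Latin squares must be orthogonal: the output pair
        # determines the input pair.
        return len({f(x, y) for x in range(n) for y in range(n)}) == n * n

    n = 5                                              # any odd modulus works here
    box = lambda x, y: ((x + y) % n, (x + 2 * y) % n)
    print(is_multipermutation(box, n))                 # True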


Complexity trade-offs with the Digital Signature Standard
Joint work with David M'Raihi, David Naccache and Dan Raphaeli
In Advances in Cryptology EUROCRYPT'94, Perugia, Italy, Lecture Notes in Computer Science No. 950, pp. 77-85, Springer-Verlag, 1995.
The Digital Signature Algorithm (DSA) was proposed in 1991 by the US National Institute of Standards and Technology to provide an appropriate core for applications requiring digital signatures. Undoubtedly, many applications will include this standard in the future, and the foreseen domination of DSA as a legal certification tool is thus sufficiently important to focus research endeavours on the suitability of this scheme for various situations. In this paper, we present six new DSA-based protocols for: performing a quick batch verification of n signatures; avoiding the cumbersome calculation of 1/k mod q by the signer; compressing sets of DSA transactions into shorter archive signatures; generating signatures from pre-calculated "Use & Throw" 224-bit signature-coupons; self-certifying the moduli; and bit-patterning q directly on p. All our schemes combine in a natural way full DSA compatibility and flexible trade-offs between computational complexity, transmission overheads and key sizes.
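For background on the coupon and 1/k trade-offs mentioned above, here is a sketch of textbook DSA signing over toy parameters; it only illustrates that k, r = (g^k mod p) mod q and 1/k mod q are message-independent and can therefore be precomputed, and it is not one of the six protocols of the paper.

    import hashlib, secrets

    # Toy parameters: q divides p - 1 and g = 2^((p-1)/q) mod p has order q.
    p, q = 283, 47
    g = pow(2, (p - 1) // q, p)                    # = 64

    x = secrets.randbelow(q - 1) + 1               # secret key
    y = pow(g, x, p)                               # public key

    def make_coupon():
        # Message-independent part of a signature ("Use & Throw": one use only).
        k = secrets.randbelow(q - 1) + 1
        return pow(g, k, p) % q, pow(k, -1, q)     # (r, 1/k mod q)

    def sign(msg, coupon):
        r, kinv = coupon
        h = int.from_bytes(hashlib.sha1(msg).digest(), "big") % q
        return r, (kinv * (h + x * r)) % q

    def verify(msg, sig):
        r, s = sig
        h = int.from_bytes(hashlib.sha1(msg).digest(), "big") % q
        w = pow(s, -1, q)
        v = (pow(g, h * w % q, p) * pow(y, r * w % q, p) % p) % q
        return v == r

    # Real DSA rejects the rare degenerate r = 0 or s = 0 and draws a fresh k.
    sig = sign(b"hello", make_coupon())
    while 0 in sig:
        sig = sign(b"hello", make_coupon())
    assert verify(b"hello", sig)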


On the need for multipermutations: Cryptanalysis of MD4 and SAFER
In Fast Software Encryption, Leuven, Belgium, Lecture Notes in Computer Science No. 1008, pp. 286-297, Springer-Verlag, 1995.
Cryptographic primitives are usually based on a network of gates. In [SV94], it is claimed that all such gates should be multipermutations. In this paper, we investigate a few combinatorial properties of multipermutations. We argue that gates which fail to be multipermutations can open the way to unsuspected attacks. We illustrate this statement with two examples. First, we show how to construct collisions for MD4 restricted to its first two rounds. This allows one to forge digests close to each other using the full compression function of MD4. Second, we show that some generalizations of SAFER are subject to attacks faster than exhaustive search in 6.1% of the cases. This attack can be implemented if we decrease the number of rounds from 6 to 4.
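To make the "gates which fail to be multipermutations" point concrete, here is a toy example of my own (not one of the gates attacked in the paper): the box (x, y) -> (x + y mod 2^n, x XOR y) is a permutation in each input for each output taken separately, yet it is not a multipermutation, because distinct input pairs can collide on both outputs at once.

    # A 2-to-2 box that is NOT a multipermutation (toy example): both output
    # coordinates are permutations in each input, but the output pair does not
    # determine the input pair, e.g. box(0, 2) == box(2, 0) == (2, 2).
    n_bits = 4
    mask = (1 << n_bits) - 1
    box = lambda x, y: ((x + y) & mask, x ^ y)

    outputs = {}
    for x in range(1 << n_bits):
        for y in range(1 << n_bits):
            outputs.setdefault(box(x, y), []).append((x, y))

    print(any(len(inputs) > 1 for inputs in outputs.values()))   # True: collisions exist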


The Security of Cryptographic Primitives
Thesis report. (Available in French only). Technical Report LIENS-95-10 of the Laboratoire d'Informatique de l'Ecole Normale Supérieure, 1995.
In the early fifties, Claude Shannon initiated the theory of cryptographic primitives and defined the notions of diffusion and confusion. However, this theory did not develop much until recently, when differential cryptanalysis and linear cryptanalysis brought significant advances in the analysis of these primitives. Security criteria for confusion, essentially nonlinearity criteria, have been proposed.

In this thesis, we show how to define a notion of complexity on the graph structure of the primitives and how to study it. This yields security criteria for the computational network. We propose new criteria for diffusion. Finally, we unify the two types of cryptanalysis, getting rid of their linear aspects through a statistical approach.


On the Weak Keys of Blowfish
In Fast Software Encryption, Cambridge, United Kingdom, Lecture Notes in Computer Science No. 1039, pp. 27-32, Springer-Verlag, 1996.
Blowfish is a sixteen-round Feistel cipher in which the F function is part of the private key. In this paper, we show that disclosure of F allows one to perform a differential cryptanalysis which recovers all the rest of the key with 2^48 chosen plaintexts against a version reduced to eight rounds. Moreover, for some weak F functions, this attack only needs 2^23 chosen plaintexts against eight rounds, and 3x2^51 chosen plaintexts against sixteen rounds. When the F function is safely kept private, one can still detect whether it is weak or not with a differential attack using 2^22 plaintexts against eight rounds.
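For intuition about why a "weak" F matters (a generic Feistel observation, not Blowfish's exact round, which also XORs subkeys from the P-array): if F collides on two values a != b, the input difference (0, a XOR b) passes through a round with probability 1, which is exactly the kind of free differential a chosen-plaintext attack can build on.

    # Generic Feistel round: (L, R) -> (R, L xor F(R)).  If F(a) == F(b) with
    # a != b, then an input difference (0, a xor b) becomes the output
    # difference (a xor b, 0) with probability 1.
    def feistel_round(L, R, F):
        return R, L ^ F(R)

    table = {0: 5, 1: 9, 2: 3, 3: 7, 4: 2, 5: 0, 6: 9, 7: 4}   # toy F with F(1) == F(6)
    F = lambda v: table[v & 7]

    a, b = 1, 6
    L = 3
    out1 = feistel_round(L, a, F)
    out2 = feistel_round(L, b, F)                # same left half, right halves differ by a ^ b
    print(out1[0] ^ out2[0], out1[1] ^ out2[1])  # prints "7 0", i.e. (a ^ b, 0)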


Black Box Cryptanalysis of Cryptographic Primitives
Joint work with Claus Schnorr.
Submitted to the Journal of Cryptology.
We analyse the security of a cryptographic primitive on the basis of the geometry of its computation graph. We assume that the computation graph of the primitive is given, whereas the boxes sitting on the vertices of this graph are unknown and random, i.e. they are black boxes. We formalize and study a family of attacks which generalize exhaustive search and the birthday paradox. We establish complexity lower bounds for this family of attacks and apply them to compression functions based on the FFT network.


An Experiment on DES - Statistical Cryptanalysis
In the Proceedings of the 3rd ACM Conference on Computer and Communications Security, New Delhi, India, pp. 139-147, ACM Press, 1996. (The proceedings version is (c) Copyright by the ACM 1995.)
Linear cryptanalysis and differential cryptanalysis are the most important methods of attack against block ciphers. Their efficiency has been demonstrated against several ciphers, including the Data Encryption Standard. We prove that both of them can be considered, improved and combined within a more general statistical framework. We also show that the very same results as those obtained in the case of DES can be found without any linear analysis, and we slightly improve them into an attack with theoretical complexity 2^42.9. We can apply another statistical attack - the chi^2-cryptanalysis - to the same characteristics without a definite idea of what happens in the encryption process. It appears to be roughly as efficient as both differential and linear cryptanalysis. We propose a new heuristic method to find good characteristics. It has found an attack against DES equivalent to Matsui's by following a distinct path.
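The chi^2 statistic referred to above works on counters of some value derived from plaintext/ciphertext pairs; the sketch below shows only the generic statistic (the choice of the derived value, i.e. of the characteristic, is where the actual cryptanalysis lies, and nothing here is DES-specific).

    # Generic chi-squared score over counters: large when the observed
    # distribution is far from uniform, small when it is close to uniform.
    def chi2(counts):
        n = sum(counts)
        expected = n / len(counts)
        return sum((c - expected) ** 2 / expected for c in counts)

    # An attacker ranks key guesses by this score: the right guess should make
    # the derived value biased, while wrong guesses leave it roughly uniform.
    print(chi2([30, 10, 10, 30]))   # 20.0 -> clearly biased
    print(chi2([21, 19, 20, 20]))   # 0.1  -> close to uniform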


Authenticated Multi-Party Key Agreement
Joint work with Mike Just
In Advances in Cryptology ASIACRYPT'96, Kyongju, Korea, Lecture Notes in Computer Science No. 1163, pp. 36-49, Springer-Verlag, 1996.
We examine multi-party key agreement protocols that provide (i) key authentication, (ii) key confirmation and (iii) forward secrecy. Several minor (repairable) attacks are presented against previous two-party key agreement schemes and a model for key agreement is presented that provably provides the properties listed above. A generalization of the Burmester-Desmedt model (EUROCRYPT'94) for multi-party key agreement is given, allowing a transformation of any two-party key agreement scheme into a multi-party scheme. Multi-party schemes (based on the general model and two specific two-party schemes) are presented that reduce the number of rounds required for key computation compared to the specific Burmester-Desmedt scheme. It is also shown how the specific Burmester-Desmedt scheme fails to provide key authentication.
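For reference, here is a sketch of the basic (unauthenticated) Burmester-Desmedt group key agreement that serves as the starting point above, over toy parameters; the parameter sizes and code layout are my own simplifications, and this is not the paper's authenticated construction.

    import secrets

    p, g = 283, 3          # toy prime and base (real use needs large, validated parameters)
    n = 4                  # participants arranged in a cycle

    r = [secrets.randbelow(p - 2) + 1 for _ in range(n)]   # each party's secret exponent
    z = [pow(g, ri, p) for ri in r]                        # round 1: broadcast g^r_i

    def X(i):
        # Round 2: party i broadcasts (z_{i+1} / z_{i-1})^{r_i} mod p.
        ratio = z[(i + 1) % n] * pow(z[(i - 1) % n], -1, p) % p
        return pow(ratio, r[i], p)

    Xs = [X(i) for i in range(n)]

    def key(i):
        # Party i combines the broadcasts into the common key
        # g^(r_1 r_2 + r_2 r_3 + ... + r_n r_1) mod p.
        k = pow(z[(i - 1) % n], n * r[i], p)
        for j in range(n - 1):
            k = k * pow(Xs[(i + j) % n], n - 1 - j, p) % p
        return k

    assert len({key(i) for i in range(n)}) == 1   # every party computes the same key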


Cryptographie : Théorie et Pratique by Douglas Stinson
(c) Copyright by International Thomson Publishing France, 1996
French translation of:
Cryptography: Theory and Practice
(c) Copyright by CRC Press, Inc., 1995


Hidden Collisions on DSS
In Advances in Cryptology CRYPTO'96, Santa Barbara, California, U.S.A., Lecture Notes in Computer Science No. 1109, pp. 83-88, Springer-Verlag, 1996.
We explain how to forge public parameters for the Digital Signature Standard, together with two known messages which always produce the same set of valid signatures (what we call a collision). This attack is thwarted by using the generation algorithm suggested in the specifications of the Standard, so it proves that one always needs to check for proper parameter generation. We also present a similar attack when this generation algorithm is used, with complexity 2^74, which is better than the birthday attack that seeks collisions on the underlying hash function.
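For intuition (a generic observation about DSA, not necessarily the exact construction used in the paper): a DSA signature s = (1/k)(H(m) + x r) mod q, and its verification, depend on the message only through H(m) mod q, so if the parameters are chosen such that H(m1) = H(m2) (mod q) for two known messages m1 and m2, every signature valid for one of them is valid for the other.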


Minding your p's and q's
Joint work with Ross Anderson
In Advances in Cryptology ASIACRYPT'96, Kyongju, Korea, Lecture Notes in Computer Science No. 1163, pp. 26-35, Springer-Verlag, 1996.
Over the last year or two, a large number of attacks have been found by the authors and others on protocols based on the discrete logarithm problem, such as ElGamal signatures and Diffie-Hellman key exchange. These attacks depend on causing variables to assume values whose discrete logarithms can be calculated, whether by forcing a protocol exchange into a smooth subgroup or by choosing degenerate values directly. We survey these attacks and discuss how to build systems that are robust against them. In the process, we elucidate a number of the design decisions behind the US Digital Signature Standard.
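A toy illustration of the "smooth subgroup" issue mentioned above (numbers of my own, not an attack from the paper): when the group order has a small factor t, a dishonest party can substitute an element of order t for its Diffie-Hellman value and learn the victim's secret exponent modulo t by exhausting t possibilities.

    import secrets

    p = 283                        # toy prime; p - 1 = 2 * 3 * 47 has the small factor 3
    t = 3
    h = pow(3, (p - 1) // t, p)    # = 238, an element of order exactly 3 modulo p

    # The attacker sends h instead of an honest g^a.  The victim's reply h^b mod p
    # depends only on b mod t, so b mod t leaks by trying all t possibilities.
    b = secrets.randbelow(p - 2) + 1               # victim's secret exponent
    reply = pow(h, b, p)
    leaked = next(e for e in range(t) if pow(h, e, p) == reply)
    assert leaked == b % t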


On Provable Security for Digital Signature Algorithms
Joint work with David Pointcheval
In this paper, we consider provable security for ElGamal-like digital signature schemes. We point out that the right security criterion on the underlying hash function is pseudorandomness. We extend Pointcheval and Stern's results on the use of the random oracle model to prove the security of two variants of the US Digital Signature Algorithm against adaptive attacks which issue an existential forgery. We prove that a very practical use of the random oracle model is possible with tamper-resistant modules.

