International Association for Cryptologic Research

CryptoDB

Alon Rosen

Publications

Year
Venue
Title
2024
CRYPTO
CDS Composition of Multi-Round Protocols
We revisit the Cramer, Damgård, Schoenmakers (CDS) approach for composing sigma protocols, and adapt it to a setting in which the underlying protocols have multiple rounds of interaction. The goal of CDS composition is to prove compound NP-relations by combining multiple "atomic" proof systems. Its key feature is that it interacts with the atomic proofs in a generic fashion, enabling simpler and more efficient implementation. Recent developments in multi-round protocols call for the adaptation of CDS composition beyond its original scope, which not only was restricted to three-move protocols but in fact fails in the multi-round case, as well as in the composition of so-called $k$-special sound proofs. We propose a new method for multi-round composition in the plain model, in a soundness-preserving way and with an "offline" zero-knowledge simulation property. The need for handling arbitrary monotone access structures in $\mathsf{mNC}^1$, the class of Boolean function families represented by polynomial-size formulas over some fixed complete basis, leads us to identify a complexity-theoretic problem of independent interest. Prior to our work, multi-round composition was either restricted to the random oracle model, or worked only for argument systems, and moreover required heavy "online" zero-knowledge simulation.
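To make the notion of a monotone access structure concrete, the following sketch evaluates a small monotone formula (AND/OR gates over atomic statements) and decides which subsets of witnesses are qualified. It illustrates only the access-structure notion mentioned above, not the composition technique itself; the formula encoding and example predicate are hypothetical.

```python
# Minimal illustration (not from the paper): a monotone access structure given
# as a formula with AND/OR gates over atomic statements 0..n-1. A set of known
# witnesses is "qualified" if it satisfies the formula.
def qualifies(formula, known):
    """formula is ('var', i), ('and', f, g) or ('or', f, g); known is a set of indices."""
    tag = formula[0]
    if tag == 'var':
        return formula[1] in known
    left, right = qualifies(formula[1], known), qualifies(formula[2], known)
    return (left and right) if tag == 'and' else (left or right)

if __name__ == "__main__":
    # "(statement 0 AND statement 1) OR statement 2"
    P = ('or', ('and', ('var', 0), ('var', 1)), ('var', 2))
    print(qualifies(P, {0, 1}))  # True: a qualified witness set
    print(qualifies(P, {0}))     # False: not qualified
```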
2024
TCC
Low-degree Security of the Planted Random Subgraph Problem
The planted random subgraph detection conjecture of Abram et al. (TCC 2023) asserts the pseudorandomness of a pair of graphs $(H, G)$, where $G$ is an Erdős-Rényi random graph on $n$ vertices, and $H$ is a random induced subgraph of $G$ on $k$ vertices. Assuming the hardness of distinguishing these two distributions (with two leaked vertices), Abram et al. construct communication-efficient, computationally secure (1) 2-party private simultaneous messages (PSM) and (2) secret sharing for forbidden graph structures. We prove low-degree hardness of detecting planted random subgraphs all the way up to $k\leq n^{1 - \Omega(1)}$. This improves over Abram et al.'s analysis for $k \leq n^{1/2 - \Omega(1)}$. The hardness extends to $r$-uniform hypergraphs for constant $r$. Our analysis is tight in the distinguisher's degree, its advantage, and in the number of leaked vertices. Extending the constructions of Abram et al., we apply the conjecture towards (1) communication-optimal multiparty PSM protocols that are secure even against multiple random evaluations and (2) bit secret sharing with share size $(1 + \epsilon)\log n$ for any $\epsilon > 0$ in which arbitrary coalitions of up to $r$ parties can reconstruct and secrecy holds against all unqualified subsets of up to $\ell = o(\epsilon \log n)^{1/(r-1)}$ parties.
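As a hedged illustration of the two distributions being contrasted, the sketch below samples the planted pair (H is the induced subgraph of G on a random k-subset, relabeled) and a null pair (H and G independent). The edge probability 1/2 and the independent-null formulation are assumptions made for illustration, not details taken from the paper.

```python
# Sketch of the planted vs. null distributions in the planted random subgraph
# problem. Assumptions (not from the paper): edge probability 1/2, and the
# null distribution samples H and G independently.
import random

def er_graph(n, p=0.5):
    """Erdos-Renyi graph on n vertices, represented as a set of edges (i, j), i < j."""
    return {(i, j) for i in range(n) for j in range(i + 1, n) if random.random() < p}

def planted_pair(n, k):
    """Planted distribution: H is the induced subgraph of G on a random k-subset,
    relabeled to vertices {0, ..., k-1}."""
    G = er_graph(n)
    S = sorted(random.sample(range(n), k))
    rank = {v: idx for idx, v in enumerate(S)}
    H = {(rank[i], rank[j]) for (i, j) in G if i in rank and j in rank}
    return H, G

def null_pair(n, k):
    """Null distribution: H and G are independent Erdos-Renyi graphs."""
    return er_graph(k), er_graph(n)

if __name__ == "__main__":
    H, G = planted_pair(n=20, k=5)
    print(len(H), "edges in H,", len(G), "edges in G")
```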
2023
TCC
Public-Key Encryption, Local Pseudorandom Generators, and the Low-Degree Method
The low-degree method postulates that no efficient algorithm outperforms low-degree polynomials in certain hypothesis-testing tasks. It has been used to understand computational indistinguishability in high-dimensional statistics. We explore the use of the low-degree method in the context of cryptography. To this end, we apply it in the design and analysis of a new public-key encryption scheme whose security is based on Goldreich's pseudorandom generator. The scheme is a combination of two proposals of Applebaum, Barak, and Wigderson, and inherits desirable features from both.
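Since the scheme's security is tied to Goldreich's generator, a minimal sketch of such a local pseudorandom generator may help: every output bit applies a fixed small predicate to a few seed bits selected by a public random hypergraph. The concrete predicate and parameters below are illustrative assumptions, not necessarily the ones used in the paper.

```python
# Minimal sketch of a Goldreich-style local PRG. Each output bit applies a
# fixed predicate to a constant number of seed bits chosen by a public random
# hypergraph. The predicate here (XOR of three bits plus an AND of two) is a
# common illustrative choice, not a parameter taken from the paper.
import random

def sample_hypergraph(n, m, locality=5, seed=0):
    rng = random.Random(seed)
    return [rng.sample(range(n), locality) for _ in range(m)]

def predicate(bits):
    # P(x1,...,x5) = x1 XOR x2 XOR x3 XOR (x4 AND x5)
    x1, x2, x3, x4, x5 = bits
    return x1 ^ x2 ^ x3 ^ (x4 & x5)

def goldreich_prg(seed_bits, hypergraph):
    return [predicate([seed_bits[i] for i in idxs]) for idxs in hypergraph]

if __name__ == "__main__":
    n, m = 64, 128                       # seed length and output length (illustrative)
    graph = sample_hypergraph(n, m)      # public structure
    seed = [random.randint(0, 1) for _ in range(n)]
    print("".join(map(str, goldreich_prg(seed, graph))))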
2022
TCC
Public-Key Encryption from Homogeneous CLWE
The homogeneous continuous LWE (hCLWE) problem is to distinguish samples of a specific high-dimensional Gaussian mixture from standard normal samples. It was shown to be at least as hard as Learning with Errors, but no reduction in the other direction is currently known. We present four new public-key encryption schemes based on the hardness of hCLWE, with varying tradeoffs between decryption and security errors, and different discretization techniques. Our schemes yield a polynomial-time algorithm for solving hCLWE using a Statistical Zero-Knowledge oracle.
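For intuition about the "specific high-dimensional Gaussian mixture", here is a rough sampler for the hCLWE ("Gaussian pancakes") distribution: standard normal in the directions orthogonal to a hidden unit vector w, while the component along w concentrates in narrow layers spaced roughly 1/gamma apart. The sampler is a simplification for intuition only (the layer weights and truncation are ad hoc assumptions), not the exact distribution analyzed in the paper.

```python
# Rough illustration of the homogeneous CLWE distribution: standard normal
# orthogonally to a hidden direction w, and concentrated in narrow layers
# along w. Simplified for intuition; not the exact hCLWE distribution.
import numpy as np

def hclwe_samples(m, n, gamma=3.0, beta=0.01, rng=None):
    rng = np.random.default_rng(rng)
    w = rng.standard_normal(n)
    w /= np.linalg.norm(w)                       # hidden direction
    # Component along w: pick an integer layer with (truncated) Gaussian weight,
    # then add narrow Gaussian noise of width beta/gamma.
    layers = np.arange(-10, 11)
    probs = np.exp(-0.5 * (layers / gamma) ** 2)
    probs /= probs.sum()
    k = rng.choice(layers, size=m, p=probs)
    along = (k + beta * rng.standard_normal(m)) / gamma
    # Components orthogonal to w: standard normal.
    g = rng.standard_normal((m, n))
    ortho = g - np.outer(g @ w, w)
    return ortho + np.outer(along, w), w

if __name__ == "__main__":
    samples, w = hclwe_samples(m=1000, n=50)
    print("projections onto w (should cluster near multiples of 1/gamma):")
    print(np.round(samples @ w, 2)[:10])
```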
2021
CRYPTO
Secure Computation from One-Way Noisy Communication, or: Anti-Correlation via Anti-Concentration
Can a sender encode a pair of messages $(m_0,m_1)$ jointly, and send their encoding over (say) a binary erasure channel, so that the receiver can decode exactly one of the two messages and the sender does not know which one? Garg et al. (Crypto 2015) showed that this is information-theoretically impossible. We show how to circumvent this impossibility by assuming that the receiver is computationally bounded, settling for an inverse-polynomial security error (which is provably necessary), and relying on ideal obfuscation. Our solution creates a "computational anti-correlation" between the events of receiving $m_0$ and receiving $m_1$ by exploiting the anti-concentration of the binomial distribution. The ideal obfuscation primitive in our construction can either be directly realized using (stateless) tamper-proof hardware, yielding an unconditional result, or heuristically instantiated using existing indistinguishability obfuscation schemes. We put forward a new notion of obfuscation that suffices to securely instantiate our construction. As a corollary, we get similar feasibility results for general secure computation of sender-receiver functionalities by leveraging the completeness of the above "random oblivious transfer" functionality.
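The anti-concentration fact the abstract leans on can be checked by simulation: over a binary erasure channel with erasure probability 1/2, the number of surviving symbols out of 2n is Binomial(2n, 1/2) and hits any fixed value, such as exactly n, with probability only Theta(1/sqrt(n)). The snippet below only illustrates this probabilistic fact; it is not the paper's construction, and the parameters are arbitrary.

```python
# Simulation of binomial anti-concentration: Pr[Binomial(2n, 1/2) = n] is
# approximately 1/sqrt(pi * n). This illustrates the fact the abstract cites,
# not the actual encoding scheme.
import math
import random

def surviving(two_n, erasure_prob=0.5):
    """Number of symbols that survive the erasure channel."""
    return sum(1 for _ in range(two_n) if random.random() >= erasure_prob)

def estimate_point_mass(n, trials=5000):
    hits = sum(1 for _ in range(trials) if surviving(2 * n) == n)
    return hits / trials

if __name__ == "__main__":
    for n in (25, 100, 400):
        est = estimate_point_mass(n)
        print(f"n={n:4d}  Pr[exactly n survive] ~ {est:.4f}"
              f"  (1/sqrt(pi*n) = {1 / math.sqrt(math.pi * n):.4f})")
```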
2021
CRYPTO
Time- and Space-Efficient Arguments from Groups of Unknown Order
We construct public-coin time- and space-efficient zero-knowledge arguments for NP. For every non-deterministic RAM computation running in time T and space S, the prover runs in time T * polylog(T) and space S * polylog(T), and the verifier runs in time n * polylog(T), where n is the input length. Our protocol relies on hidden-order groups, which can be instantiated with a trusted setup from the hardness of factoring (products of safe primes), or without a trusted setup using class groups. The argument system can heuristically be made non-interactive using the Fiat-Shamir transform. Our proof builds on DARK (Bünz et al., Eurocrypt 2020), a recent succinct and efficiently verifiable polynomial commitment scheme. We show how to implement a variant of DARK in a time- and space-efficient way. Along the way we: 1. Identify a significant gap in the proof of security of DARK. 2. Give a non-trivial modification of the DARK scheme that overcomes the aforementioned gap. The modified version also relies on significantly weaker cryptographic assumptions than those in the original DARK scheme. Our proof utilizes ideas from the theory of integer lattices in a novel way. 3. Generalize Pietrzak's (ITCS 2019) proof of exponentiation (PoE) protocol to work with general groups of unknown order (without relying on any cryptographic assumption). In proving these results, we develop general-purpose techniques for working with (hidden-order) groups, which may be of independent interest.
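A hedged sketch of the kind of proof-of-exponentiation protocol referred to in item 3: the prover convinces the verifier that y = x^(2^T) in a group of unknown order, using only logarithmically many group elements, by repeatedly halving the exponent. The toy modulus, the interleaving of prover and verifier in one function, and the assumption that T is a power of two are simplifications for illustration, not the paper's generalized protocol.

```python
# Pietrzak-style proof of exponentiation (PoE), interactive and simplified.
# Claim: y = x^(2^T) in Z_N^*. In practice N would be an RSA or class group
# with unknown order, and Fiat-Shamir would remove the interaction.
import random

N = 3233 * 4391          # toy modulus; real instantiations hide the factorization

def square_iter(x, T):
    """Compute x^(2^T) mod N by repeated squaring."""
    for _ in range(T):
        x = (x * x) % N
    return x

def prove_and_verify(x, y, T):
    """Interactive PoE with prover and verifier roles interleaved; T a power of 2."""
    while T > 1:
        half = T // 2
        mu = square_iter(x, half)          # prover's message: x^(2^(T/2))
        r = random.randrange(1, 2**16)     # verifier's random challenge
        x = (pow(x, r, N) * mu) % N        # both sides fold the claim in half:
        y = (pow(mu, r, N) * y) % N        #   new claim is y' = x'^(2^(T/2))
        T = half
    return y == (x * x) % N                # base case: check y = x^2 directly

if __name__ == "__main__":
    x, T = 7, 1024
    y = square_iter(x, T)
    print("honest claim accepted:", prove_and_verify(x, y, T))
    print("false claim accepted: ", prove_and_verify(x, (y + 1) % N, T))
```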
2021
TCC
Acyclicity Programming for Sigma-Protocols
Cramer, Damgård, and Schoenmakers (CDS) built a proof system to demonstrate the possession of subsets of witnesses for a given collection of statements that belong to a prescribed access structure P by composing so-called sigma-protocols for each atomic statement. Their verifier complexity is linear in the size of the monotone span program representation of P. We propose an alternative method for combining sigma-protocols into a single non-interactive system for a compound statement in the random oracle model. In contrast to CDS, our verifier complexity is linear in the size of the acyclicity program representation of P, a complete model of monotone computation introduced in this work. We show that the acyclicity program size of a predicate is never larger than its de Morgan formula size and it is polynomially incomparable to its monotone span program size. We additionally present an extension of our proof system, with verifier complexity linear in the monotone circuit size of P, in the common reference string model. Finally, considering the types of statement that naturally reduce to acyclicity programming, we discuss several applications of our new methods to protecting privacy in cryptocurrency and social networks.
2021
JOFC
Limits on the Efficiency of (Ring) LWE-Based Non-interactive Key Exchange
$\mathsf{LWE}$-based key-exchange protocols lie at the heart of post-quantum public-key cryptography. However, all existing protocols either lack the non-interactive nature of Diffie–Hellman key exchange or a polynomial $\mathsf{LWE}$-modulus, resulting in unwanted efficiency overhead. We study the possibility of designing non-interactive $\mathsf{LWE}$-based protocols with polynomial $\mathsf{LWE}$-modulus. To this end, we identify and formalize simple non-interactive and polynomial $\mathsf{LWE}$-modulus variants of the existing protocols, where Alice and Bob simultaneously exchange one or more (ring) $\mathsf{LWE}$ samples with polynomial $\mathsf{LWE}$-modulus and then run individual key reconciliation functions to obtain the shared key. We point out central barriers and show that such non-interactive key-exchange protocols are impossible in either of the following cases: (1) the reconciliation functions first compute the inner product of the received $\mathsf{LWE}$ sample with their private $\mathsf{LWE}$ secret. This impossibility is information theoretic. (2) One of the reconciliation functions does not depend on the error of the transmitted $\mathsf{LWE}$ sample. This impossibility assumes hardness of $\mathsf{LWE}$. We show that progress toward either a polynomial $\mathsf{LWE}$-modulus $\text{NIKE}$ construction or a general impossibility result has implications for the current understanding of lattice-based cryptographic constructions. Overall, our results show possibilities and challenges in designing simple (ring) $\mathsf{LWE}$-based non-interactive key-exchange protocols.
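A sketch of the protocol template the abstract describes, with toy parameters: Alice and Bob simultaneously exchange plain (non-ring) LWE samples over a common public matrix A and each applies a rounding-based reconciliation to the inner product with its own secret. With a polynomial modulus the two inner products differ by the noise terms, so the rounded bits disagree with noticeable probability, which is in the spirit of the barrier the paper formalizes. Everything below (parameters, noise range, rounding rule) is an illustrative assumption.

```python
# Toy non-interactive LWE key-exchange template with inner-product + rounding
# reconciliation. Parameters are illustrative and offer no real security.
import random

q, n = 12289, 64

def small_vec(n):
    return [random.randint(-2, 2) for _ in range(n)]

def reconcile(v):
    # Round the shared value to one bit: 0 near 0, 1 near q/2.
    return int(q // 4 < v % q <= 3 * q // 4)

if __name__ == "__main__":
    A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]  # public matrix

    sA, eA = small_vec(n), small_vec(n)   # Alice's secret and noise
    sB, eB = small_vec(n), small_vec(n)   # Bob's secret and noise

    # Simultaneously exchanged LWE samples.
    bA = [(sum(sA[i] * A[i][j] for i in range(n)) + eA[j]) % q for j in range(n)]  # s_A^T A + e_A
    bB = [(sum(A[i][j] * sB[j] for j in range(n)) + eB[i]) % q for i in range(n)]  # A s_B + e_B

    # Each party reconciles the inner product with its own secret.
    kA = reconcile(sum(sA[i] * bB[i] for i in range(n)))
    kB = reconcile(sum(bA[j] * sB[j] for j in range(n)))
    print("Alice:", kA, "Bob:", kB, "agree:", kA == kB)
```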
2021
JOFC
Can PPAD Hardness be Based on Standard Cryptographic Assumptions?
Alon Rosen, Gil Segev, Ido Shahaf
We consider the question of whether PPAD hardness can be based on standard cryptographic assumptions, such as the existence of one-way functions or public-key encryption. This question is particularly well-motivated in light of new devastating attacks on obfuscation candidates and their underlying building blocks, which are currently the only known source for PPAD hardness. Central in the study of obfuscation-based PPAD hardness is the sink-of-verifiable-line (SVL) problem, an intermediate step in constructing instances of the PPAD-complete problem source-or-sink. Within the framework of black-box reductions, we prove the following results: (i) Average-case PPAD hardness (and even SVL hardness) does not imply any form of cryptographic hardness (not even one-way functions). Moreover, even when assuming the existence of one-way functions, average-case PPAD hardness (and, again, even SVL hardness) does not imply any public-key primitive. Thus, strong cryptographic assumptions (such as obfuscation-related ones) are not essential for average-case PPAD hardness. (ii) Average-case SVL hardness cannot be based either on standard cryptographic assumptions or on average-case PPAD hardness. In particular, average-case SVL hardness is not essential for average-case PPAD hardness. (iii) Any attempt to base the average-case hardness of the PPAD-complete problem source-or-sink on standard cryptographic assumptions must result in instances with a nearly exponential number of solutions. This stands in striking contrast to the obfuscation-based approach, which results in instances having a unique solution. Taken together, our results imply that it may still be possible to base PPAD hardness on standard cryptographic assumptions, but any such black-box attempt must significantly deviate from the obfuscation-based approach: It cannot go through the SVL problem, and it must result in source-or-sink instances with a nearly exponential number of solutions.
2020
EUROCRYPT
Fine-Grained Cryptography: A New Frontier?
Alon Rosen
Fine-grained cryptography is concerned with adversaries that are only moderately more powerful than the honest parties. We will survey recent results in this relatively underdeveloped area of study and examine whether the time is ripe for further advances in it.
2020
PKC
Limits on the Efficiency of (Ring) LWE Based Non-interactive Key Exchange
$\mathsf{LWE}$-based key-exchange protocols lie at the heart of post-quantum public-key cryptography. However, all existing protocols either lack the non-interactive nature of Diffie-Hellman key exchange or a polynomial $\mathsf{LWE}$-modulus, resulting in unwanted efficiency overhead. We study the possibility of designing non-interactive $\mathsf{LWE}$-based protocols with polynomial $\mathsf{LWE}$-modulus. To this end, we identify and formalize simple non-interactive and polynomial $\mathsf{LWE}$-modulus variants of existing protocols, where Alice and Bob simultaneously exchange one or more (ring) $\mathsf{LWE}$ samples with polynomial $\mathsf{LWE}$-modulus and then run individual key reconciliation functions to obtain the shared key. We point out central barriers and show that such non-interactive key-exchange protocols are impossible if: (1) the reconciliation functions first compute the inner product of the received $\mathsf{LWE}$ sample with their private $\mathsf{LWE}$ secret. This impossibility is information theoretic. (2) One of the reconciliation functions does not depend on the error of the transmitted $\mathsf{LWE}$ sample. This impossibility assumes hardness of $\mathsf{LWE}$. We give further evidence that progress in either direction, namely giving an $\mathsf{LWE}$-based $\mathrm{NIKE}$ protocol or proving the impossibility of one, will lead to progress on other well-studied questions in cryptography. Overall, our results show possibilities and challenges in designing simple (ring) $\mathsf{LWE}$-based non-interactive key-exchange protocols.
2020
TCC
Public-Coin Zero-Knowledge Arguments with (almost) Minimal Time and Space Overheads
Zero-knowledge protocols enable the truth of a mathematical statement to be certified by a verifier without revealing any other information. Such protocols are a cornerstone of modern cryptography and are becoming increasingly practical. However, a major bottleneck in deployment is the efficiency of the prover and, in particular, the space-efficiency of the protocol. For every $\mathsf{NP}$ relation that can be verified in time $T$ and space $S$, we construct a public-coin zero-knowledge argument in which the prover runs in time $T \cdot \mathrm{polylog}(T)$ and space $S \cdot \mathrm{polylog}(T)$. Our proofs have length $\mathrm{polylog}(T)$ and the verifier runs in time $T \cdot \mathrm{polylog}(T)$ (and space $\mathrm{polylog}(T)$). Our scheme is in the random oracle model and relies on the hardness of discrete log in prime-order groups. Our main technical contribution is a new space-efficient polynomial commitment scheme for multi-linear polynomials. Recall that in such a scheme, a sender commits to a given multi-linear polynomial $P:\mathbb{F}^n \to \mathbb{F}$ so that later on it can prove to a receiver statements of the form "$P(x)=y$". In our scheme, which builds on commitment schemes of Bootle et al. (Eurocrypt 2016) and Bünz et al. (S&P 2018), we assume that the sender is given multi-pass streaming access to the evaluations of $P$ on the Boolean hypercube and we show how to implement both the sender and receiver in roughly time $2^n$ and space $n$ and with communication complexity roughly $n$.
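To illustrate the streaming setting in the abstract, the sketch below evaluates the multilinear extension of $P$ at a point in $\mathbb{F}^n$ using a single pass over the $2^n$ hypercube evaluations and only $O(n)$ field elements of memory. This is only the evaluation step, not the commitment scheme itself; the field, the table, and the bit-ordering convention are assumptions for illustration.

```python
# One-pass, O(n)-space evaluation of a multilinear extension:
#   MLE(x) = sum over b in {0,1}^n of P(b) * prod_i (b_i * x_i + (1 - b_i) * (1 - x_i))
PRIME = 2**61 - 1   # toy prime field

def mle_eval_streaming(stream, x):
    """stream yields P(b) for b = 0, 1, ..., 2^n - 1; bit i of b is coordinate i."""
    n = len(x)
    total = 0
    for b, pb in enumerate(stream):
        # Lagrange weight of the cube point b at x.
        w = 1
        for i in range(n):
            bit = (b >> i) & 1
            w = w * (x[i] if bit else (1 - x[i])) % PRIME
        total = (total + pb * w) % PRIME
    return total

if __name__ == "__main__":
    table = [3, 1, 4, 1, 5, 9, 2, 6]   # P on {0,1}^3, indexed by the bits of b
    x = [7, 11, 13]                    # evaluation point in F^3
    print(mle_eval_streaming(iter(table), x))
    # Sanity check: at a Boolean point the MLE agrees with the table.
    print(mle_eval_streaming(iter(table), [1, 0, 1]), "==", table[0b101])
```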
2020
ASIACRYPT
Cryptography from One-Way Communication: On Completeness of Finite Channels
Garg et al. (Crypto 2015) initiated the study of cryptographic protocols over noisy channels in the non-interactive setting, namely when only one party speaks. A major question left open by this work is the completeness of finite channels, whose input and output alphabets do not grow with the desired level of security. In this work, we address this question by obtaining the following results: Completeness of Bit-ROT with Inverse Polynomial Error: We show that bit-ROT (i.e., Randomized Oblivious Transfer channel, where each of the two messages is a single bit) can be used to realize general randomized functionalities with inverse polynomial error. Towards this, we provide a construction of string-ROT from bit-ROT with inverse polynomial error. No Finite Channel is Complete with Negligible Error: To complement the above, we show that no finite channel can be used to realize string-ROT with negligible error, implying that the inverse polynomial error in the completeness of bit-ROT is inherent. This holds even with semi-honest parties and for computational security, and is contrasted with the (negligible-error) completeness of string-ROT shown by Garg et al. Characterization of Finite Channels Enabling Zero-Knowledge Proofs: An important instance of secure computation is zero-knowledge proofs. Noisy channels can potentially be used to realize truly non-interactive zero-knowledge proofs, without trusted common randomness, and with non-transferability and deniability features that cannot be realized in the plain model. Garg et al. obtain such zero-knowledge proofs from the binary erasure channel (BEC) and the binary symmetric channel (BSC). We complete the picture by showing that in fact any non-trivial channel suffices.
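For readers unfamiliar with the channel in question, the sketch below writes out the bit-ROT channel as an ideal functionality: the sender feeds in two message bits, the channel delivers a uniformly random one of them (with its index) to the receiver, and the sender learns nothing about which. This is only the channel interface, written as a plausible formalization; the reductions between bit-ROT and string-ROT are the paper's contribution and are not reproduced here.

```python
# Ideal bit-ROT channel and a naive (insufficient) bitwise use of it.
import secrets

def bit_rot_channel(m0: int, m1: int):
    """Ideal bit-ROT: returns the receiver's view (b, m_b); the sender's view is empty."""
    b = secrets.randbelow(2)
    return b, (m0, m1)[b]

def string_rot_naive(s0, s1):
    """Calling bit-ROT coordinate-wise does NOT give string-ROT: the receiver
    ends up with a random mix of bits from both strings rather than exactly
    one full string, which is why a genuine reduction is needed."""
    return [bit_rot_channel(x, y) for x, y in zip(s0, s1)]

if __name__ == "__main__":
    print(bit_rot_channel(0, 1))
    print(string_rot_naive([0, 1, 1, 0], [1, 1, 0, 0]))
```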
2020
ASIACRYPT
Non-Interactive Composition of Sigma-Protocols via Share-then-Hash
Proofs of partial knowledge demonstrate the possession of certain subsets of witnesses for a given collection of statements $x_1, \dots, x_n$. Cramer, Damgård, and Schoenmakers (CDS) built proofs of partial knowledge, given "atomic" protocols for individual statements $x_i$, by having the prover randomly secret share the verifier's challenge and using the shares as challenges for the atomic protocols. This simple and highly influential transformation has been used in numerous applications, ranging from anonymous credentials to ring signatures. We consider what happens if, instead of using the shares directly as challenges, the prover first hashes them. We show that this elementary enhancement can result in significant benefits: - the proof contains a single atomic transcript per statement $x_i$, - it suffices that the atomic protocols are $k$-special sound for $k \geq 2$, - when compiled using the Fiat-Shamir heuristic, the protocol retains its soundness in the non-programmable random oracle model. None of the above features is satisfied by the CDS transformation.
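A hedged sketch of the challenge-sharing step for the simplest case, an OR of n statements: the prover additively (XOR) shares the verifier's challenge, fixing in advance the shares of the branches it simulates, and in the share-then-hash variant each share is passed through a hash before being used as the atomic challenge. Only this sharing step is shown; the atomic sigma-protocols, the general access structures, and the security argument are abstracted away, and the challenge length and hash choice are assumptions.

```python
# Challenge sharing for an OR composition, plus the "share-then-hash" twist.
import hashlib
import secrets

CHALLENGE_BYTES = 16

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def xor_all(values):
    out = bytes(CHALLENGE_BYTES)
    for v in values:
        out = xor(out, v)
    return out

def share_challenge(challenge: bytes, n: int, known_index: int, presets: dict):
    """XOR sharing: shares of simulated branches are the preset values chosen
    during simulation; the branch with a known witness absorbs the rest."""
    shares = dict(presets)
    acc = challenge
    for j in range(n):
        if j != known_index:
            acc = xor(acc, shares[j])
    shares[known_index] = acc
    assert xor_all(shares.values()) == challenge   # shares reconstruct the challenge
    return shares

def hashed_challenges(shares: dict):
    """Share-then-hash: the atomic challenge for branch j is H(j, share_j)."""
    return {j: hashlib.sha256(bytes([j]) + s).digest()[:CHALLENGE_BYTES]
            for j, s in shares.items()}

if __name__ == "__main__":
    n, known = 3, 1
    presets = {j: secrets.token_bytes(CHALLENGE_BYTES) for j in range(n) if j != known}
    c = secrets.token_bytes(CHALLENGE_BYTES)   # the verifier's challenge
    shares = share_challenge(c, n, known, presets)
    print(hashed_challenges(shares))
```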
2018
EUROCRYPT
2018
CRYPTO
Proofs of Work From Worst-Case Assumptions
We give Proofs of Work (PoWs) whose hardness is based on well-studied worst-case assumptions from fine-grained complexity theory. This extends the work of (Ball et al., STOC ’17), which presents PoWs that are based on the Orthogonal Vectors, 3SUM, and All-Pairs Shortest Path problems. These, however, were presented as a ‘proof of concept’ of provably secure PoWs and did not fully meet the requirements of a conventional PoW: namely, it was not shown that multiple proofs could not be generated faster than generating each individually. We use the considerable algebraic structure of these PoWs to prove that this non-amortizability of multiple proofs does in fact hold, and further show that the PoWs’ structure can be exploited in ways previous heuristic PoWs could not. This creates full PoWs that are provably hard from worst-case assumptions (previously, PoWs were either only based on heuristic assumptions or on much stronger cryptographic assumptions (Bitansky et al., ITCS ’16)) while still retaining significant structure to enable extra properties of our PoWs. Namely, we show that the PoWs of (Ball et al., STOC ’17) can be modified to have much faster verification time, can be proved in zero knowledge, and more. Finally, as our PoWs are based on evaluating low-degree polynomials originating from average-case fine-grained complexity, we prove an average-case direct sum theorem for the problem of evaluating these polynomials, which may be of independent interest. For our context, this implies the required non-amortizability of our PoWs.
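To give a taste of the "low-degree polynomials originating from average-case fine-grained complexity", the sketch below evaluates the standard orthogonal-vectors polynomial over a prime field; on Boolean inputs it counts orthogonal pairs, and its low degree is what enables random self-reduction and efficient verification. The actual PoW mechanics (challenges, batching, interpolation-based verification) are not reproduced here, and the field and inputs are illustrative.

```python
# The orthogonal-vectors polynomial
#   f_OV(U, V) = sum_{u in U, v in V} prod_d (1 - u_d * v_d)  over F_PRIME,
# which counts orthogonal pairs when the inputs are 0/1 vectors.
PRIME = 2**31 - 1

def ov_polynomial(U, V):
    """Evaluate f_OV over F_PRIME; U and V are lists of length-d vectors."""
    total = 0
    for u in U:
        for v in V:
            term = 1
            for ud, vd in zip(u, v):
                term = term * (1 - ud * vd) % PRIME
            total = (total + term) % PRIME
    return total

if __name__ == "__main__":
    U = [[1, 0, 1], [0, 1, 0]]
    V = [[0, 1, 0], [1, 0, 1]]
    # On Boolean inputs, f_OV counts orthogonal pairs: here (u1, v1) and (u2, v2).
    print(ov_polynomial(U, V))   # expected: 2
```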
2017
TCC
2017
TCC
2017
TCC
2016
TCC
2016
TCC
2016
TCC
2016
JOFC
2015
TCC
2015
TCC
2015
TCC
2014
CRYPTO
2014
TCC
2014
CHES
2014
FSE
2013
CRYPTO
2012
TCC
2012
TCC
2012
EUROCRYPT
2011
TCC
2010
PKC
2010
ASIACRYPT
2009
TCC
2009
TCC
2009
TCC
2008
FSE
2007
JOFC
2006
TCC
2006
JOFC
2005
EUROCRYPT
2004
TCC
2004
TCC
2000
CRYPTO

Program Committees

Crypto 2024
TCC 2024
TCC 2022
TCC 2021
TCC 2020
Crypto 2020
TCC 2019 (Program chair)
TCC 2019
Eurocrypt 2018
TCC 2018
Eurocrypt 2015
Asiacrypt 2014
TCC 2013
PKC 2012
TCC 2010
PKC 2010
Crypto 2008
Eurocrypt 2007
TCC 2005