Worst and Average Case Hardness of Decoding via Smoothing Bounds

Authors:
Thomas Debris-Alazard, Inria and Laboratoire d'Informatique de l'École Polytechnique (LIX)
Nicolas Resch, University of Amsterdam
Conference: PKC 2025
Abstract: In this work, we consider the worst- and average-case hardness of the decoding problems that are the basis for code-based cryptography. By a decoding problem, we refer to problems that take inputs of the form $$(\mathbf{G}, \mathbf{m}\mathbf{G} + \mathbf{t})$$ for a matrix $$\mathbf{G}$$ (which generates a code) and a noise vector $$\mathbf{t}$$, where the goal is to recover $$\mathbf{m}$$. We consider a natural strategy for creating a reduction to an average-case problem: from our input we simulate a Learning Parity with Noise (LPN) oracle, where we recall that LPN is essentially an average-case decoding problem with no a priori lower bound on the rate of the code. More formally, the oracle $$\mathcal{O}_{\mathbf{x}}$$ outputs independent samples of the form $$\langle \mathbf{x}, \mathbf{a} \rangle + e$$, where $$\mathbf{a}$$ is a uniformly random vector and $$e$$ is a noise bit. Such an approach is implicit in the previous worst-case to average-case reductions for coding problems (Brakerski et al., Eurocrypt 2019; Yu and Zhang, CRYPTO 2021).

To analyze the effectiveness of this reduction, we use a smoothing bound recently derived by Debris-Alazard et al. (IEEE IT 2023), which quantifies the simulation error of the reduction. It is worth noting that this latter work crucially uses a bound on the weight distribution of the code generated by $$\mathbf{G}$$, known as the second linear programming bound. Our approach, which is Fourier-analytic in nature, applies to any smoothing distribution (so long as it is radial); for our purposes, the best choice appears to be Bernoulli (although for the analysis it is most effective to study the uniform distribution over a sphere, and subsequently translate the bound back to the Bernoulli distribution via a truncation trick). Our approach works naturally when reducing from a worst-case instance, as well as from an average-case instance.

While we are unable to improve the parameters of the worst-case to average-case reductions of Brakerski et al. or Yu and Zhang, we think that our work highlights two important points. Firstly, in analyzing the average-case to average-case reduction we run into inherent limitations of this reduction template: essentially, it appears hopeless to reduce to an LPN instance whose noise rate is more than inverse-polynomially biased away from uniform. Secondly, we uncover a surprising weakness in the second linear programming bound: it is essentially useless in the regime of parameters where the rate of the code is inverse polynomial in the block-length. By highlighting these shortcomings, we hope to stimulate the development of new techniques for reductions between cryptographic decoding problems.
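To make the reduction template above concrete, here is a minimal sketch (in Python with NumPy) of how one might simulate an LPN sample from a decoding instance $$(\mathbf{G}, \mathbf{y} = \mathbf{m}\mathbf{G} + \mathbf{t})$$ using Bernoulli smoothing noise. This illustrates only the general template, not the paper's actual construction: the function name simulate_lpn_sample and the parameter tau are hypothetical, and the sketch rests on the identity $$\langle \mathbf{y}, \mathbf{e}' \rangle = \langle \mathbf{m}, \mathbf{G}\mathbf{e}' \rangle + \langle \mathbf{t}, \mathbf{e}' \rangle \pmod 2$$.

import numpy as np

rng = np.random.default_rng()

def simulate_lpn_sample(G, y, tau):
    # Sketch: one simulated LPN sample from a decoding instance over F_2.
    # G: k x n generator matrix (0/1 entries); y = m*G + t (length n);
    # tau: Bernoulli parameter of the smoothing distribution (hypothetical choice).
    k, n = G.shape
    e_prime = rng.binomial(1, tau, size=n)  # smoothing noise e' ~ Ber(tau)^n
    a = G.dot(e_prime) % 2                  # a = G e'; the smoothing bound controls
                                            #   how far a is from uniform
    b = int(y.dot(e_prime) % 2)             # <y, e'> = <m, a> + <t, e'> (mod 2)
    return a, b                             # LPN sample: secret m, noise bit <t, e'>

# Illustrative usage on a random (not worst-case) instance:
k, n, w = 10, 50, 5
G = rng.integers(0, 2, size=(k, n))
m = rng.integers(0, 2, size=k)
t = np.zeros(n, dtype=int)
t[rng.choice(n, size=w, replace=False)] = 1
y = (m.dot(G) + t) % 2
a, b = simulate_lpn_sample(G, y, tau=0.05)

If $$\mathbf{t}$$ has Hamming weight $$w$$ and $$\mathbf{e}' \sim \mathrm{Ber}(\tau)^n$$, the piling-up lemma gives $$\Pr[\langle \mathbf{t}, \mathbf{e}' \rangle = 1] = \tfrac{1}{2}\left(1 - (1-2\tau)^w\right)$$. This makes the tension described in the abstract visible: smoothing (driving $$\mathbf{G}\mathbf{e}'$$ close to uniform) pushes $$\tau$$ up, while a usable LPN instance needs the noise bias $$(1-2\tau)^w$$ to remain non-negligible.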
BibTeX
@inproceedings{pkc-2025-35162,
  title={Worst and Average Case Hardness of Decoding via Smoothing Bounds},
  booktitle={Public-Key Cryptography -- PKC 2025},
  publisher={Springer-Verlag},
  author={Thomas Debris-Alazard and Nicolas Resch},
  year=2025
}