CryptoDB
Aparna Gupte
Publications
CRYPTO 2024
How to Construct Quantum FHE, Generically
Abstract
We construct a (compact) quantum fully homomorphic encryption (QFHE) scheme starting from {\em any} (classical) fully homomorphic encryption scheme (with decryption in $\mathsf{NC}^{1}$) together with a dual-mode trapdoor claw-free function family. Compared to previous constructions (Mahadev, FOCS 2018; Brakerski, CRYPTO 2018) which made non-black-box use of similar underlying primitives, our construction provides a pathway to instantiations from different assumptions. Our construction uses the techniques of Dulek, Schaffner and Speelman (CRYPTO 2016) and shows how to make the client in their QFHE scheme classical using claw-free trapdoor functions. As an additional contribution, we show a new instantiation of dual-mode trapdoor claw-free functions from group actions.
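For readers unfamiliar with the primitive, the sketch below is an informal illustration (not the paper's formalism) of the standard syntax of a dual-mode trapdoor claw-free function family; the class and method names are hypothetical.

# Rough interface sketch (an assumption for illustration, not the paper's
# definitions) of a dual-mode trapdoor claw-free (TCF) function family.
# In "2-to-1" mode, each image has exactly two preimages, i.e. a claw
# (x_0, x_1) with f(0, x_0) = f(1, x_1), which is hard to find without the
# trapdoor; in "injective" mode the function is injective. The two modes are
# computationally indistinguishable given only the public key.
from abc import ABC, abstractmethod

class DualModeTCF(ABC):
    @abstractmethod
    def keygen(self, mode: str):
        """Return (public_key, trapdoor); mode is '2-to-1' or 'injective'."""

    @abstractmethod
    def evaluate(self, public_key, bit: int, x):
        """Compute f_{pk, bit}(x); efficiently computable by anyone."""

    @abstractmethod
    def invert(self, trapdoor, bit: int, y):
        """Recover the preimage of y under f_{pk, bit} using the trapdoor."""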
TCC 2024
Sparse Linear Regression and Lattice Problems
Abstract
Sparse linear regression (SLR) is a well-studied problem in statistics where one is given a design matrix $\mathbf{X} \in \mathbb{R}^{m \times n}$ and a response vector $\mathbf{y} = \mathbf{X} \boldsymbol{\theta}^* + \mathbf{w}$ for a $k$-sparse vector $\boldsymbol{\theta}^*$ (that is, $\|\boldsymbol{\theta}^*\|_0 \leq k$) and small, arbitrary noise $\mathbf{w}$. The goal is to find a $k$-sparse $\widehat{\boldsymbol{\theta}} \in \mathbb{R}^{n}$ that minimizes the mean squared prediction error $\frac{1}{m} \|\mathbf{X} \widehat{\boldsymbol{\theta}} - \mathbf{X} \boldsymbol{\theta}^*\|^2_2$. While $\ell_1$-relaxation methods such as basis pursuit, Lasso, and the Dantzig selector solve SLR when the design matrix is well-conditioned, no general algorithm is known, nor is there any formal evidence of hardness in an average-case setting with respect to all efficient algorithms.
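For illustration (not part of the paper), the following Python sketch builds a well-conditioned Gaussian SLR instance and recovers a sparse estimate with scikit-learn's Lasso; the dimensions, noise level, and regularization strength are arbitrary choices.

# Minimal illustrative sketch: a well-conditioned (isotropic Gaussian) SLR
# instance, solved via the L1-relaxation (Lasso). All parameters below are
# assumptions chosen for the example, not values from the paper.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
m, n, k = 200, 500, 10          # samples, dimension, sparsity (illustrative)

# Well-conditioned design matrix X with i.i.d. standard Gaussian entries.
X = rng.standard_normal((m, n))

# k-sparse ground truth theta* and small noise w.
theta_star = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
theta_star[support] = rng.standard_normal(k)
y = X @ theta_star + 0.01 * rng.standard_normal(m)

# Lasso succeeds in this regime because X satisfies a restricted
# eigenvalue condition with high probability.
theta_hat = Lasso(alpha=0.01, max_iter=10000).fit(X, y).coef_

# Mean squared prediction error, the objective from the abstract.
mspe = np.mean((X @ theta_hat - X @ theta_star) ** 2)
print(f"prediction error: {mspe:.4e}, nonzeros: {np.sum(theta_hat != 0)}")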
We give evidence of average-case hardness of SLR with respect to all efficient algorithms assuming the worst-case hardness of lattice problems. Specifically, we give an instance-by-instance reduction from a variant of the bounded distance decoding (BDD) problem on lattices to SLR, where the condition number of the lattice basis that defines the BDD instance is directly related to the restricted eigenvalue condition of the design matrix, which characterizes some of the classical statistical-computational gaps for sparse linear regression. Also, by appealing to worst-case to average-case reductions from the world of lattices, this shows hardness for a distribution of SLR instances; while the design matrices are ill-conditioned, the resulting SLR instances are in the identifiable regime.
Furthermore, for well-conditioned (essentially) isotropic Gaussian design matrices, where Lasso is known to behave well in the identifiable regime, we show hardness of outputting any good solution in the unidentifiable regime where there are many solutions, assuming the worst-case hardness of standard and well-studied lattice problems.
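For context, the standard bounded distance decoding problem (the paper reduces from a variant of it) is usually stated as follows:
\[
\mathrm{BDD}_{\alpha}:\ \text{given a basis } \mathbf{B} \text{ of a lattice } \mathcal{L} = \mathcal{L}(\mathbf{B}) \text{ and a target } \mathbf{t} = \mathbf{B}\mathbf{z} + \mathbf{e} \text{ with } \mathbf{z} \in \mathbb{Z}^{n} \text{ and } \|\mathbf{e}\|_2 \leq \alpha \cdot \lambda_1(\mathcal{L}), \text{ recover } \mathbf{z}.
\]
For $\alpha < 1/2$ the solution $\mathbf{z}$ is unique.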
Coauthors
- Aparna Gupte (2)
- Neekon Vafa (1)
- Vinod Vaikuntanathan (2)