Field Instruction Multiple Data
Authors: | Khin Mi Mi Aung, Enhui Lim, Jun Jie Sim, Benjamin Hong Meng Tan, Huaxiong Wang, Sze Ling Yeo |
---|---|
Download: | |
Presentation: | Slides |
Conference: | EUROCRYPT 2022 |
Abstract: | Fully homomorphic encryption~(FHE) has flourished since it was first constructed by Gentry~(STOC 2009). Single instruction multiple data~(SIMD) gave rise to efficient homomorphic operations on vectors in \((\mathbb{F}_{t^d})^\ell\), for prime \(t\). RLWE instantiated with cyclotomic polynomials of the form \(X^{2^N}+1\) dominates implementations of FHE due to highly efficient fast Fourier transforms. However, this choice yields very short SIMD plaintext vectors and high degree extension fields, e.g. \(\ell < 100, d > 100\) for small primes~(\(t = 3, 5, \dots\)). In this work, we describe a method to encode more data on top of SIMD, \emph{Field Instruction Multiple Data}, applying reverse multiplication friendly embedding~(RMFE) to FHE. With RMFE, length-\(k\) \(\mathbb{F}_{t}\) vectors can be encoded into \(\mathbb{F}_{t^d}\) and multiplied once. The results have to be recoded~(decoded and then re-encoded) before further multiplications can be done. We introduce an FHE-specific technique to additionally evaluate arbitrary linear transformations on encoded vectors for free during the FHE recode operation. On top of that, we present two optimizations to unlock high degree extension fields with small \(t\) for homomorphic computation: \(r\)-fold RMFE, which allows products of up to \(2^r\) encoded vectors before recoding, and a three-stage recode process for RMFEs obtained by composing two smaller RMFEs. Experiments were performed to evaluate the effectiveness of FIMD from various RMFEs compared to standard SIMD operations. Overall, we found that FIMD generally achieved \(>2\times\) better (amortized) multiplication times than standard SIMD for the same amount of data, while requiring almost \(k/2\times\) fewer ciphertexts. |
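The encode/multiply/recode pattern in the abstract can be made concrete with the textbook interpolation-based RMFE: a pair of \(\mathbb{F}_t\)-linear maps \(\phi: \mathbb{F}_t^k \to \mathbb{F}_{t^d}\) and \(\psi: \mathbb{F}_{t^d} \to \mathbb{F}_t^k\) satisfying \(\psi(\phi(x) \cdot \phi(y)) = x \odot y\) (the componentwise product). The Python sketch below only illustrates that pattern and is not the paper's construction; the toy parameters \(t = 7\), \(k = 3\), \(d = 2k - 1 = 5\), the evaluation points, and all function names are assumptions. Note this simple construction requires \(k \le t\), whereas the paper composes smaller RMFEs to reach much larger \(k\).

```python
# Minimal sketch of an interpolation-based RMFE, NOT the paper's
# implementation. Polynomials are coefficient lists over F_t, lowest
# degree first. Because deg(f * g) <= 2k - 2 < d, a single product of
# two encodings never wraps modulo the degree-d field polynomial, so
# plain polynomial arithmetic mod t suffices for this demo.

t, k = 7, 3                 # prime t, vector length k (needs k <= t)
d = 2 * k - 1               # extension degree supporting one product
points = list(range(k))     # k distinct evaluation points in F_t

def poly_mul(f, g):
    res = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            res[i + j] = (res[i + j] + a * b) % t
    return res

def poly_add(f, g):
    n = max(len(f), len(g))
    return [((f[i] if i < len(f) else 0) + (g[i] if i < len(g) else 0)) % t
            for i in range(n)]

def poly_eval(f, x):
    return sum(c * pow(x, i, t) for i, c in enumerate(f)) % t

def encode(vec):
    """phi: F_t^k -> F_t[X]_{<k}, interpolate f with f(points[i]) = vec[i]."""
    f = [0]
    for i, a in enumerate(points):
        num, denom = [1], 1
        for j, b in enumerate(points):
            if j != i:
                num = poly_mul(num, [(-b) % t, 1])   # multiply by (X - b)
                denom = (denom * (a - b)) % t
        scale = vec[i] * pow(denom, t - 2, t) % t    # invert denom via Fermat
        f = poly_add(f, [c * scale % t for c in num])
    return f

def decode(f):
    """psi: evaluate at the points; correct on products of at most
    two encodings, since deg(f * g) <= 2k - 2 < d."""
    return [poly_eval(f, a) for a in points]

def recode(f):
    """The recode step from the abstract: decode, then re-encode
    back down to degree < k so further multiplications are possible."""
    return encode(decode(f))

x, y = [1, 2, 3], [4, 5, 6]
prod = poly_mul(encode(x), encode(y))    # one multiplication per k values
assert decode(prod) == [a * b % t for a, b in zip(x, y)]   # [4, 3, 4]
z = recode(prod)                         # ready for further products
```

In this toy, tolerating products of up to \(2^r\) encodings before a recode amounts to picking \(d \ge 2^r(k-1) + 1\), which loosely mirrors the paper's \(r\)-fold RMFE. In the FHE setting, each \(\mathbb{F}_{t^d}\) SIMD slot carries one such encoding, and the recode is the operation during which the paper evaluates arbitrary linear transformations for free.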
Video from EUROCRYPT 2022
BibTeX
@inproceedings{eurocrypt-2022-31876,
  title={Field Instruction Multiple Data},
  publisher={Springer-Verlag},
  author={Khin Mi Mi Aung and Enhui Lim and Jun Jie Sim and Benjamin Hong Meng Tan and Huaxiong Wang and Sze Ling Yeo},
  year=2022
}