SHAPER: A General Architecture for Privacy-Preserving Primitives in Secure Machine Learning
DOI:
https://doi.org/10.46586/tches.v2024.i2.819-843

Keywords:
Privacy-Preserving Machine Learning, Multi-Party Computation, Additive Homomorphic Encryption, Hardware Accelerator

Abstract
Secure multi-party computation and homomorphic encryption are two primary security primitives in privacy-preserving machine learning, but their wide adoption is constrained by computation and network-communication overheads. This paper proposes a hybrid Secret-sharing and Homomorphic encryption Architecture for Privacy-presERving machine learning (SHAPER). SHAPER protects sensitive data in the encrypted or randomly shared domain instead of relying on a trusted third party. The proposed algorithm-protocol-hardware co-design methodology explores techniques such as plaintext Single Instruction Multiple Data (SIMD) batching and fine-grained scheduling to minimize end-to-end latency across various network settings. SHAPER also accelerates secure-domain computing and the conversion between mainstream privacy-preserving primitives, making it suitable for both general and distinctive data characteristics. SHAPER is evaluated by FPGA prototyping with a comprehensive hyper-parameter exploration, demonstrating a 94x speed-up over CPU clusters on large-scale logistic-regression training tasks.
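To make the secret-sharing primitive in the abstract concrete: in additive secret sharing, a value is split into random shares held by different parties, and linear operations can be computed on the shares locally without communication. The sketch below is illustrative only and is not taken from the paper; the ring modulus and two-party setup are assumptions for the example.

```python
import secrets

MODULUS = 2**32  # illustrative ring size; not specified by the paper


def share(value, n_parties=2):
    """Split `value` into additive shares that sum to it mod MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares


def reconstruct(shares):
    """Recombine additive shares into the original value."""
    return sum(shares) % MODULUS


# Secure addition: each party adds its two shares locally;
# no network communication is needed for linear operations.
x_shares = share(7)
y_shares = share(5)
z_shares = [(a + b) % MODULUS for a, b in zip(x_shares, y_shares)]
assert reconstruct(z_shares) == 12
```

Non-linear steps (e.g., multiplications in logistic-regression training) are where such schemes incur the communication overhead the abstract refers to, which motivates hybrid designs that convert between secret sharing and homomorphic encryption.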
License
Copyright (c) 2024 Ziyuan Liang, Qi'ao Jin, Zhiyong Wang, Zhaohui Chen, Zhen Gu, Yanheng Lu, Fan Zhang
This work is licensed under a Creative Commons Attribution 4.0 International License.