Bake It Till You Make It

Heat-induced Power Leakage from Masked Neural Networks

Authors

  • Dev M. Mehta, Worcester Polytechnic Institute, Worcester, USA
  • Mohammad Hashemi, Worcester Polytechnic Institute, Worcester, USA
  • David S. Koblah, University of Florida, Gainesville, USA
  • Domenic Forte, University of Florida, Gainesville, USA
  • Fatemeh Ganji, Worcester Polytechnic Institute, Worcester, USA

DOI:

https://doi.org/10.46586/tches.v2024.i4.569-609

Keywords:

Side-channel Analysis, Masking, Neural Networks, Heat Generation, T-test, DPA, FPGA

Abstract

Masking has become one of the most effective approaches for securing hardware designs against side-channel attacks. Regardless of the effort put into correctly implementing masking schemes on a field-programmable gate array (FPGA), leakage can be unexpectedly observed. This is because the assumption underlying all masked designs, i.e., that the leakages of different shares are independent of each other, may no longer hold in practice. In this regard, extreme temperatures have been shown to be an important factor in inducing leakage, even in correctly masked designs. This has previously been verified using an external heat generator (i.e., a climate chamber). In this paper, we examine whether leakage can be induced using the circuit components themselves, without making any changes to the design. Specifically, we target masked neural networks (NNs) in FPGAs, one of the main building blocks of which is block random access memory (BRAM). Thanks to the inherent characteristics of NNs, our novel internal heat generators leverage solely the memories devoted to storing the user's input, especially when frequently writing alternating patterns into BRAMs. The possibility of observing first-order leakage is evaluated by considering one of the most recent and successful first-order secure masked NNs, namely ModuloNET. ModuloNET is specifically designed for FPGAs, where BRAMs are used to store inputs and intermediate computations. Our experimental results demonstrate that undesirable first-order leakage can be observed and exploited by increasing the temperature when an alternating input is applied to the masked NN. To give a better understanding of the impact of extreme heat, we further perform a similar test on the design using an external heat generator, from which a similar conclusion can be drawn.
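The abstract's core observation is that alternating write patterns into a BRAM maximize switching activity, and hence dynamic power and heat. The following minimal Python sketch (not from the paper's artifacts; the word width and pattern values are illustrative assumptions) models why: each transition between complementary patterns flips every bit of the word, whereas rewriting a constant flips none.

```python
WIDTH = 16  # assumed BRAM word width (illustrative)

def toggles(prev: int, curr: int) -> int:
    """Hamming distance between consecutive writes, i.e., bit flips,
    a standard proxy for dynamic switching power."""
    return bin(prev ^ curr).count("1")

def total_toggles(pattern_a: int, pattern_b: int, n_writes: int) -> int:
    """Total bit flips when alternately writing two patterns n_writes times."""
    total, prev = 0, pattern_a
    for i in range(1, n_writes):
        curr = pattern_b if i % 2 else pattern_a
        total += toggles(prev, curr)
        prev = curr
    return total

# Alternating complementary patterns (0xAAAA/0x5555) flips every bit
# on every write; rewriting a constant causes no switching at all.
assert total_toggles(0xAAAA, 0x5555, 100) == 99 * WIDTH
assert total_toggles(0xFFFF, 0xFFFF, 100) == 0
```

Under this toy model, the alternating-input strategy described in the abstract is the worst case for switching activity, which is consistent with its use as an internal heat generator.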

Published

2024-09-05

Section

Articles

How to Cite

Bake It Till You Make It: Heat-induced Power Leakage from Masked Neural Networks. (2024). IACR Transactions on Cryptographic Hardware and Embedded Systems, 2024(4), 569-609. https://doi.org/10.46586/tches.v2024.i4.569-609