auto-draft-216@NDSS

Total: 1

#1 Hiding My Real Self! Protecting Intellectual Property in Additive Manufacturing Systems Against Optical Side-Channel Attacks

Authors: Sizhuang Liang (Georgia Institute of Technology); Saman Zonouz (Rutgers University); Raheem Beyah (Georgia Institute of Technology)

We propose an optical side-channel attack to recover intellectual property in Additive Manufacturing (AM) systems. Specifically, we use a deep neural network to estimate the coordinates of the printhead as a function of time by analyzing video of the printer frame by frame. We found that the deep neural network can successfully recover the printing path for an arbitrary printing process. With data augmentation, the neural network can tolerate a certain level of variation in the camera's position and angle as well as in the lighting conditions. The neural network can also interpolate, accurately recovering the coordinates for frames that do not appear in the training dataset.

To defend against this optical side-channel attack, we propose an optical noise injection method: an optical projector artificially injects carefully crafted optical noise onto the printing area to confuse the attacker and make it harder to recover the printing path. We found that existing noise generation algorithms, such as replaying, random blobs, white noise, and full power, can effortlessly defeat a naive attacker who is unaware of the injected noise. However, an advanced attacker who knows about the injected noise and incorporates images with injected noise into the training dataset can defeat all of the existing noise generation algorithms. To defend against such an advanced attacker, we propose three novel noise generation algorithms: channel uniformization, state uniformization, and state randomization. Our experimental results show that noise generated via state randomization can successfully defend against the advanced attacker.
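
As a rough illustration of the attack pipeline described in the abstract, the sketch below shows how a small convolutional network could regress the printhead's (x, y) coordinates from individual video frames, with standard augmentations standing in for variation in camera pose and lighting. This is a minimal sketch under our own assumptions, not the authors' implementation; the framework (PyTorch/torchvision), the architecture, and names such as PrintheadRegressor and train_step are hypothetical.

```python
# Hypothetical sketch (not the authors' code): a small CNN that regresses the
# printhead's (x, y) coordinates from a single video frame. Collecting the
# per-frame predictions over time reconstructs the printing path.
import torch
import torch.nn as nn
from torchvision import transforms


class PrintheadRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(128, 2)  # predict (x, y) in printer coordinates

    def forward(self, frame):
        return self.head(self.features(frame).flatten(1))


# Augmentation meant to mimic small changes in camera pose and lighting,
# so the model tolerates a camera that is not positioned exactly as in training.
augment = transforms.Compose([
    transforms.ColorJitter(brightness=0.3, contrast=0.3),
    transforms.RandomAffine(degrees=5, translate=(0.02, 0.02)),
])

model = PrintheadRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()


def train_step(frames, coords):
    """frames: (B, 3, H, W) float tensor in [0, 1]; coords: (B, 2) ground-truth positions."""
    optimizer.zero_grad()
    pred = model(augment(frames))
    loss = loss_fn(pred, coords)
    loss.backward()
    optimizer.step()
    return loss.item()
```

The defense side can be pictured with one of the baseline patterns the abstract names: "random blobs" projected onto the printing area and regenerated each frame, so a naive path-recovery model sees constantly changing clutter. Again, this is only an assumed illustration of that baseline, not the paper's proposed state-randomization algorithm.

```python
# Hypothetical illustration of the "random blobs" baseline noise pattern:
# a projector image filled with randomly placed bright ellipses.
import numpy as np


def random_blob_pattern(height=720, width=1280, n_blobs=20, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    ys, xs = np.mgrid[0:height, 0:width]
    pattern = np.zeros((height, width), dtype=np.float32)
    for _ in range(n_blobs):
        cy, cx = rng.uniform(0, height), rng.uniform(0, width)
        ry, rx = rng.uniform(20, 80), rng.uniform(20, 80)
        blob = ((ys - cy) / ry) ** 2 + ((xs - cx) / rx) ** 2 <= 1.0
        pattern[blob] = rng.uniform(0.5, 1.0)
    return pattern  # grayscale intensity image to send to the projector
```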