Presentation

GuardNN: Secure Accelerator Architecture for Privacy-Preserving Deep Learning
Time
Tuesday, July 12th, 3:30pm - 3:54pm PDT
Location
3006, Level 3
Event Type
Research Manuscript
Keywords
AI/ML Security/Privacy
Topics
AI
Description
This paper proposes GuardNN, a secure DNN accelerator that provides strong hardware-based protection for user data and model parameters even in an untrusted environment. GuardNN shows that the architecture and its protection mechanisms can be customized for a specific application to provide strong confidentiality and integrity guarantees with negligible overhead. The design of the GuardNN instruction set reduces the TCB to just the accelerator. GuardNN also introduces a new application-specific memory protection scheme that minimizes the overhead of memory encryption and integrity verification. The scheme removes most of the off-chip metadata by exploiting the known memory access patterns of a DNN accelerator.
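The metadata-elimination idea in the abstract can be illustrated with a small sketch. In conventional counter-mode memory encryption, a per-block version counter must be stored off-chip and fetched on every access; if the accelerator's access pattern is known in advance, that version can instead be computed on-chip from the execution schedule (e.g., the layer index and inference count). The sketch below is not the GuardNN design itself but a minimal counter-mode model assuming a hypothetical `version_for` schedule function; the keystream is derived with SHA-256 for self-containment, standing in for a hardware block cipher.

```python
import hashlib

def version_for(layer: int, inference: int) -> int:
    # Hypothetical schedule function: a DNN accelerator writes each
    # buffer a predictable number of times, so the block version can
    # be derived from (layer, inference) instead of stored off-chip.
    return inference * 1024 + layer

def keystream(key: bytes, version: int, addr: int, n: int) -> bytes:
    # Counter-mode style keystream over (key, version, address).
    # SHA-256 stands in for the block cipher a real design would use.
    seed = key + version.to_bytes(8, "big") + addr.to_bytes(8, "big")
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(seed + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def xcrypt(key: bytes, version: int, addr: int, data: bytes) -> bytes:
    # XOR with the keystream; the same call encrypts and decrypts.
    ks = keystream(key, version, addr, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

if __name__ == "__main__":
    key = b"\x00" * 16
    v = version_for(layer=3, inference=0)   # computed, not fetched
    ct = xcrypt(key, v, 0x1000, b"activation data")
    pt = xcrypt(key, v, 0x1000, ct)
    assert pt == b"activation data"
```

Because `version_for` is a pure function of the schedule, no version table needs to be read from DRAM, which is the source of most counter-mode metadata traffic that the abstract's scheme targets.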