Transformed Data and Faster-SPDZ: Techniques for Privacy-Preserving Deep Learning in Real-World Applications
Time: Wednesday, July 13th, 6pm - 7pm PDT
Location: Level 2 Lobby
Description: Organizations employ privacy-preserving deep learning (PPDL) protocols to train their machine learning models ethically, without requiring private dataset holders to disclose their information. However, standard PPDL protocols such as secure multi-party computation and homomorphic encryption carry significant computational overhead, which limits their real-world deployment. This paper proposes two generalized approaches, 'Transformed Data' and 'Faster-SPDZ', that achieve accuracy comparable to models trained on raw data (92.3% and 91.53% with 'Transformed Data' on the FMNIST and Malaria Detection datasets, respectively) while reducing the computation latency of state-of-the-art PPDL protocols (3X and 12X speedups with 'Faster-SPDZ' on the FMNIST and Malaria Detection datasets, respectively).
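As background on the secure multi-party computation family the abstract refers to: SPDZ-style protocols rest on additive secret sharing, where each private value is split into random shares so that no single party learns anything, yet linear operations can be computed locally on shares. The sketch below illustrates only that basic sharing idea, assuming a power-of-two ring modulus; it is not the paper's 'Faster-SPDZ' optimization, whose details are not given in this abstract.

```python
import random

# Assumed ring size for illustration; SPDZ-style protocols work over a
# large ring or finite field, and this choice is a hypothetical example.
MODULUS = 2**64

def share(value: int, n_parties: int) -> list[int]:
    """Additively secret-share `value`: any subset of fewer than
    n_parties shares is uniformly random and reveals nothing."""
    shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Summing all shares modulo MODULUS recovers the secret."""
    return sum(shares) % MODULUS

# Secure addition: each party adds its two shares locally, with no
# communication; reconstructing the result yields the true sum.
a_shares = share(25, 3)
b_shares = share(17, 3)
sum_shares = [(a + b) % MODULUS for a, b in zip(a_shares, b_shares)]
assert reconstruct(sum_shares) == 25 + 17
```

Multiplications in SPDZ require extra precomputed correlated randomness (Beaver triples) and interaction, which is where much of the latency the paper targets comes from.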