Hardware-Efficient Stochastic Rounding Unit Design for DNN Training
Time: Wednesday, July 13th, 6pm - 7pm PDT
Location: Level 2 Lobby
Late Breaking Results Poster
Description: Stochastic rounding is crucial in the training of low-bit deep neural networks (DNNs) to achieve high accuracy. Unfortunately, prior studies require a large number of high-precision stochastic rounding units (SRUs) to guarantee low-bit DNN accuracy, which incurs considerable hardware overhead. In this paper, we propose an automated framework to explore hardware-efficient low-bit SRUs (ESRUs) that can still generate high-quality random numbers to guarantee the accuracy of low-bit DNN training. Experimental results using state-of-the-art DNN models demonstrate that, compared to the prior 24-bit SRU with a 24-bit pseudo-random number generator (PRNG), our 8-bit ESRU with a 3-bit PRNG reduces SRU resource usage by 9.75× while achieving higher accuracy.
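For readers unfamiliar with the operation the SRU implements, here is a minimal NumPy sketch of stochastic rounding itself, not the proposed ESRU hardware design: a value is rounded up or down at random, with probability proportional to its distance to each neighboring grid point, so the rounding is unbiased in expectation. The function name, the choice of a fixed-point grid with `frac_bits` fractional bits, and the use of a software uniform generator in place of a hardware PRNG are all illustrative assumptions.

```python
import numpy as np

def stochastic_round(x, frac_bits, rng):
    """Stochastically round x to a fixed-point grid with `frac_bits`
    fractional bits (grid step 2**-frac_bits). Each value is rounded
    down with probability (1 - frac) and up with probability frac,
    where frac is its fractional part on the grid, so the result is
    unbiased: E[stochastic_round(x)] == x.
    Illustrative sketch; a hardware SRU would draw `r` from a PRNG."""
    scale = 2.0 ** frac_bits
    scaled = np.asarray(x, dtype=np.float64) * scale
    floor = np.floor(scaled)
    frac = scaled - floor                 # distance to the lower grid point, in [0, 1)
    r = rng.random(np.shape(scaled))      # uniform draw standing in for the PRNG output
    return (floor + (r < frac)) / scale

# Usage: round 0.3 onto a grid with step 0.25 many times; each draw lands
# on 0.25 or 0.5, and the average converges to the original value 0.3.
rng = np.random.default_rng(0)
y = stochastic_round(np.full(100_000, 0.3), frac_bits=2, rng=rng)
```

Unlike round-to-nearest, which would map every 0.3 to 0.25 and accumulate a systematic bias, the stochastic variant preserves small gradient updates on average, which is why it matters for low-bit DNN training.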