Presentation

NN-LUT: Neural Approximation of Non-Linear Operations for Efficient Transformer Inference
Time: Wednesday, July 13th, 2:15pm - 2:37pm PDT
Location: 3002, Level 3
Event Type: Research Manuscript
Keywords: AI/ML Design: Circuits and Architecture
Topics: Design
Description: Non-linear operations such as GELU, layer normalization, and Softmax are essential yet costly building blocks of Transformer models. Several prior works have simplified these operations with look-up tables or integer computations, but such approximations suffer from inferior accuracy or incur considerable hardware cost and long latency. This paper proposes an accurate and hardware-friendly approximation framework for efficient Transformer inference. The framework employs a simple neural network as a universal approximator, with its structure equivalently transformed into a look-up table (LUT). The proposed framework, called NN-LUT, can accurately replace all the non-linear operations in popular BERT models with significant reductions in area, power consumption, and latency.
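
For intuition, the sketch below illustrates the core idea in the abstract: a one-hidden-layer ReLU network is piecewise linear, so its structure can be converted exactly into a breakpoint/slope look-up table. This is a minimal illustration, not the paper's actual code; the network size, fitting range, and plain gradient-descent training loop are assumptions chosen for demonstration.

```python
# Illustrative sketch (not the paper's implementation): fit GELU with a
# tiny one-hidden-layer ReLU network, then transform it exactly into a
# piecewise-linear LUT. All sizes and hyperparameters are assumptions.
import numpy as np

def gelu(x):
    # tanh-based GELU approximation commonly used in BERT
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

rng = np.random.default_rng(0)
H = 16                                   # hidden units -> at most H LUT breakpoints
w1, b1 = rng.normal(size=H), rng.normal(size=H)
w2, b2 = 0.1 * rng.normal(size=H), 0.0

xs = np.linspace(-6.0, 6.0, 2048)        # fitting range
ys = gelu(xs)

lr = 5e-3
for _ in range(10000):                   # plain full-batch gradient descent
    pre = np.outer(xs, w1) + b1          # (N, H) pre-activations
    h = np.maximum(pre, 0.0)             # ReLU
    err = h @ w2 + b2 - ys               # (N,) prediction error
    gh = np.outer(err, w2) * (pre > 0)   # backprop through ReLU
    w2 -= lr * (h.T @ err) / len(xs)
    b2 -= lr * err.mean()
    w1 -= lr * (gh * xs[:, None]).mean(axis=0)
    b1 -= lr * gh.mean(axis=0)

def net(z):
    z = np.atleast_1d(z)
    return np.maximum(np.outer(z, w1) + b1, 0.0) @ w2 + b2

# Exact LUT conversion: each ReLU unit kinks at z = -b1/w1 (assumes all
# w1 are nonzero), so the network is linear between sorted breakpoints.
bps = np.sort(-b1 / w1)
reps = np.concatenate(([bps[0] - 1.0],             # one sample point per segment
                       (bps[:-1] + bps[1:]) / 2.0,
                       [bps[-1] + 1.0]))
active = (np.outer(reps, w1) + b1) > 0             # active ReLU units per segment
slopes = active @ (w2 * w1)                        # per-segment slope
offsets = active @ (w2 * b1) + b2                  # per-segment intercept

def lut(z):
    # Segment lookup plus a single multiply-add per input.
    z = np.atleast_1d(z)
    idx = np.searchsorted(bps, z)
    return slopes[idx] * z + offsets[idx]

print("net-vs-LUT max gap :", np.max(np.abs(net(xs) - lut(xs))))   # ~ float eps
print("GELU fit max error :", np.max(np.abs(net(xs) - ys)))
```

The appeal of this transformation in hardware is that evaluating the LUT needs only a comparison to locate the segment and one multiply-add, rather than the transcendental arithmetic of the original operation; the first print confirms the network and its LUT form agree to floating-point precision.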