Contrastive Quant: Quantization Makes Stronger Contrastive Learning
Time: Tuesday, July 12th, 2:37pm - 3pm PDT
Location: 3000, Level 3
Event Type: Research Manuscript
Keywords: ML Algorithms and Applications
Topics: AI
Description: Contrastive learning learns visual representations by enforcing feature consistency across differently augmented views. In this work, we explore contrastive learning from a new perspective. Interestingly, we find that quantization, when properly engineered, can enhance the effectiveness of contrastive learning. To this end, we propose a novel contrastive learning framework, dubbed Contrastive Quant, which encourages feature consistency under both differently augmented inputs, via various data transformations, and differently augmented weights/activations, via various quantization levels. Extensive experiments, built on top of two state-of-the-art contrastive learning methods, SimCLR and BYOL, show that Contrastive Quant consistently improves the learned visual representations.
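
For a concrete picture of the idea, the sketch below is an illustrative approximation under stated assumptions, not the authors' implementation: it pairs a SimCLR-style NT-Xent loss with a simple uniform fake quantizer, and, as a simplification, quantizes only the encoder's output features rather than intermediate weights/activations. The helpers `fake_quantize` and `contrastive_quant_step`, the bit-width, and the loss weighting `alpha` are all hypothetical.

```python
import torch
import torch.nn.functional as F

def fake_quantize(x, num_bits=4):
    """Uniform fake quantization with a straight-through estimator.
    (Illustrative; the paper's exact quantizer may differ.)"""
    qmax = 2 ** num_bits - 1
    lo = x.min()
    scale = (x.max() - lo).clamp(min=1e-8) / qmax
    deq = ((x - lo) / scale).round().clamp(0, qmax) * scale + lo
    # Forward pass uses quantized values; backward passes gradients through unchanged.
    return x + (deq - x).detach()

def nt_xent(z1, z2, temperature=0.5):
    """SimCLR-style NT-Xent contrastive loss between two feature batches."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                      # (2N, d)
    sim = z @ z.t() / temperature                       # pairwise cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float("-inf"))               # exclude self-similarity
    # Row i's positive is the other view of the same sample.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def contrastive_quant_step(encoder, x1, x2, num_bits=4, alpha=0.5):
    """One training step: enforce consistency across augmented input views
    AND across quantization levels of the features (assumed formulation)."""
    z1, z2 = encoder(x1), encoder(x2)                   # two augmented input views
    zq1, zq2 = fake_quantize(z1, num_bits), fake_quantize(z2, num_bits)
    loss_views = nt_xent(z1, z2)                        # standard view consistency
    loss_quant = nt_xent(z1, zq2) + nt_xent(zq1, z2)    # full-precision vs. quantized
    return loss_views + alpha * loss_quant
```

In this reading, quantization acts as an additional "augmentation" in feature space: the combined loss pulls together representations that agree both across input views and across precision levels.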