Towards Accurate Post-training Network Quantization via Bit-Split and Stitching

Jul 12, 2020

About

Network quantization is essential for deploying deep models to IoT devices due to its high efficiency, whether on specialized hardware such as TPUs or on general-purpose hardware such as CPUs and GPUs. Most existing quantization approaches rely on retraining to retain accuracy, a scheme referred to as quantization-aware training. However, this scheme assumes access to the training data, which is not always available. Moreover, retraining is a tedious and time-consuming procedure, which hinders the application of quantization to a wider range of tasks. Post-training quantization, on the other hand, does not have these problems. However, it has only been shown effective for 8-bit quantization, owing to its simple optimization strategy. In this paper, we propose a Bit-Split and Stitching framework for lower-bit post-training quantization with minimal accuracy degradation. The proposed framework is validated on a variety of computer vision tasks, including image classification, object detection, and instance segmentation, with various network architectures.
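
To make the setting concrete, below is a minimal NumPy sketch of the two ingredients the title refers to: a plain post-training quantizer, and a bit-split representation in which each low-bit integer weight is decomposed into ternary bits in {-1, 0, 1} and stitched back as a weighted sum of powers of two (q = sum_i 2^i * b_i). The quantizer, the greedy round-to-nearest splitting rule, and all function names are illustrative assumptions, not the paper's method; the paper's contribution is optimizing the split bits against the quantization error before stitching, which this sketch does not attempt.

# Minimal sketch (not the paper's method): symmetric uniform PTQ plus a
# greedy round-to-nearest split into ternary bits. All names and choices
# here are illustrative assumptions.
import numpy as np

def quantize(w, num_bits=4):
    # Symmetric uniform post-training quantization of a weight tensor.
    qmax = 2 ** (num_bits - 1) - 1           # e.g. 7 for 4-bit weights
    scale = np.abs(w).max() / qmax           # per-tensor scale (a simple choice)
    q = np.clip(np.round(w / scale), -qmax, qmax).astype(np.int32)
    return q, scale

def bit_split(q, num_bits=4):
    # Split signed integers into (num_bits - 1) ternary bits in {-1, 0, 1},
    # most significant bit first, subtracting each bit's contribution.
    bits, r = [], q.copy()
    for i in reversed(range(num_bits - 1)):
        b = np.clip(np.round(r / 2 ** i), -1, 1).astype(np.int32)
        bits.append((i, b))
        r = r - b * 2 ** i
    return bits

def stitch(bits):
    # Stitch ternary bits back into integers: q = sum_i 2^i * b_i.
    return sum((2 ** i) * b for i, b in bits)

w = np.random.randn(64, 64).astype(np.float32)
q, scale = quantize(w)
assert np.array_equal(stitch(bit_split(q)), q)   # split/stitch is lossless here
w_hat = stitch(bit_split(q)) * scale             # dequantized approximation of w

The greedy split above reconstructs any integer in [-(2^(num_bits-1) - 1), 2^(num_bits-1) - 1] exactly, which the assert verifies; in the paper, lower-bit accuracy comes from optimizing each ternary bit tensor rather than from this naive rounding.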

About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics, and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest-growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.
