Towards Accurate Post-training Network Quantization via Bit-Split and Stitching

July 12, 2020

Speakers

About the presentation

Network quantization is essential for deploying deep models to IoT devices due to its high efficiency, whether on specialized hardware such as TPUs or general-purpose hardware such as CPUs and GPUs. Most existing quantization approaches rely on retraining to retain accuracy, a scheme referred to as quantization-aware training. However, this scheme assumes access to the training data, which is not always available. Moreover, retraining is a tedious and time-consuming procedure, which hinders the application of quantization to a wider range of tasks. Post-training quantization, on the other hand, does not have these problems. However, it has only been shown effective for 8-bit quantization, owing to its simple optimization strategy. In this paper, we propose a Bit-Split and Stitching framework for lower-bit post-training quantization with minimal accuracy degradation. The proposed framework is validated on a variety of computer vision tasks, including image classification, object detection, and instance segmentation, with various network architectures.
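As context for why 8-bit post-training quantization works well while lower bit-widths are hard, the following is a minimal sketch of a standard symmetric per-tensor quantizer of the kind typically used as the post-training baseline. This is a hypothetical illustration for background only, not the paper's Bit-Split and Stitching algorithm; the function names and the simple max-based scale are assumptions.

```python
def quantize(weights, num_bits=8):
    """Symmetric per-tensor quantization (hypothetical baseline sketch).

    Maps float weights to signed integers in [-(2^(b-1)-1), 2^(b-1)-1],
    using the max absolute value to pick the scale.
    """
    qmax = 2 ** (num_bits - 1) - 1          # 127 for 8 bits, 3 for 3 bits
    scale = max(abs(w) for w in weights) / qmax
    # Round to the nearest integer level and clamp to the representable range.
    return [max(-qmax, min(qmax, round(w / scale))) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from integer codes."""
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.03, 1.0]
q, s = quantize(weights)          # integer codes plus a shared float scale
approx = dequantize(q, s)         # each entry is within one step of the input
```

At 8 bits the rounding error (at most half a quantization step) is negligible, but with fewer bits the steps grow rapidly, which is why lower-bit settings need a more careful optimization such as the one this talk proposes.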

Organizer

Category

About the organizer (ICML 2020)

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.
