Feb 19, 2021
Semi-supervised learning, i.e., training networks with both labeled and unlabeled data, has made significant progress recently. However, existing works have primarily focused on image classification and neglected object detection, which requires more annotation effort. In this work, we revisit Semi-Supervised Object Detection (SS-OD) and identify the pseudo-labeling bias issue in SS-OD. To address this, we introduce Unbiased Teacher, a simple yet effective approach that jointly trains a student and a gradually progressing teacher in a mutually beneficial manner. Together with a class-balance loss that downweights overly confident pseudo-labels, Unbiased Teacher consistently improves on state-of-the-art methods by significant margins on the COCO-standard, COCO-additional, and VOC benchmarks. Specifically, Unbiased Teacher achieves a 6.8 absolute mAP improvement over the state-of-the-art method when using 1% of labeled data on MS-COCO, and around 10 mAP improvements over the supervised baseline when using only 0.5%, 1%, and 2% of labeled data on MS-COCO.
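The core ingredients described above — a teacher that progresses gradually alongside the student, pseudo-labels filtered before supervising the student, and a loss that downweights overly confident predictions — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the exponential-moving-average update rule, the confidence threshold, and the focal-style weighting (with their parameter values) are assumptions consistent with the abstract's general description.

```python
def ema_update(teacher_params, student_params, alpha=0.9996):
    """Gradually progress the teacher via an exponential moving average
    of the student's weights. (EMA and the rate `alpha` are assumptions,
    not stated in the abstract.)"""
    return [alpha * t + (1.0 - alpha) * s
            for t, s in zip(teacher_params, student_params)]

def filter_pseudo_labels(detections, threshold=0.7):
    """Keep only teacher detections confident enough to serve as
    pseudo-labels. Each detection is a (box, class_id, score) tuple;
    the threshold value is illustrative."""
    return [d for d in detections if d[2] >= threshold]

def focal_weight(p, gamma=2.0):
    """Focal-style downweighting: weight = (1 - p)^gamma, so overly
    confident predictions contribute less to the class-balance loss."""
    return (1.0 - p) ** gamma
```

In a training loop, the teacher would generate detections on unlabeled images, `filter_pseudo_labels` would select targets for the student, the student's loss on those targets would be reweighted via `focal_weight`, and `ema_update` would then refresh the teacher from the student — the mutually beneficial cycle the abstract describes.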
Written by
Yen-Cheng Liu
Chih-Yao Ma
Zijian He
Chia-Wen Kuo
Kan Chen
Peizhao Zhang
Bichen Wu
Zsolt Kira
Peter Vajda
Publisher
ICLR 2021