Jul 12, 2020
Research using margin-based comparison losses has demonstrated the effectiveness of penalizing the distance between face features and their corresponding class centers. Despite their popularity and excellent performance, these losses do not explicitly encourage generic embedding learning for the open-set recognition problem. In this paper, we analyse margin-based softmax losses from a probability view and propose two general principles for designing new margin-based loss functions: 1) monotonic decreasing and 2) margin probability penalty. Unlike methods optimized with a single comparison metric, we provide a new perspective that treats open-set face recognition as a problem of information transmission. We find that the face embedding gains more generalization capability when the learning process directly increases the amount of clean information passed to it. An auto-encoder architecture called Linear-Auto-TS-Encoder (LATSE) is proposed to corroborate this finding. Extensive experiments on several benchmarks demonstrate that face embeddings learned with LATSE gain more generalization capability, boosting single-model performance with an open training dataset to more than 99
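To make the "margin probability penalty" idea concrete, here is a minimal sketch of a generic margin-based softmax of the kind the abstract analyses (an additive angular margin in the style of ArcFace). This is an illustration of the family of losses being discussed, not the paper's LATSE method; the function name, scale `s`, and margin `m` are illustrative choices. Adding the margin to the target-class angle lowers the target logit, which monotonically decreases the target's softmax probability.

```python
import numpy as np

def margin_softmax_probs(embeddings, class_centers, labels, s=64.0, m=0.5):
    """Illustrative additive-angular-margin softmax (ArcFace-style).

    embeddings:    (N, D) L2-normalized face features
    class_centers: (C, D) L2-normalized class weight vectors
    labels:        (N,)   ground-truth class indices
    s: feature scale; m: angular margin added to the target class
    """
    # Cosine similarity between each feature and each class center.
    cos = np.clip(embeddings @ class_centers.T, -1.0, 1.0)  # (N, C)
    theta = np.arccos(cos)
    # Penalize only the ground-truth class by widening its angle.
    theta[np.arange(len(labels)), labels] += m
    logits = s * np.cos(theta)
    # Numerically stable softmax: with m > 0, the target probability
    # is pushed down, i.e. the "margin probability penalty".
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)
```

At training time, the loss would be the negative log of the target-class entry of these probabilities; comparing the output with `m=0.5` against `m=0.0` shows the margin strictly reducing the target probability, which is what forces tighter clusters around class centers.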
The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.