Jul 12, 2020
In this paper, we propose a new loss function for linear autoencoders (LAEs) and analytically identify the structure of its loss surface. Optimizing the conventional Mean Square Error (MSE) loss yields a decoder matrix that spans the principal subspace of the sample covariance of the data, but fails to identify the exact eigenvectors. This shortcoming originates from an invariance of the loss under invertible transformations of the latent representation, which cancel out in the global map. Here, we prove that our loss function eliminates this issue: the decoder converges to the exact ordered unnormalized eigenvectors of the sample covariance matrix. For this new loss, we characterize the full structure of the loss landscape: we establish an analytical expression for the set of all critical points, show that it is a subset of the critical points of MSE, and show that all local minima are still global. However, the invariant global minima under MSE become saddle points under the new loss. Moreover, we show that the order of computational complexity of the loss and its gradients is the same as for MSE; hence, the new loss is not only of theoretical importance but also of practical value, e.g., for low-rank approximation.
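The invariance mentioned in the abstract can be illustrated with a minimal sketch (the matrix names `A`, `B`, `G` below are illustrative, not taken from the paper): for any invertible k×k matrix G, replacing the encoder A by GA and the decoder B by BG⁻¹ leaves the global map BA, and therefore the MSE reconstruction loss, unchanged, so MSE alone cannot pin down the decoder's columns.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 5, 2                      # ambient and latent dimensions
A = rng.standard_normal((k, n))  # encoder (illustrative)
B = rng.standard_normal((n, k))  # decoder (illustrative)
G = rng.standard_normal((k, k))  # an arbitrary invertible k x k matrix

# Transform the pair: (A, B) -> (G A, B G^{-1}).
A2 = G @ A
B2 = B @ np.linalg.inv(G)

# The global map is unchanged, so the MSE loss cannot distinguish
# (A, B) from (A2, B2) -- the individual factors are not identified.
print(np.allclose(B @ A, B2 @ A2))  # True
```

This is why the MSE-optimal decoder only recovers the principal subspace: any invertible mixing of the latent coordinates gives another global minimum with the same reconstruction.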
The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics, and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest-growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.