Part of Advances in Neural Information Processing Systems 32 (NeurIPS 2019)
Daniel Brooks, Olivier Schwander, Frederic Barbaresco, Jean-Yves Schneider, Matthieu Cord
Covariance matrices have attracted attention for machine learning applications due to their capacity to capture interesting structure in the data. The main challenge is that one needs to take into account the particular geometry of the Riemannian manifold of symmetric positive definite (SPD) matrices they belong to. In the context of deep networks, several architectures for these matrices have recently been proposed. In our article, we introduce a Riemannian batch normalization (batchnorm) algorithm, which generalizes the one used in Euclidean nets. This novel layer makes use of geometric operations on the manifold, notably the Riemannian barycenter, parallel transport and non-linear structured matrix transformations. We derive a new manifold-constrained gradient descent algorithm working in the space of SPD matrices, allowing the batchnorm layer to be learned. We validate our proposed approach with experiments in three different contexts on diverse data types: a drone recognition dataset from radar observations, and emotion and action recognition datasets from video and motion capture data. Experiments show that the Riemannian batchnorm systematically gives better classification performance compared with leading methods, and a remarkable robustness to lack of data.
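To make the geometric operations named in the abstract concrete, below is a minimal NumPy sketch of a batchnorm-style layer for SPD matrices, assuming the affine-invariant geometry and a fixed-point Karcher-flow barycenter. The helper names (`_spd_map`, `karcher_mean`, `spd_batchnorm`), the iteration count, and the congruence-based centering step are illustrative assumptions, not the paper's implementation; in particular, how the learned bias is parameterized and trained with manifold-constrained gradient descent is not shown here.

```python
import numpy as np

def _spd_map(M, f):
    # Apply a scalar function f to the eigenvalues of the symmetric
    # matrix M (matrix sqrt, log, exp, ... for SPD inputs).
    w, V = np.linalg.eigh(M)
    return (V * f(w)) @ V.T

def karcher_mean(batch, n_iter=10):
    """Riemannian barycenter (Karcher mean) of a batch of SPD matrices
    under the affine-invariant metric, via a fixed-point iteration."""
    G = batch.mean(axis=0)  # Euclidean mean as a starting point
    for _ in range(n_iter):
        G_sqrt = _spd_map(G, np.sqrt)
        G_isqrt = _spd_map(G, lambda w: 1.0 / np.sqrt(w))
        # Log-map each sample to the tangent space at G, average there,
        # then exp-map the mean tangent vector back onto the manifold.
        logs = np.stack([_spd_map(G_isqrt @ X @ G_isqrt, np.log) for X in batch])
        G = G_sqrt @ _spd_map(logs.mean(axis=0), np.exp) @ G_sqrt
    return G

def spd_batchnorm(batch, bias):
    """Sketch of a Riemannian batchnorm step: center the batch at the
    identity by congruence with G^{-1/2} (transporting each sample from
    the barycenter G to I), then re-bias toward the SPD parameter `bias`."""
    G = karcher_mean(batch)
    G_isqrt = _spd_map(G, lambda w: 1.0 / np.sqrt(w))
    B_sqrt = _spd_map(bias, np.sqrt)
    return np.stack([B_sqrt @ (G_isqrt @ X @ G_isqrt) @ B_sqrt for X in batch])
```

A small usage example, with `bias` set to the identity so the layer performs pure centering (the output batch then has its Karcher mean approximately at the identity):

```python
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4, 4))
batch = A @ A.transpose(0, 2, 1) + 0.1 * np.eye(4)  # 8 random SPD matrices
out = spd_batchnorm(batch, bias=np.eye(4))
```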