Abstract
This paper introduces the unsupervised assignment flow, which couples the assignment flow for supervised image labeling (Åström et al. in J Math Imaging Vis 58(2):211–238, 2017) with Riemannian gradient flows for label evolution on feature manifolds. The latter component of the approach extends state-of-the-art clustering approaches to manifold-valued data. Coupling label evolution with the spatially regularized assignment flow induces a sparsifying effect that enables learning compact label dictionaries in an unsupervised manner. Our approach relaxes the requirement of supervised labeling to have proper labels at hand, because an initial set of labels can evolve and adapt to better values while being assigned to given data. The separation between feature and assignment manifolds makes the approach flexibly applicable, which is demonstrated for three scenarios with manifold-valued features. Experiments demonstrate a beneficial effect in both directions: adaptivity of labels improves image labeling, and steering label evolution by spatially regularized assignments leads to proper labels, because the assignment flow for supervised labeling is used for label learning exactly, without any approximation.
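The coupling described in the abstract can be illustrated by a toy Euclidean analogue (a sketch under our own assumptions: the function name, the 3×3 neighborhood averaging, and the softmax data term are illustrative choices, not the paper's Riemannian scheme): soft assignments are computed from feature–label distances, smoothed spatially, and labels then evolve toward the assignment-weighted means.

```python
import numpy as np

def coupled_labeling(f, n_labels=3, steps=50, rho=1.0, seed=0):
    """Toy Euclidean analogue of coupling label evolution with
    spatially regularized assignment (illustrative sketch only).

    f : (H, W, d) array of features on an image grid.
    Returns soft assignments (H, W, n_labels) and labels (n_labels, d).
    """
    rng = np.random.default_rng(seed)
    H, W, d = f.shape
    X = f.reshape(-1, d)
    # initialize labels by sampling features
    labels = X[rng.choice(len(X), n_labels, replace=False)].copy()
    for _ in range(steps):
        # data term: softmax over negative squared feature-label distances
        D = ((X[:, None, :] - labels[None]) ** 2).sum(-1)
        S = np.exp(-rho * D)
        S /= S.sum(1, keepdims=True)
        A = S.reshape(H, W, n_labels)
        # spatial regularization: average assignments over 3x3 neighborhoods
        P = np.pad(A, ((1, 1), (1, 1), (0, 0)), mode="edge")
        A = sum(P[i:i + H, j:j + W] for i in range(3) for j in range(3)) / 9.0
        A /= A.sum(-1, keepdims=True)
        # label evolution: move each label to its assignment-weighted mean
        Wf = A.reshape(-1, n_labels)
        labels = (Wf.T @ X) / Wf.sum(0)[:, None]
    return A, labels
```

In this simplified picture, the weighted-mean update plays the role of the Riemannian gradient flow on the feature manifold, and the neighborhood averaging stands in for geometric spatial regularization on the assignment manifold.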
Notes
The maps \(\widehat{g}\) and \(\widehat{g}^{-1}\) are sometimes denoted with \(\flat \) and \(\sharp \) in the literature (‘musical isomorphism’). We stick to the notation from [27] here.
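For completeness, a standard coordinate-free statement of these metric isomorphisms (textbook material, cf. [31, 35]; only the notation \(\widehat{g}\) follows the text):

```latex
% Musical isomorphisms induced by the metric g at a point p:
% \widehat{g} = \flat lowers an index, \widehat{g}^{-1} = \sharp raises one.
\begin{align*}
  \widehat{g}\colon T_{p}\mathcal{M} &\to T_{p}^{*}\mathcal{M}, &
  \widehat{g}(v) &= g_{p}(v,\cdot) = v^{\flat},\\
  \widehat{g}^{-1}\colon T_{p}^{*}\mathcal{M} &\to T_{p}\mathcal{M}, &
  g_{p}\bigl(\widehat{g}^{-1}(\omega), u\bigr) &= \omega(u)
  \quad \forall\, u \in T_{p}\mathcal{M}.
\end{align*}
```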
The symbol ‘k’ is commonly used in the literature as index set for prototypes. In this paper, however, we prefer the more specific symbol J for this purpose and use k (like i, j, etc.) as a free index.
Note that the symbol \(\exp _{p}\) does not contain the subscript \(\mathcal {S}\) in order to distinguish it from the definition (2.5) for general manifolds \(\mathcal {M}\).
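As a point of comparison (our reconstruction from the assignment-flow literature, e.g. [11]; the paper's definition (2.5) is not reproduced here and conventions may differ), the simplex map \(\exp_{p}\) is commonly written in the closed form

```latex
% Lifting map onto the relatively open probability simplex
% (componentwise multiplication and exponential). This closed form is
% a reconstruction from the assignment-flow literature and may deviate
% from the paper's conventions.
\exp_{p}(u) = \frac{p\, e^{u}}{\langle p, e^{u} \rangle},
\qquad p \in \mathcal{S},
```

which, unlike the Riemannian exponential map of a general manifold \(\mathcal{M}\), is defined globally on the tangent space.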
References
Amari, S.I., Cichocki, A.: Information geometry of divergence functions. Bull. Pol. Acad. Sci. Tech. 58(1), 183–195 (2010)
Amari, S.I., Nagaoka, H.: Methods of Information Geometry. Amer. Math. Soc. and Oxford Univ. Press, Providence (2000)
Åström, F., Petra, S., Schmitzer, B., Schnörr, C.: Image labeling by assignment. J. Math. Imaging Vis. 58(2), 211–238 (2017)
Ay, N., Jost, J., Lê, H.V., Schwachhöfer, L.: Information Geometry. Springer (2017)
Banerjee, A., Merugu, S., Dhillon, I.S., Ghosh, J.: Clustering with Bregman divergences. J. Mach. Learn. Res. 6, 1705–1749 (2005)
Barndorff-Nielsen, O.E.: Information and Exponential Families in Statistical Theory. Wiley, Chichester (1978)
Basseville, M.: Divergence measures for statistical data processing—an annotated bibliography. Signal Proc. 93(4), 621–633 (2013)
Batagelj, V.: Generalized Ward and related clustering problems. In: Classification and Related Methods of Data Analysis, pp. 67–74 (1988)
Bauschke, H.H., Borwein, J.M.: Legendre functions and the method of random Bregman projections. J. Convex Anal. 4(1), 27–67 (1997)
Bergmann, R., Fitschen, J.H., Persch, J., Steidl, G.: Iterative multiplicative filters for data labeling. Int. J. Comput. Vis. 123(3), 435–453 (2017)
Bhatia, R.: Positive Definite Matrices. Princeton Univ. Press, Princeton (2006)
Bonnans, J.F., Shapiro, A.: Perturbation Analysis of Optimization Problems. Springer, New York (2013)
Censor, Y.A., Zenios, S.A.: Parallel Optimization: Theory, Algorithms, and Applications. Oxford Univ. Press, New York (1997)
Cherian, A., Sra, S.: Positive definite matrices: data representation and applications to computer vision. In: Minh, H., Murino, V. (eds.) Algorithmic Advances in Riemannian Geometry and Applications, pp. 93–114. Springer, Cham (2016)
Cherian, A., Sra, S., Banerjee, A., Papanikolopoulos, N.: Jensen-Bregman LogDet Divergence with Application to Efficient Similarity Search for Covariance Matrices. IEEE PAMI 35(9), 2161–2174 (2013)
Comaniciu, D., Meer, P.: Mean shift: a robust approach toward feature space analysis. IEEE Trans. Pattern Anal. Mach. Intell. 24(5), 603–619 (2002)
Fukunaga, K., Hostetler, L.: The estimation of the gradient of a density function, with applications in pattern recognition. IEEE Trans. Inform. Theory 21(1), 32–40 (1975)
Har-Peled, S.: Geometric Approximation Algorithms. AMS, Providence (2011)
Harandi, M., Hartley, R., Lovell, B., Sanderson, C.: Sparse coding on symmetric positive definite manifolds using Bregman divergences. IEEE Trans. Neural Netw. Learn. Syst. 27(6), 1294–1306 (2016)
Higham, N.: Functions of Matrices: Theory and Computation. SIAM, Philadelphia (2008)
Hofmann, T., Schölkopf, B., Smola, A.J.: Kernel methods in machine learning. Ann. Stat. 36(3), 1171–1220 (2008)
Hühnerbein, R., Savarino, F., Petra, S., Schnörr, C.: Learning adaptive regularization for image labeling using geometric assignment. In: Proc. SSVM. Springer (2019)
Jost, J.: Riemannian Geometry and Geometric Analysis, 7th edn. Springer-Verlag, Berlin Heidelberg (2017)
Kappes, J., Andres, B., Hamprecht, F., Schnörr, C., Nowozin, S., Batra, D., Kim, S., Kausler, B., Kröger, T., Lellmann, J., Komodakis, N., Savchynskyy, B., Rother, C.: A comparative study of modern inference techniques for structured discrete energy minimization problems. Int. J. Comput. Vis. 115(2), 155–184 (2015)
Karcher, H.: Riemannian center of mass and mollifier smoothing. Commun. Pure Appl. Math. 30, 509–541 (1977)
Kleefeld, A., Meyer-Baese, A., Burgeth, B.: Elementary morphology for SO(2)-and SO(3)-orientation fields. In: International Symposium on Mathematical Morphology and Its Applications to Signal and Image Processing, pp. 458–469. Springer (2015)
Lee, J.M.: Introduction to Smooth Manifolds. Springer, New York (2013)
McLachlan, G., Peel, D.: Finite Mixture Models. Wiley, New York (2000)
Müllner, D.: Modern Hierarchical, Agglomerative Clustering Algorithms. arXiv preprint arXiv:1109.2378 (2011)
Rockafellar, R.T., Wets, R.J.B.: Variational Analysis, 3rd edn. Springer, New York (2009)
Schnörr, C.: Assignment flows. In: Grohs, P., Holler, M., Weinmann, A. (eds.) Variational Methods for Nonlinear Geometric Data and Applications. Springer (in press) (2019)
Sra, S.: Positive Definite Matrices and the Symmetric Stein Divergence. CoRR arXiv:1110.1773 (2013)
Subbarao, R., Meer, P.: Nonlinear mean shift over Riemannian manifolds. Int. J. Comput. Vis. 84(1), 1–20 (2009)
Teboulle, M.: A unified continuous optimization framework for center-based clustering methods. J. Mach. Learn. Res. 8, 65–102 (2007)
Turaga, P., Srivastava, A. (eds.): Riemannian Computing in Computer Vision. Springer, New York (2016)
Tuzel, O., Porikli, F., Meer, P.: Region Covariance: A Fast Descriptor for Detection and Classification. In: Proc. ECCV, pp. 589–600. Springer (2006)
Zeilmann, A., Savarino, F., Petra, S., Schnörr, C.: Geometric Numerical Integration of the Assignment Flow. Inverse Probl. (2019). https://doi.org/10.1088/1361-6420/ab2772
Zern, A., Zisler, M., Åström, F., Petra, S., Schnörr, C.: Unsupervised Label Learning on Manifolds by Spatially Regularized Geometric Assignment. In: Proc. GCPR (2018)
This work is supported by Deutsche Forschungsgemeinschaft (DFG) under Germany's Excellence Strategy EXC-2181/1 - 390900948 (the Heidelberg STRUCTURES Excellence Cluster) and by the Research Training Group funded by the DFG, Grant GRK 1653.
Cite this article
Zern, A., Zisler, M., Petra, S. et al. Unsupervised Assignment Flow: Label Learning on Feature Manifolds by Spatially Regularized Geometric Assignment. J Math Imaging Vis 62, 982–1006 (2020). https://doi.org/10.1007/s10851-019-00935-7