
Unsupervised Assignment Flow: Label Learning on Feature Manifolds by Spatially Regularized Geometric Assignment

Journal of Mathematical Imaging and Vision

Abstract

This paper introduces the unsupervised assignment flow, which couples the assignment flow for supervised image labeling (Åström et al. in J Math Imaging Vis 58(2):211–238, 2017) with Riemannian gradient flows for label evolution on feature manifolds. The latter component extends state-of-the-art clustering approaches to manifold-valued data. Coupling label evolution with the spatially regularized assignment flow induces a sparsifying effect that enables learning compact label dictionaries in an unsupervised manner. Our approach removes the requirement of supervised labeling to have proper labels at hand, because an initial set of labels can evolve and adapt to better values while being assigned to given data. The separation between feature and assignment manifolds enables flexible application, which we demonstrate for three scenarios with manifold-valued features. Experiments show a beneficial effect in both directions: adaptivity of labels improves image labeling, and steering label evolution by spatially regularized assignments yields proper labels, because the assignment flow for supervised labeling is used exactly, without any approximation, for label learning.
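As a rough illustration of the supervised component being coupled here (a minimal sketch, not the paper's implementation): the assignment flow evolves, for each pixel, an assignment vector on the probability simplex via replicator dynamics \(\dot{W} = R_W(S(W))\), where \(R_p\) is the replicator operator and \(S\) a spatially regularized similarity map. In the sketch below, the similarity matrix `S` is supplied externally as a plain array — an assumption made for brevity:

```python
import numpy as np

def replicator(p, v):
    # Replicator operator R_p(v) = p * v - <p, v> * p: maps v into the
    # tangent space of the probability simplex at p (components sum to 0).
    return p * v - np.dot(p, v) * p

def euler_step(W, S, h=0.1):
    # One explicit Euler step of dW/dt = R_W(S(W)), row per pixel.
    # Since the replicator output sums to zero, each row of W keeps
    # summing to one (and stays positive for small enough step size h).
    return np.array([w + h * replicator(w, s) for w, s in zip(W, S)])
```

Because the tangent update sums to zero componentwise, the iterate remains on the simplex without any explicit projection, which is the appeal of this parametrization.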


Notes

  1. The maps \(\widehat{g}\) and \(\widehat{g}^{-1}\) are sometimes denoted with \(\flat \) and \(\sharp \) in the literature (‘musical isomorphism’). We stick to the notation from [27] here.

  2. The symbol ‘k’ is commonly used in the literature. In this paper, however, we prefer the more specific symbol J as the index set for prototypes and use k (like i, j, etc.) as a free index.

  3. Note that the symbol \(\exp _{p}\) does not contain the subscript \(\mathcal {S}\) in order to distinguish it from the definition (2.5) for general manifolds \(\mathcal {M}\).
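To make note 3 concrete (a hedged sketch: the particular formula below is a common convention in the assignment-flow literature and is an assumption here, not copied from the paper), a lifting map of the form \(\exp_{p}(v) = \frac{p \cdot e^{v/p}}{\langle p, e^{v/p}\rangle}\) (componentwise operations) sends tangent-like vectors back onto the relative interior of the probability simplex \(\mathcal{S}\):

```python
import numpy as np

def lift(p, v):
    # Simplex lifting map exp_p(v) = (p * e^{v/p}) / <p, e^{v/p}>.
    # p must lie in the open simplex (all components > 0, summing to 1).
    # The result is again a strictly positive vector summing to one,
    # and lift(p, 0) recovers p itself.
    q = p * np.exp(v / p)
    return q / q.sum()
```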

References

  1. Amari, S.I., Cichocki, A.: Information geometry of divergence functions. Bull. Pol. Acad. Sci. Tech. 58(1), 183–195 (2010)

  2. Amari, S.I., Nagaoka, H.: Methods of Information Geometry. Amer. Math. Soc. and Oxford Univ. Press, Providence (2000)

  3. Åström, F., Petra, S., Schmitzer, B., Schnörr, C.: Image labeling by assignment. J. Math. Imaging Vis. 58(2), 211–238 (2017)

  4. Ay, N., Jost, J., Lê, H.V., Schwachhöfer, L.: Information Geometry. Springer (2017)

  5. Banerjee, A., Merugu, S., Dhillon, I.S., Ghosh, J.: Clustering with Bregman divergences. J. Mach. Learn. Res. 6, 1705–1749 (2005)

  6. Barndorff-Nielsen, O.E.: Information and Exponential Families in Statistical Theory. Wiley, Chichester (1978)

  7. Basseville, M.: Divergence measures for statistical data processing—an annotated bibliography. Signal Proc. 93(4), 621–633 (2013)

  8. Batagelj, V.: Generalized Ward and related clustering problems. In: Classification and Related Methods of Data Analysis, pp. 67–74 (1988)

  9. Bauschke, H.H., Borwein, J.M.: Legendre functions and the method of random Bregman projections. J. Convex Anal. 4(1), 27–67 (1997)

  10. Bergmann, R., Fitschen, J.H., Persch, J., Steidl, G.: Iterative multiplicative filters for data labeling. Int. J. Comput. Vis. 123(3), 435–453 (2017)

  11. Bhatia, R.: Positive Definite Matrices. Princeton Univ. Press, Princeton (2006)

  12. Bonnans, J.F., Shapiro, A.: Perturbation Analysis of Optimization Problems. Springer, New York (2013)

  13. Censor, Y.A., Zenios, S.A.: Parallel Optimization: Theory, Algorithms, and Applications. Oxford Univ. Press, New York (1997)

  14. Cherian, A., Sra, S.: Positive definite matrices: data representation and applications to computer vision. In: Minh, H., Murino, V. (eds.) Algorithmic Advances in Riemannian Geometry and Applications, pp. 93–114. Springer, Cham (2016)

  15. Cherian, A., Sra, S., Banerjee, A., Papanikolopoulos, N.: Jensen-Bregman LogDet divergence with application to efficient similarity search for covariance matrices. IEEE Trans. Pattern Anal. Mach. Intell. 35(9), 2161–2174 (2013)

  16. Comaniciu, D., Meer, P.: Mean shift: a robust approach toward feature space analysis. IEEE Trans. Pattern Anal. Mach. Intell. 24(5), 603–619 (2002)

  17. Fukunaga, K., Hostetler, L.: The estimation of the gradient of a density function, with applications in pattern recognition. IEEE Trans. Inform. Theory 21(1), 32–40 (1975)

  18. Har-Peled, S.: Geometric Approximation Algorithms. AMS, Providence (2011)

  19. Harandi, M., Hartley, R., Lovell, B., Sanderson, C.: Sparse coding on symmetric positive definite manifolds using Bregman divergences. IEEE Trans. Neural Netw. Learn. Syst. 27(6), 1294–1306 (2016)

  20. Higham, N.: Functions of Matrices: Theory and Computation. SIAM, Philadelphia (2008)

  21. Hofmann, T., Schölkopf, B., Smola, A.J.: Kernel methods in machine learning. Ann. Stat. 36(3), 1171–1220 (2008)

  22. Hühnerbein, R., Savarino, F., Petra, S., Schnörr, C.: Learning adaptive regularization for image labeling using geometric assignment. In: Proc. SSVM. Springer (2019)

  23. Jost, J.: Riemannian Geometry and Geometric Analysis, 7th edn. Springer-Verlag, Berlin Heidelberg (2017)

  24. Kappes, J., Andres, B., Hamprecht, F., Schnörr, C., Nowozin, S., Batra, D., Kim, S., Kausler, B., Kröger, T., Lellmann, J., Komodakis, N., Savchynskyy, B., Rother, C.: A comparative study of modern inference techniques for structured discrete energy minimization problems. Int. J. Comput. Vis. 115(2), 155–184 (2015)

  25. Karcher, H.: Riemannian center of mass and mollifier smoothing. Commun. Pure Appl. Math. 30, 509–541 (1977)

  26. Kleefeld, A., Meyer-Baese, A., Burgeth, B.: Elementary morphology for SO(2)- and SO(3)-orientation fields. In: International Symposium on Mathematical Morphology and Its Applications to Signal and Image Processing, pp. 458–469. Springer (2015)

  27. Lee, J.M.: Introduction to Smooth Manifolds. Springer, New York (2013)

  28. McLachlan, G., Peel, D.: Finite Mixture Models. Wiley, New York (2000)

  29. Müllner, D.: Modern hierarchical, agglomerative clustering algorithms. arXiv preprint arXiv:1109.2378 (2011)

  30. Rockafellar, R.T., Wets, R.J.B.: Variational Analysis, 3rd edn. Springer, New York (2009)

  31. Schnörr, C.: Assignment flows. In: Grohs, P., Holler, M., Weinmann, A. (eds.) Variational Methods for Nonlinear Geometric Data and Applications. Springer (in press) (2019)

  32. Sra, S.: Positive definite matrices and the symmetric Stein divergence. CoRR arXiv:1110.1773 (2013)

  33. Subbarao, R., Meer, P.: Nonlinear mean shift over Riemannian manifolds. Int. J. Comput. Vis. 84(1), 1–20 (2009)

  34. Teboulle, M.: A unified continuous optimization framework for center-based clustering methods. J. Mach. Learn. Res. 8, 65–102 (2007)

  35. Turaga, P., Srivastava, A. (eds.): Riemannian Computing in Computer Vision. Springer, New York (2016)

  36. Tuzel, O., Porikli, F., Meer, P.: Region covariance: a fast descriptor for detection and classification. In: Proc. ECCV, pp. 589–600. Springer (2006)

  37. Zeilmann, A., Savarino, F., Petra, S., Schnörr, C.: Geometric numerical integration of the assignment flow. Inverse Probl. (2019). https://doi.org/10.1088/1361-6420/ab2772

  38. Zern, A., Zisler, M., Åström, F., Petra, S., Schnörr, C.: Unsupervised label learning on manifolds by spatially regularized geometric assignment. In: Proc. GCPR (2018)


Author information

Correspondence to Artjom Zern.


This work is supported by the Deutsche Forschungsgemeinschaft (DFG) under Germany's Excellence Strategy EXC-2181/1 - 390900948 (the Heidelberg STRUCTURES Excellence Cluster) and by the Research Training Group GRK 1653, funded by the DFG.


Cite this article

Zern, A., Zisler, M., Petra, S. et al. Unsupervised Assignment Flow: Label Learning on Feature Manifolds by Spatially Regularized Geometric Assignment. J Math Imaging Vis 62, 982–1006 (2020). https://doi.org/10.1007/s10851-019-00935-7
