
Abstract

Visual motion processing can be conceptually divided into two levels. At the lower level, local motion signals are detected by spatiotemporal-frequency-selective sensors and then integrated into a motion vector flow. Although the model based on V1-MT physiology provides a good computational framework for this level of processing, it needs to be updated to fully explain psychophysical findings about motion perception, including complex motion signal interactions in the spatiotemporal-frequency and space domains. At the higher level, the velocity map is interpreted. Although there are many motion interpretation processes, we highlight recent progress in research on the perception of material (e.g., specular reflection, liquid viscosity) and on animacy perception. We then consider possible linking mechanisms of the two levels and propose intrinsic flow decomposition as the key problem. To provide insights into the computational mechanisms of motion perception, in addition to psychophysics and neuroscience, we review machine vision studies seeking to solve similar problems.
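To make the lower-level computation concrete, the following is a minimal sketch (in Python with NumPy, not from the article) of a spatiotemporal energy motion sensor in the spirit of the Adelson and Bergen (1985) model: quadrature pairs of spatial and temporal filters are combined into direction-selective spatiotemporal filters, and squaring and summing each pair yields a phase-invariant, opponent motion-energy signal. Function names, filter parameters, and the toy stimulus are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a spatiotemporal (motion) energy sensor, assuming only NumPy.
# Opposite drift directions of an (x, t) pattern should yield opposite signs of
# the opponent motion-energy output.
import numpy as np

def gabor_quadrature(u, sigma, freq):
    """Even/odd (quadrature-pair) Gabor filters along one dimension."""
    env = np.exp(-u**2 / (2 * sigma**2))
    return env * np.cos(2 * np.pi * freq * u), env * np.sin(2 * np.pi * freq * u)

def motion_energy(stimulus_xt, sf=0.1, tf=0.1, sigma_x=8.0, sigma_t=8.0):
    """Opponent motion energy for an (x, t) luminance pattern (rows = space, cols = time).

    Direction-selective filters are built as sums/differences of separable
    space-time quadrature pairs; squaring and summing each oriented pair gives
    phase-invariant energy, and the difference between the two preferred
    directions is the opponent signal.
    """
    nx, nt = stimulus_xt.shape
    x = np.arange(nx) - nx // 2
    t = np.arange(nt) - nt // 2
    sx_even, sx_odd = gabor_quadrature(x, sigma_x, sf)   # spatial filters
    st_even, st_odd = gabor_quadrature(t, sigma_t, tf)   # temporal filters

    def sep(fx, ft):
        # Separable filtering: convolve over space (axis 0), then time (axis 1).
        r = np.apply_along_axis(lambda col: np.convolve(col, fx, mode='same'), 0, stimulus_xt)
        return np.apply_along_axis(lambda row: np.convolve(row, ft, mode='same'), 1, r)

    ee, eo = sep(sx_even, st_even), sep(sx_even, st_odd)
    oe, oo = sep(sx_odd, st_even), sep(sx_odd, st_odd)

    # Oriented (direction-selective) quadrature pairs for the two directions.
    dir1_a, dir1_b = ee + oo, oe - eo
    dir2_a, dir2_b = ee - oo, oe + eo
    energy_dir1 = dir1_a**2 + dir1_b**2
    energy_dir2 = dir2_a**2 + dir2_b**2
    return (energy_dir1 - energy_dir2).sum()   # opponent motion energy

# Toy usage: a drifting grating and its space-reversed (opposite-direction)
# counterpart produce opponent energies of opposite sign.
x = np.arange(64)[:, None]
t = np.arange(64)[None, :]
drifting = np.cos(2 * np.pi * 0.1 * (x - t))
print(motion_energy(drifting), motion_energy(drifting[::-1]))
```

The sketch covers only local motion detection; the integration stage (pooling local signals into a motion vector flow, as in V1-MT models) and the higher-level interpretation discussed in the review are not represented here.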

