Abstract
It is known that eye movements during object imagery reflect areas visited during encoding. But do eye movements also reflect pictorial low-level features of imagined stimuli? We report three experiments investigating whether low-level properties of mental images elicit specific eye movements. Based on the conceptualization of mental images as depictive representations, we expected low-level visual features to influence eye fixations during mental imagery, in the absence of any visual input. In the first experiment, twenty-five participants performed a visual imagery task with high vs. low spatial frequency and high vs. low contrast gratings. Both during visual perception and during mental imagery, first fixations were more often allocated to the low spatial frequency–high contrast grating, showing that eye fixations were influenced not only by the physical properties of visual stimuli but also by their imagined counterparts. In a second experiment, twenty-two participants imagined high contrast and low contrast stimuli that they had not encoded before. Again, participants allocated more fixations to the high contrast mental images than to the low contrast mental images. In a third experiment, we ruled out task difficulty as a confounding variable. Our results reveal that low-level visual features are represented in the mind’s eye and thus contribute to characterizing mental images in terms of how much perceptual information is re-instantiated during mental imagery.
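For illustration, stimuli of the kind described above — sinusoidal luminance gratings crossing spatial frequency with contrast — can be generated in a few lines. This is a minimal sketch, not the authors' actual stimulus code; the image size, cycle counts, and contrast values are hypothetical placeholders.

```python
import numpy as np

def grating(size=256, cycles=4, contrast=1.0):
    """Horizontal sinusoidal luminance grating with values in [0, 1].

    `cycles` sets the spatial frequency (cycles per image);
    `contrast` scales the modulation around mean luminance 0.5,
    so Michelson contrast ~= `contrast`.
    """
    x = np.linspace(0, 2 * np.pi * cycles, size)
    wave = np.sin(x)                       # oscillates in [-1, 1]
    row = 0.5 + 0.5 * contrast * wave      # mean-luminance 0.5 modulation
    return np.tile(row, (size, 1))         # repeat row to form a 2-D image

# Example conditions (hypothetical parameter values):
low_sf_high_c = grating(cycles=2, contrast=0.9)    # low SF, high contrast
high_sf_low_c = grating(cycles=16, contrast=0.1)   # high SF, low contrast
```

Michelson contrast, (Lmax − Lmin) / (Lmax + Lmin), of each image then closely matches the requested `contrast` value, since the sine modulation is symmetric around the mean luminance.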
Data availability statement
The datasets analyzed during the current studies are available from Figshare (https://figshare.com/s/e64c369ef8a5dbb2ed27).
Acknowledgments
We thank Matthias Hartmann, Felix Wichmann, the editor and two anonymous reviewers for useful comments on a previous version of the manuscript.
Funding
The authors received no specific funding for this work.
Ethics declarations
Competing interests
The authors declare no competing interests.
Ethical approval
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. Informed consent was obtained from all individual participants included in the study. The study (including experiments 1 and 2) was approved by the faculty ethics committee (2013-2-333838, "Augenbewegungen und Vorstellungsprozesse", Ethikkommission der philosophisch-humanwissenschaftlichen Fakultät, Universität Bern).
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Martarelli, C.S., Mast, F.W. Pictorial low-level features in mental images: evidence from eye fixations. Psychological Research 86, 350–363 (2022). https://doi.org/10.1007/s00426-021-01497-3