
Pictorial low-level features in mental images: evidence from eye fixations

  • Original Article, published in Psychological Research

Abstract

It is known that eye movements during object imagery reflect areas visited during encoding. But do eye movements also reflect pictorial low-level features of imagined stimuli? In this paper, three experiments are reported in which we investigate whether low-level properties of mental images elicit specific eye movements. Based on the conceptualization of mental images as depictive representations, we expected low-level visual features to influence eye fixations during mental imagery, in the absence of a visual input. In the first experiment, twenty-five participants performed a visual imagery task with high vs. low spatial frequency and high vs. low contrast gratings. We found that both during visual perception and during mental imagery, first fixations were more often allocated to the low spatial frequency–high contrast grating, showing that eye fixations were influenced not only by the physical properties of visual stimuli but also by their imagined counterparts. In the second experiment, twenty-two participants imagined high contrast and low contrast stimuli that they had not encoded before. Again, participants allocated more fixations to the high contrast mental images than to the low contrast mental images. In the third experiment, we ruled out task difficulty as a confounding variable. Our results reveal that low-level visual features are represented in the mind’s eye and thus contribute to the characterization of mental images in terms of how much perceptual information is re-instantiated during mental imagery.



Data availability statement

The datasets analyzed during the current studies are available from Figshare (https://figshare.com/s/e64c369ef8a5dbb2ed27).


Acknowledgments

We thank Matthias Hartmann, Felix Wichmann, the editor and two anonymous reviewers for useful comments on a previous version of the manuscript.

Funding

The authors received no specific funding for this work.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Corinna S. Martarelli.

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. Informed consent was obtained from all individual participants included in the study. The study (including experiments 1 and 2) was approved by the faculty ethics committee (2013-2-333838, "Augenbewegungen und Vorstellungsprozesse", Ethikkommission der philosophisch-humanwissenschaftliche Fakultät, Bern Universität).

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix A

See Fig. 4.

Fig. 4

Mean percentage of time spent on each grating during perception and imagery. Order of fixation is represented on the x-axis (1 = first fixation to 12 = 12th fixation). highSF highCON = high spatial frequency–high contrast, highSF lowCON = high spatial frequency–low contrast, lowSF highCON = low spatial frequency–high contrast, lowSF lowCON = low spatial frequency–low contrast. Visual inspection of the graphs suggests an overall predominance of the low spatial frequency–high contrast grating, both during perception and during imagery

Appendix B

See Table 4.

Table 4 Fixation time and number of fixations for high/low spatial frequency and high/low contrast gratings separated by task (perception and imagery)

Appendix C

See Fig. 5.

Fig. 5

Mean percentage of fixations for high contrast and low contrast visual mental images. Time (100-ms time bins) is represented on the x-axis (6-s stimulus presentation). Visual inspection of the graph suggests an overall predominance of the high contrast images


Cite this article

Martarelli, C.S., Mast, F.W. Pictorial low-level features in mental images: evidence from eye fixations. Psychological Research 86, 350–363 (2022). https://doi.org/10.1007/s00426-021-01497-3
