Publication date: Available online 4 July 2018
Source: Cortex
Author(s): Matthew X. Lowe, Jason Rajsic, Susanne Ferber, Dirk B. Walther
Abstract
Humans have the ability to make sense of the world around them in only a single glance. This astonishing feat requires the visual system to extract information from our environment with remarkable speed. How quickly does this process unfold across time, and what visual information contributes to our understanding of the visual world? We address these questions by directly measuring the temporal dynamics of the perception of colour photographs and line drawings of scenes with electroencephalography (EEG) during a scene-memorization task. Within a fraction of a second, event-related potentials (ERPs) show dissociable response patterns for global scene properties of content (natural versus man-made) and layout (open versus closed). Subsequent detailed analyses of within-category versus between-category discriminations found significant dissociations of basic-level scene categories (e.g., forest; city) within the first 100 ms of perception. The similarity of this neural activity with feature-based discriminations suggests low-level image statistics may be foundational for this rapid categorization. Interestingly, our results also suggest that the structure preserved in line drawings may form a primary and necessary basis for visual processing, whereas surface information may further enhance category selectivity in later-stage processing. Critically, these findings provide evidence that the distinction of both basic-level categories and global properties of scenes from neural signals occurs within 100 ms.
https://ift.tt/2MRNy2h