Abstract
Despite decades of research, it remains controversial whether semantic knowledge is anatomically segregated in the human brain. To address this question, we recorded event-related potentials (ERPs) while participants viewed pictures of animals and tools. Within the 200-600-ms epoch after stimulus presentation, animals (relative to tools) elicited an increased anterior negativity that, based on previous ERP studies, we interpret as reflecting semantic processing of visual object attributes. In contrast, tools (relative to animals) evoked an enhanced posterior left-lateralized negativity that, according to prior research, may reflect access to knowledge of characteristic motion and/or more general functional properties of objects. These results support the hypothesis that semantic knowledge is neuroanatomically organized at the level of object features: the observed neurophysiological activity was modulated by the features most salient for recognizing each object category. The high temporal resolution of ERPs allowed us to demonstrate that differences in the processing of animals and tools occurred specifically within the time window encompassing semantic analysis.