Abstract
We measured Event-Related Potentials (ERPs) and naming times to picture targets preceded by masked prime words (stimulus onset asynchrony: 80 ms) that shared one of three types of relationship with the names of the pictures: (1) Identity related, in which the prime was the name of the picture (e.g., "socks" followed by a picture of socks); (2) Phonemic Onset related, in which the initial segment of the prime was the same as that of the picture's name (e.g., "log" followed by a picture whose name begins with the same segment); and (3) Semantically related, in which the prime was a co-category exemplar of, and associated with, the picture's name (e.g., "cake" followed by a picture from the same semantic category). Each type of related picture target was contrasted with an Unrelated picture target, yielding a 3 × 2 design that crossed Relationship Type between the prime word and the target picture (Identity, Phonemic Onset, Semantic) with Relatedness (Related, Unrelated). Modulation of the N400 component to Related (versus Unrelated) pictures was taken to reflect semantic processing at the interface between the picture's conceptual features and its lemma, whereas naming times reflected the end product of all stages of processing. Both an attenuated N400 and shorter naming times were observed to pictures preceded by Identity related (versus Unrelated) words. No ERP effects within 600 ms, but shorter naming times, were observed to pictures preceded by Phonemic Onset related (versus Unrelated) words. An attenuated N400 (electrophysiological semantic priming) but longer naming times (behavioral semantic interference) were observed to pictures preceded by Semantically related (versus Unrelated) words. These dissociations between ERP modulation and naming times suggest that (a) phonemic onset priming occurred late, during encoding of the articulatory response, and (b) semantic behavioral interference was not driven by competition at the lemma level of representation but rather arose at a later stage of production.