Previous task-switching studies have shown that shifting attention to a relevant stimulus attribute before stimulus onset is limited: several EEG markers of perceptual encoding (including processes previously thought to be automatic, such as visual word recognition) are delayed on task-switch trials. This study investigated whether the same holds for another perceptual-encoding process thought to be automatic: recognising facial emotions. Facial emotion recognition is reported to have a well-established marker in the ERP (event-related potential), the emotion-expression effect (EEE): more positive amplitudes for emotional than for neutral expressions at fronto-central electrode sites, starting around 150 ms after stimulus onset. This study examined whether the EEE would be delayed when switching to the emotion classification task. Participants classified compound stimuli (face stimuli with a letter superimposed) according to the task they were cued to perform on a given trial: emotion classification (emotional or neutral) or letter classification (vowel or consonant). The task repeated on a random two-thirds of trials and switched on the remaining third. We collected behavioural data in the form of reaction times and error rates and concurrently recorded the EEG, from which the ERPs were derived.
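The cueing scheme above implies that, on each trial after the first, the cued task repeats with probability two-thirds and switches with probability one-third. A minimal sketch of such a trial-sequence generator is given below; the function name, task labels, and parameters are illustrative assumptions, not the authors' actual stimulus-presentation code.

```python
import random

def make_trial_sequence(n_trials, p_switch=1/3,
                        tasks=("emotion", "letter"), seed=None):
    """Generate a cued task sequence in which each trial switches task
    with probability p_switch and repeats otherwise.
    Illustrative sketch only; names and defaults are assumptions."""
    rng = random.Random(seed)
    seq = [rng.choice(tasks)]          # first trial: random task
    for _ in range(n_trials - 1):
        if rng.random() < p_switch:
            # switch to the other of the two tasks
            seq.append(tasks[1] if seq[-1] == tasks[0] else tasks[0])
        else:
            # repeat the previous task
            seq.append(seq[-1])
    return seq

seq = make_trial_sequence(300, seed=1)
switches = sum(a != b for a, b in zip(seq, seq[1:]))
print(f"observed switch rate: {switches / (len(seq) - 1):.2f}")
```

With a long enough sequence, the observed switch rate converges on the nominal one-third; in practice, experiment-building software often constrains the exact counts per block rather than sampling each trial independently.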
Data were collected face-to-face in a laboratory and comprise behavioural data (reaction times and error rates; participants responded by pressing instructed keys on a computer keyboard) and EEG (electroencephalogram) recordings, obtained with scalp electrodes, amplified, and stored on a computer. Participants were chosen through convenience sampling: we tested 20 student volunteers from the University of Exeter, all right-handed with normal or corrected-to-normal vision.