Multi-time resolution analysis of speech: Evidence from psychophysics
There is behavioral evidence from psychophysics that phoneme-sized (10–40 Hz) and syllable-sized (2–10 Hz) information are bound to different time scales (Chait, Greenberg, Arai, Simon, and Poeppel, 2015). An interesting question is how these different time scales, which are relevant for speech processing and present in parallel, are reflected in the brain's activity. Giraud and Poeppel (2012), who argue for a principled relation between time scales in speech and frequencies of cortical oscillatory activity, provide an answer to this question. Oscillations of cortical activity come in different frequency bands: delta (1–4 Hz), theta (4–8 Hz), beta (13–30 Hz), and gamma (30–70 Hz). In their study, Giraud and Poeppel (2012) showed that phonemic, syllabic, and phrasal processing, which take place in parallel, are reflected in nested theta-gamma oscillation patterns in the auditory cortex. Thus, the brain's solution to parallel processes may be to nest different frequency bands that reflect different aspects of processing—for example, phoneme processing and syllable processing—into each other.
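The idea of "nesting" one frequency band inside another can be made concrete as phase-amplitude coupling: the phase of a slow (theta) oscillation modulates the amplitude of a fast (gamma) oscillation. The sketch below is only an illustration of that general concept, not the analysis Giraud and Poeppel (2012) actually ran; the specific frequencies, the signal construction, and the mean-vector-length coupling index (in the style of Canolty and colleagues) are assumptions chosen for simplicity.

```python
import numpy as np

# Hypothetical illustration of theta-gamma nesting (not the authors' method):
# a 6 Hz theta oscillation whose phase modulates the amplitude of a
# 40 Hz gamma oscillation -- syllable-rate rhythm "carrying" phoneme-rate detail.
fs = 1000                      # sampling rate in Hz
t = np.arange(0, 2.0, 1 / fs)  # 2 seconds of signal

theta_phase = 2 * np.pi * 6 * t              # instantaneous theta phase (6 Hz)
gamma_amp = 1.0 + 0.8 * np.cos(theta_phase)  # gamma amplitude tied to theta phase
signal = np.cos(theta_phase) + gamma_amp * np.cos(2 * np.pi * 40 * t)

# Mean-vector-length coupling index: project gamma amplitude onto the
# theta phase and take the magnitude of the resultant vector.
coupling = np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase)))

# Uncoupled control: constant gamma amplitude yields a near-zero index.
control = np.abs(np.mean(np.ones_like(t) * np.exp(1j * theta_phase)))
print(round(coupling, 3), round(control, 3))
```

With the amplitude modulation depth of 0.8 used here, the coupled signal yields an index of about 0.4, while the uncoupled control collapses to roughly zero, which is the signature a nested-oscillation analysis looks for.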
The statistical significance filter leads to overconfident expectations of replicability