Increasing Task Difficulty Enhances Effects of Intersensory Redundancy: Testing a New Prediction of the Intersensory Redundancy Hypothesis


Infants can detect information that is common and redundant across the senses. This information includes temporal synchrony (the most fundamental amodal property), rhythm, tempo, and changing intensity, all of which are labeled amodal (meaning not specific to a particular sense modality). Amodal information is considered a “building block” of perceptual development; it facilitates perception of unitary multimodal events and acts as the “gatekeeper” for processing an event as a whole. To explain selective attention, Bahrick and Lickliter proposed the Intersensory Redundancy Hypothesis (IRH), which states that intersensory redundancy is highly salient and directs selective attention to amodal aspects of events that are redundantly specified across senses. The IRH makes three predictions, with a fourth being proposed and tested in this new study.

The fourth prediction of the IRH states that intersensory facilitation occurs in tasks that are relatively difficult for a given perceiver. In early development, infants have limited attentional resources, making perceptual processing difficult. Perceptual learning continues throughout the lifespan, which means intersensory facilitation should also appear in later development whenever perceivers need to learn finer distinctions in stimulation (e.g., learning a new language or a musical instrument). As infants grow older, they gain attentional resources and increased perceptual differentiation. The more difficult a task is, the more differentiation is needed, causing older infants to revert to the patterns of intersensory facilitation seen in younger infants: they are expected to show better detection of amodal properties in bimodal, redundant stimulation than in unimodal, nonredundant stimulation.

Results of the study confirmed this prediction: infants who received tempo contrasts of high difficulty showed intersensory facilitation, whereas infants given tempo contrasts of low and moderate difficulty showed discrimination in both conditions. The findings suggest that task difficulty determines when facilitation appears, supporting the prediction that as tasks become more difficult, older and more experienced infants show the patterns of intersensory facilitation seen in younger infants.


Bahrick, L. E., Lickliter, R., Castellanos, I., & Vaillant-Molina, M. (2010). Increasing task difficulty enhances effects of intersensory redundancy: Testing a new prediction of the intersensory redundancy hypothesis. Developmental Science, 13(5), 731–737. doi: 10.1111/j.1467-7687.2009.00928.x

The Early Catastrophe: The 30 Million Word Gap by Age 3

I read about the 30 Million Word Gap for lab and thought it was incredibly interesting. The fact that a 3-year-old’s vocabulary can predict their language skills at 9 to 10 years old is proof of why we need effective intervention programs.

  • Language-intensive activities in a preschool located in a low-SES area produced a spurt of new words and an abrupt acceleration in children’s cumulative vocabulary growth curves. However, researchers “could not accelerate the rate of vocabulary growth so that it would continue beyond direct teaching.” Like other early intervention programs, the increases in vocabulary were temporary, and when children started kindergarten a year later, the effects of the boost wore off. Disparities among developmental trajectories of vocabulary growth appeared to be driven by SES: professors’ children knew and were exposed to more words than children from lower-SES backgrounds. To me, this suggests that in order for early intervention programs to have a long-term impact, they must be administered throughout the child’s entire educational lifespan.
  • Vocabulary use at age 3 is predictive of language skill at ages 9–10, and these differences in early experience do not wash out the way the effects of preschool intervention do. It seems a family’s talk can predict how their child will do in school by the time the child is two years old. To create an effective intervention program, you would have to give all children the same early experiences. Looking only at the number of words heard per hour, the average child in a welfare family heard 616 words, half the number heard by the average working-class child (1,251) and less than one third of the words heard by the average child of a professional family (2,153). This means that over four years, there is an enormous difference in accumulated experience with words between the groups: the average child in a professional family has experienced about 45 million words, the average working-class child 26 million, and the average welfare child 13 million. This is when we really see how big the problem is and how important early intervention is. In order for a child from a welfare family to essentially “catch up” to children of professional families, a lot of time and effort is required to equalize their experiences. The longer we wait to intervene, the less possible change and equalization become.
  • Also interesting is the significant difference in children’s hourly experience with encouraging words and prohibitions. By age 4, the average child in a welfare family might have heard 144,000 fewer encouragements and 84,000 more discouragements of their behavior than the average child in a working-class family.
  • What is the impact of hearing more discouragements than encouragements?
  • Were children in higher SES families encouraged for behaviors that were discouraged in welfare families?
  • Similar studies with bilingual families?
  • Were any children in the study bilingual? Did the study just count only words in English or did it also include non-English words?
  • Children in daycare vs. not in daycare
    • difference in vocabulary?


Hart, B., & Risley, T. R. (2003). The early catastrophe: The 30 million word gap by age 3. The American Educator, 27(1), 4–9.

Infant Lab Terms from Encyclopedia Chapter

Some of my notes from the Bahrick & Hollich encyclopedia chapter:

Intermodal perception/intersensory perception – the perception of unitary objects or events that make information simultaneously available to more than one sense

Intersensory redundancy: when the same amodal info is simultaneously available through different senses

Intersensory redundancy hypothesis: amodal properties are more easily detected in multimodal stimulation than in unimodal stimulation, while modality-specific properties are more easily detected in unimodal stimulation; redundancy works by “facilitating attention to amodal information but impairing attention to modality-specific information in multimodal events”

Event perception: amodal relations provide basis for selecting unitary sights and sounds

Audiovisual perception: synchronized sight and sound

Speech perception: visual language information (mouth movements) paired with auditory stimulation

Face-voice perception: broad during the first year and narrows as infants gain more experience with human faces than with other faces; infants are able to match unfamiliar faces and voices based on the age and gender of the speaker (e.g., adult males have deeper voices than children); they link the shape of a person’s mouth with the sounds they make

Self-perception: visual-motor perception – infants can discriminate between self and others

Amodal information: not specific to a particular sense; can be characterized by patterns of time, space, and intensity

Modality-specific information: specific to one particular modality (color, pattern, temperature)

Visual-tactile perception: information about the shape, texture, substance, and size of an object is invariant across visual and tactile stimulation; tactile exploration facilitates infants’ perception of the amodal, 3-D shape of objects

Visual-motor perception: infants are able to perceive their own body motion by detecting amodal info; proprioception is information about self-motion provided by feedback from muscles, joints, and the vestibular system; infants use the amodal relations between the proprioceptive experience of their own limb motions and the visual image in a video to discriminate between self and other; also supports visually guided reaching and posture control

Perceptual Development: Intermodal Perception

More notes/summaries from lab:

Intermodal perception refers to the perception of information from objects and events that is available simultaneously to multiple senses. Philosophers long proposed that we need to integrate information across the different senses before we can perceive an object or an event.

Piaget proposed that integration took place through interacting with objects and coordinating information across the different senses. Gibson proposed that the senses work as a unified perceptual system and that different forms of sensory stimulation are necessary for perceiving unified events. His view differed from the integration view because he proposed that the senses are unified at birth, not separate at birth.

As infants grow older, they learn to perceive subtler differences and more complex objects and events in their environment, moving beyond simply detecting general features of unified multimodal events.

Evidence supports the belief that temporal synchrony between sights and sounds is “the glue” that binds information across the senses. For example, when someone claps, we detect synchrony, rhythm, and tempo. We look at the event as a whole; we don’t pay attention to the sights, sounds, or tactile stimulation separately.

Amodal information that is provided and is synchronized across more than one sense modality is called intersensory information. An infant will notice the intersensory redundancy between the face and the voice of a person speaking before they notice modality-specific information, such as pitch of the voice. When redundancy is not available, an infant will look at nonredundant information, such as the appearance of a person’s face. Since selective attention is the basis for what we perceive, learn, and memorize, intersensory redundancy is highly influential in organizing early perceptual, cognitive, social, and emotional development.

Intersensory redundancy allows the development of more specific processing. As infants grow older, detection of sight-sound relations allows detection of more specific amodal relations, such as tempo, and this then allows the detection of arbitrary, modality-specific relations, such as pitch and color.

Protoconversation is the intercoordinated mutual exchange of sounds, movements, and touch; it’s the foundation for communication and social development.

Speech is inherently multimodal, due to the coordinating of facial, vocal, and gestural information; audiovisual redundancy promotes learning, as well.

Proprioception is information about self-movement based on feedback from muscles, joints, and the balance (vestibular) system. By three to five months, infants can discriminate a live video of their own legs kicking from one of another infant’s by detecting the temporal synchrony and spatial correspondence between their proprioceptive feedback and the display of their own legs kicking.

Bahrick, L. E., & Lickliter, R. (2009). Perceptual development: Intermodal perception. Encyclopedia of Perception, 2, 753–756.

Intermodal Perception

More notes/summaries from lab:

Longstanding confusion about, and interest in, the specificity of the different senses and the overlap between them led scientists to different proposals for understanding how perception works. It was proposed that different forms of sensory information do not disrupt perception; rather, they’re vital to intermodal perception (perception of stimulation that is simultaneously available to more than one sense). Amodal information, an invariant for intermodal perception, is information that is not specific to a particular sense modality and is shared across more than one sense; it’s redundant across the different senses due to temporal, spatial, and intensity patterns. When amodal information is available through different senses at the same time, it’s called intersensory redundancy. Modality-specific information is information that can only be perceived by one sense (e.g., color can only be perceived visually).

We can only pay attention to a small, select portion of the stimulation available to us, while irrelevant information stays in the background. Our senses take in redundant information about events and objects without that redundancy being excessive. This salient intersensory redundancy is fundamental for the development of perception in infancy. Sensitivity to redundancy across the different senses promotes attention to unified events even when competing sounds and movements are present. As we grow older, we differentiate finer and finer levels of stimulation (e.g., from the general perception of a person walking to, ultimately, their specific appearance). Perceptual narrowing occurs with progressive improvements in perceptual discrimination; it allows people to focus on relevant aspects of stimulation and ignore those that are irrelevant. As time goes on, perceptual development becomes increasingly specific.

Infants are sensitive to audiovisual information and can detect the temporal synchrony between the sights and sounds of an object’s impact, among other things. Temporal synchrony is thought to be the “glue” that binds information across auditory and visual stimulation. Amodal relations such as temporal synchrony, rhythm, and tempo are the basis for selecting unitary sights and sounds; this eventually allows processing of temporal microstructure and modality-specific properties such as color, pattern, and pitch. The intersensory redundancy hypothesis suggests that redundant information is highly salient and attracts attention to amodal, redundantly specified properties of stimulation.

Infants are able to recognize other people’s emotional expressions and use them as information about external events. This skill, called social referencing, allows an infant to discriminate an adult’s emotional expression and connect it to the adult’s feelings toward a situation or object. Audiovisual synchrony can help in word-learning and in selective attention.

Bahrick, L. E., & Hollich, G. (2008). Intermodal perception. Encyclopedia of Infant and Early Childhood Development, 2, 164–176.

Perceptual Development: Amodal Perception

In order to be a research assistant in the Infant Development Lab I volunteer at, I have to read articles written by the lab director and summarize them. Studies and concepts discussed are usually very neat (my personal favorite is the ventriloquist effect). Hopefully my summaries don’t come across as too choppy. Here’s one I wrote for amodal perception:

Amodal perception is perception of information that is redundant across multiple senses, and it includes changes in three types of stimulation: time, space, and intensity. Because all events occur across time and space, all events have amodal information. For example, speech provides changes in audiovisual synchrony, tempo, rhythm, and intonation that are common to facial movements and vocal sounds. Self-motion produces information from muscles and joints, and that information is synchronized with, and shares temporal and intensity changes with, the sight of self-motion. Amodal has also been used to refer to perception in the absence of direct information from one particular sense modality.

The concept of amodal perception dates back more than 2,000 years. Since then, philosophers have proposed that sensations have to be interpreted across the senses before a person can perceive meaningful objects and events. Developmental psychologists later built on this approach, stating that the process develops gradually through experience with objects. Gibson then proposed that different forms of sensory stimulation are not a problem for perception but rather necessary for perceiving unitary objects and events: our senses work together as a unified perceptual system to pick up information that is common across the senses.

Amodal information is highly salient to humans and animals, especially during early development. Development of some skills depends on the detection of amodal information, such as being able to detect temporal synchrony, rhythm, and tempo, as well as being able to detect emotion. Amodal information simplifies and organizes incoming sensory stimulation, which allows us to perceive unitary, multimodal events.

Bahrick, L.E. (2009). Perceptual development: Amodal perception. Encyclopedia of Perception, 1, 44-46.