Research Report

Gesture Paves the Way for Language Development

Jana M. Iverson1 and Susan Goldin-Meadow2
1University of Pittsburgh; 2University of Chicago

ABSTRACT: In development, children often use gesture to communicate before they use words. The question is whether these gestures merely precede language development or are fundamentally tied to it. We examined 10 children making the transition from single words to two-word combinations and found that gesture had a tight relation to the children's lexical and syntactic development. First, a great many of the lexical items that each child produced initially in gesture later moved to that child's verbal lexicon. Second, children who were first to produce gesture-plus-word combinations conveying two elements in a proposition (point at bird and say "nap") were also first to produce two-word combinations ("bird nap"). Changes in gesture thus not only predate but also predict changes in language, suggesting that early gesture may be paving the way for future developments in language.

Young children communicate using gestures before they are able to speak. Children typically produce their first gestures between 9 and 12 months, usually pointing to indicate objects in the environment (Bates, 1976; Bates, Benigni, Bretherton, Camaioni, & Volterra, 1979). Even after children begin to talk, they continue to produce gestures in combination with words (e.g., pointing at a cup while saying "cup"; e.g., Greenfield & Smith, 1976), and these gesture-plus-word combinations generally precede production of two-word combinations. Gesture development thus predates language development. The question we address here is whether gesture is fundamentally tied to language development.

The gestures that children produce early in language development provide a way for them to communicate information that they cannot yet express verbally. For example, pointing gestures (e.g., point at cup) offer children a technique for referring to objects before they have words for those objects. Moreover, gesture-plus-word combinations offer children a technique for communicating two pieces of information within a single utterance before they can produce two-word utterances (e.g., point at cup while saying "mine"; Butcher & Goldin-Meadow, 2000; Capirci, Iverson, Pizzuto, & Volterra, 1996; Goldin-Meadow & Butcher, 2003). The fact that gesture allows children to communicate meanings that they may have difficulty expressing verbally raises the possibility that gesture serves a facilitating function for language learning. If so, changes in gesture should not only predate but also predict changes in language.

We tested this hypothesis by examining gesture production in relation to lexical and syntactic development in the early stages of language development. We asked (a) whether children's use of gesture to refer to specific objects is related to the emergence of verbal labels for those objects and (b) whether children's production of gesture-plus-word combinations is related to the emergence of two-word utterances.

METHOD

Participants

Ten typically developing children (5 males, 5 females) participated; all were from middle- to upper-middle-class monolingual English-speaking families. The children were followed longitudinally between the ages of 10 and 24 months. We focus here on sessions between the onset of one-word speech (range: 10–14 months) and the emergence of two-word combinations (range: 17–23 months). On average, each child was observed 8 times (range: 5–12).

Procedure

The children were videotaped monthly for approximately 30 min. The taping took place in the home, during play with a primary caregiver and during a snack or mealtime. Toys were provided by the experimenter, but the children were also free to play with their own toys.

Coding

We focused on gestures and speech used communicatively. The child had to make an effort to direct the listener's attention (e.g., through eye gaze, vocalization, postural shift) for a behavior to be considered communicative. A communicative behavior could be gesture on its own, speech on its own, or gesture and speech produced together.

Address correspondence to Jana M. Iverson, Department of Psychology, University of Pittsburgh, 3415 Sennott Square, 210 S. Bouquet St., Pittsburgh, PA 15260; e-mail: [emailprotected]

PSYCHOLOGICAL SCIENCE, Volume 16, Number 5, p. 367. Copyright © 2005 American Psychological Society

Coding Gesture

Two additional criteria were used to ensure that a gesture was functioning as a communicative symbol (see Butcher, Mylander, & Goldin-Meadow, 1991; Goldin-Meadow & Mylander, 1984): First, the gesture could not be a direct manipulation of some relevant person or object (i.e., it had to be "empty-handed"; Petitto, 1988). All acts performed on objects were excluded, except for instances in which a child held up an object to bring it to another person's attention, an act that serves the same function as pointing. Second, the gesture could not be a ritual act (e.g., blowing a kiss to someone) or game (e.g., patty-cake).

Each gesture was classified into one of three categories: deictic gesture, conventional gesture, or ritualized reach. Deictic gestures indicate referents in the immediate environment. Children produced three types of deictic gestures: (a) showing, holding up an object in the listener's potential line of sight; (b) index point, extending the index finger toward a referent; and (c) palm point, extending a flat hand toward a referent. The referent of a deictic gesture was assumed to be the object indicated (or held up) by the hand.1 Conventional gestures have a form and meaning that are either culturally defined (e.g., nodding the head "yes") or specified in the context of particular caregiver-child interactions (e.g., smoothing the hands over the hair to mean "pretty"). Ritualized reaches are arm extensions toward an object, usually accompanied by repeated opening and closing of the palm.

Coding Speech

We coded all communicative, meaningful vocalizations; these consisted of either English words (e.g., "dog," "hot," "walking") or patterns of speech sounds consistently used to refer to a specific object or event (e.g., [ba] for bottle).

Coding the Relation Between Gesture and Speech

All instances in which a gesture was produced co-temporally with speech were classified as gesture-plus-word combinations and divided into two categories based on the relation between the information conveyed in the two modalities. One category included gestures that complemented speech by singling out the referent indicated by the accompanying word (e.g., pointing to flowers while saying "flowers" to indicate flowers on the table). The second category included gestures that supplemented speech by providing a different but related piece of information about the referent (e.g., pointing to a picture of a bird while saying "nap" to indicate that the bird in the picture is sleeping).

Reliability

Reliability between two independent coders was assessed for 10% of the 80 sessions. Agreement between coders was 93% (N = 639) for isolating gestures and 100% (N = 52) for classifying gesture-plus-word combinations as complementary or supplementary. Cohen's kappa statistics for these coding decisions were .92 and 1.0, respectively. Agreement was 100% (N = 242) for assigning meanings to gestures and 91% (N = 463) for assigning meanings to words.
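Cohen's kappa, used above, corrects raw percentage agreement for the agreement two coders would reach by chance given their marginal category rates. As an illustrative sketch only (the coder decisions below are invented, not the study's data):

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two coders' parallel category decisions."""
    assert len(coder1) == len(coder2)
    n = len(coder1)
    # Observed agreement: proportion of items both coders labeled identically.
    p_obs = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Chance agreement expected from each coder's marginal category rates.
    c1, c2 = Counter(coder1), Counter(coder2)
    p_exp = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical coding of six gesture-plus-word combinations.
a = ["complementary", "supplementary", "complementary",
     "complementary", "supplementary", "complementary"]
b = ["complementary", "supplementary", "complementary",
     "supplementary", "supplementary", "complementary"]
print(round(cohens_kappa(a, b), 2))  # → 0.67
```

With perfect agreement the statistic is 1.0, so the study's values of .92 and 1.0 indicate near-ceiling reliability.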

RESULTS

Object Reference in Gesture and Early Lexical Development

Do the early gestures that a child produces have any relation to the words that the child subsequently utters? For these analyses, we identified all instances in which children referred to an object2 and classified them into three categories: speech only (i.e., using only a word to refer to an object), gesture only (i.e., using only a gesture to refer to an object), or speech and gesture (i.e., using both a word and a gesture, not necessarily at the same time, to refer to an object). Intercoder reliability for this decision was 92% (N = 119), κ = .85. Because we were interested in examining developmental change in the number of different items in children's verbal and gestural repertoires, this analysis was based on types (a traditional measure of vocabulary growth) within a session. For example, if a child only pointed at a ball (one or more times) during the session, ball was counted as one type in the gesture-only category. If the child only said "ball" (one or more times) during the session, ball was counted as one type in the speech-only category. If a child produced the word "ball" and also pointed at a ball in the same session (whether simultaneously or at different times), we counted ball as one type in the speech-and-gesture category. We then calculated the proportion of items (summed across sessions) that each child produced in each of the three categories.
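The type-based tally described above reduces to set comparisons between the object labels a child produced in speech and in gesture within a session. A minimal sketch, with invented session data and function names of our own (not the authors'):

```python
def classify_types(speech_refs, gesture_refs):
    """Partition a session's object-reference types into three categories.

    speech_refs / gesture_refs: iterables of object labels the child
    referred to in each modality during the session.
    """
    speech, gesture = set(speech_refs), set(gesture_refs)
    return {
        "speech_only": speech - gesture,
        "gesture_only": gesture - speech,
        "speech_and_gesture": speech & gesture,
    }

# Hypothetical session: child says "ball" and "dog", points at ball and cup.
session = classify_types(["ball", "dog"], ["ball", "cup"])
print(session["gesture_only"])  # → {'cup'}
```

Summing the sizes of the three sets across sessions and dividing by the total number of types yields per-child proportions of the kind reported in the Results.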

The children relied extensively on gesture to refer to objects: Approximately half of each child's object references across sessions occurred in gesture only (M = .50, SD = .16), with another quarter occurring in both speech and gesture (M = .22, SD = .06). Only a fourth of the object references that each child produced occurred in speech only (M = .28, SD = .18). But gesture did become less important over time. At the initial session, 9 of the 10 children produced a majority of object references in gesture only, whereas none did at the final session, χ2(1) = 12.93, p < .001.

Gesture thus appears to provide a way for children to refer to objects at a time when they are not producing words for those objects. If gesture serves a facilitating function in lexical development, one might expect an individual lexical item to enter a child's repertoire first in gesture and then, over time, transfer to speech. To explore this possibility, we identified lexical items that a child used in multiple sessions and classified them into four categories3 according to whether they (a) appeared initially in speech and remained in speech, (b) appeared initially in gesture and remained in gesture, (c) appeared initially in speech and transferred or spread to gesture, or (d) appeared initially in gesture and transferred or spread to speech. Items that appeared initially in both speech and gesture were excluded from this analysis.

Table 1 presents the mean proportion of items that fell into each category. Modality had a clear impact on lexical development. Significantly more items were produced initially in gesture than in speech, F(1, 9) = 12.33, p < .01, η2 = .578. Moreover, a significant proportion of the items either switched or spread from one modality to the other (as opposed to staying in one modality), F(1, 9) = 8.05, p < .03, η2 = .472. However, there was a significant interaction between the two factors, F(1, 9) = 20.37, p < .002, η2 = .694: Items were more likely to move from gesture to speech than from speech to gesture (p < .001, Newman-Keuls). On average, children produced a gesture for a particular object 3.0 months (SD = 0.54, range: 2.3 to 3.9 months) before they produced the word for that object. Thus, the results are consistent with the gestural-facilitation hypothesis, as we were able to predict a large proportion of the lexical items that eventually appeared in a child's verbal repertoire from that child's earlier gestures.

1 It is possible that, at times, children used deictic gestures to refer to events rather than objects (e.g., cat sleeping rather than cat). Our results, however, do not support this possibility. When points were assumed to refer to objects, children's pointing gestures predicted subsequent entries in their spoken vocabularies, and onset of children's supplementary gesture-plus-word combinations predicted onset of two-word utterances; our results thus provide indirect support for coding points as references to objects. If we misattributed the referents of pointing gestures, this would only have weakened our results and reduced the likelihood that they would support the gesture-facilitation hypothesis.

2 Only nouns and deictic gestures were included in the lexical analyses. Pronouns were infrequent and thus omitted; in analyses including pronouns, the results were unchanged.
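A side note on the chi-square reported above: the value χ2(1) = 12.93 can be reproduced from the counts in the text (9 of 10 children with a gesture-only majority at the initial session vs. 0 of 10 at the final session) if one assumes a Yates-corrected 2×2 test. The continuity correction is our assumption; the paper does not name the exact procedure.

```python
def chi_square_yates(table):
    """Chi-square for a 2x2 table with Yates continuity correction."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_totals[i] * col_totals[j] / n
            chi2 += (abs(obs - exp) - 0.5) ** 2 / exp
    return chi2

# Rows: initial vs. final session; columns: gesture-only majority yes/no.
print(round(chi_square_yates([[9, 1], [0, 10]]), 2))  # → 12.93
```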
Because the relation between a deictic gesture and its referent is more transparent than the arbitrary relation between most words and their referents, gesture can provide children with a temporary way to communicate about objects, allowing them to circumvent difficulties related to producing speech (Acredolo & Goodwyn, 1988; Werner & Kaplan, 1963). Gesture may thus serve as a transitional device in early lexical development.

Gesture-Plus-Word Combinations and the Transition to Two-Word Speech

All 10 children combined single gestures with single words and did so several months before producing two-word utterances. Moreover, all 10 children produced both supplementary (point at bird while saying "nap") and complementary (point at bird while saying "bird") gesture-plus-word combinations before the onset of two-word utterances ("bird nap"). The mean interval between the onset of supplementary gesture-plus-word combinations and onset of two-word utterances was 2.3 months (SD = 1.66); the corresponding interval between the onset of complementary gesture-plus-word combinations and the onset of two-word combinations was 4.7 months (SD = 2.2).4

Note that like two-word combinations, supplementary gesture-plus-word combinations communicate two semantic elements within a single communicative act. If gesture facilitates the emergence of early speech combinations, one might expect the children who produce supplementary gesture-plus-word combinations earliest to be the first to make the transition to two-word speech. And indeed, we found a significant correlation between age of onset of supplementary gesture-plus-word combinations

and age of onset of two-word combinations (Spearman rs = .94, p < .001, two-tailed; see Fig. 1).

Unlike supplementary gesture-plus-word combinations, complementary combinations convey a single semantic element. One therefore would not expect the onset of this type of gesture-plus-word combination to predict the onset of two-word utterances, and, indeed, it did not. The correlation between age of onset of complementary gesture-plus-word combinations and age of onset of two-word combinations was low and not reliable (Spearman rs = .24, n.s.; see Fig. 1). Thus, it is the ability to combine two different semantic elements within a single communicative act, not simply the ability to produce gesture and speech together, that predicts the onset of two-word speech.

TABLE 1
Categorization of Lexical Items According to Modality of First Appearance and Developmental Trajectory

                                           Modality in which the item first appeared
Developmental trajectory of the item       Speech       Gesture
Remained in one modality                   .16 (.13)    .25 (.13)
Switched or spread to the other modality   .09 (.06)    .50 (.12)

Note. The numbers shown are mean proportions (with standard deviations in parentheses). Only lexical items that appeared in multiple observation sessions were included in this analysis.

3 Lexical items appearing in multiple sessions accounted for .41 (range: .29–.49) of each child's repertoire.

4 Age of onset for complementary, supplementary, and two-word combinations was defined as the child's age at the session in which he or she first produced at least two instances of the respective kind of combination.

DISCUSSION

We have found that gesture both precedes and is tightly related to language development. At the lexical level, items found initially in children's gestural repertoires subsequently appeared in their verbal lexicons. At the sentence level, the onset of gesture-plus-word combinations conveying two elements of a proposition predicted with great precision the onset of two-word combinations.
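The onset-age correlations reported above are Spearman rank correlations: each child contributes a pair of onset ages, both variables are converted to ranks, and the Pearson correlation of the ranks is taken. A self-contained sketch with invented onset ages (not the study's data):

```python
def rank(values):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical onset ages in months for five children.
supp_onset = [14.0, 15.5, 16.0, 17.5, 19.0]  # supplementary combinations
two_word = [16.5, 17.0, 18.5, 20.0, 21.5]    # two-word utterances
print(round(spearman(supp_onset, two_word), 2))  # → 1.0
```

Because the invented ages are perfectly monotonically related, the sketch returns 1.0; the study's observed value for supplementary combinations was rs = .94.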
Our findings are thus consistent with the hypothesis that gesture plays a facilitating role in early language development.

What might gesture be doing to facilitate language learning? One possibility is that gesture serves as a signal to the child's communicative partner that the child is ready for a particular kind of verbal input. Consider a child who points at his or her father's hat while saying "dada." The child's caregiver might respond by saying, "Yes, that's daddy's hat," in effect translating the child's gesture-plus-word combination into a two-word utterance and providing the child with timely verbal input.

Indeed, adults have been found to alter their input to older children on the basis of the gestures that the children produce (Goldin-Meadow & Singer, 2003), providing them with instruction that leads to learning (Singer & Goldin-Meadow, 2005).

Gesture may also play a role in language learning by affecting the learners themselves. Although gesture and speech form a single integrated system, gesture exploits different representational resources than does speech (McNeill, 1992). Meanings that lend themselves to visuospatial representation may be easier to express in gesture than in speech. Indeed, children on the cusp of mastering a task often produce strategies for solving the task in gesture before producing them in speech (Church & Goldin-Meadow, 1986; Perry, Church, & Goldin-Meadow, 1988).

In addition to relying on a different representational format, gesture lessens demands on memory. Pointing at an object is likely to put less strain on memory than producing a word for that object. Moreover, gesturing while speaking has been found to save speakers cognitive effort (Goldin-Meadow, Nusbaum, Kelly, & Wagner, 2001; Wagner, Nusbaum, & Goldin-Meadow, 2004); consequently, it may be cognitively less demanding to express a proposition in a gesture-plus-word combination than in two words.

Gesture may thus provide a way for new meanings to enter children's communicative repertoires. It may also give children a means for practicing these new meanings, laying the foundation for their eventual appearance in speech. There is, in fact, evidence that the act of gesturing can itself promote learning (Wagner & Goldin-Meadow, 2004).

In sum, our findings underscore the tight link between gesture and speech, even in children at the earliest stages of language learning. At minimum, gesture is a harbinger of change in the child's developing language system, as it is in other cognitive systems later in development (Goldin-Meadow, 2003). Gesture may even pave the way for future developments in language.

Acknowledgments: This research was supported by grants from the National Institute of Child Health and Human Development (Grant R01 HD 41677 to J.M.I. and Grant P01 HD 40605 to S.G.-M.) and by a grant from the March of Dimes Foundation to S.G.-M.

Fig. 1. Scatter plots displaying the relation between age of onset of supplementary gesture-plus-word combinations and age of onset of two-word combinations (left) and between age of onset of complementary gesture-plus-word combinations and age of onset of two-word combinations (right).

REFERENCES

Acredolo, L.A., & Goodwyn, S. (1988). Symbolic gesturing in normal infants. Child Development, 59, 450–466.
Bates, E. (1976). Language and context. New York: Academic Press.
Bates, E., Benigni, L., Bretherton, I., Camaioni, L., & Volterra, V. (1979). The emergence of symbols: Cognition and communication in infancy. New York: Academic Press.
Butcher, C., & Goldin-Meadow, S. (2000). Gesture and the transition from one- to two-word speech: When hand and mouth come together. In D. McNeill (Ed.), Language and gesture (pp. 235–258). New York: Cambridge University Press.
Butcher, C., Mylander, C., & Goldin-Meadow, S. (1991). Displaced communication in a self-styled gesture system: Pointing at the nonpresent. Cognitive Development, 6, 315–342.
Capirci, O., Iverson, J.M., Pizzuto, E., & Volterra, V. (1996). Communicative gestures during the transition to two-word speech. Journal of Child Language, 23, 645–673.
Church, R.B., & Goldin-Meadow, S. (1986). The mismatch between gesture and speech as an index of transitional knowledge. Cognition, 23(1), 43–71.
Goldin-Meadow, S. (2003). Hearing gesture: How our hands help us think. Cambridge, MA: Harvard University Press.
Goldin-Meadow, S., & Butcher, C. (2003). Pointing toward two-word speech in young children. In S. Kita (Ed.), Pointing: Where language, culture, and cognition meet (pp. 85–107). Mahwah, NJ: Erlbaum.
Goldin-Meadow, S., & Mylander, C. (1984). Gestural communication in deaf children: The effects and non-effects of parental input on early language development. Monographs of the Society for Research in Child Development, 49(1, Serial No. 121).
Goldin-Meadow, S., Nusbaum, H., Kelly, S.D., & Wagner, S. (2001). Explaining math: Gesturing lightens the load. Psychological Science, 12, 516–522.
Goldin-Meadow, S., & Singer, M.A. (2003). From children's hands to adults' ears: Gesture's role in teaching and learning. Developmental Psychology, 39, 509–520.
Greenfield, P., & Smith, J. (1976). The structure of communication in early language development. New York: Academic Press.
McNeill, D. (1992). Hand and mind: What gesture reveals about thought. Chicago: University of Chicago Press.
Perry, M., Church, R.B., & Goldin-Meadow, S. (1988). Transitional knowledge in the acquisition of concepts. Cognitive Development, 3(4), 359–400.
Petitto, L.A. (1988). Language in the pre-linguistic child. In F. Kessel (Ed.), The development of language and language researchers: Essays in honor of Roger Brown (pp. 187–221). Hillsdale, NJ: Erlbaum.
Singer, M.A., & Goldin-Meadow, S. (2005). Children learn when their teacher's gestures and speech differ. Psychological Science, 16, 85–89.
Wagner, S.M., & Goldin-Meadow, S. (2004). The role of gesture in learning: Can children use their hands to change their minds? Manuscript submitted for publication.
Wagner, S.M., Nusbaum, H., & Goldin-Meadow, S. (2004). Probing the mental representation of gesture: Is handwaving spatial? Journal of Memory and Language, 50, 395–407.
Werner, H., & Kaplan, B. (1963). Symbol formation. New York: Wiley.

(RECEIVED 4/30/04; REVISION ACCEPTED 8/26/04)

Brief article

Gesture is at the cutting edge of early language development

Şeyda Özçalışkan*, Susan Goldin-Meadow

Department of Psychology, University of Chicago, 5848 S. University Avenue, Green 317, Chicago, IL 60637, USA

Received 26 November 2004; accepted 14 January 2005

Abstract

Children who produce one word at a time often use gesture to supplement their speech, turning a single word into an utterance that conveys a sentence-like meaning ("eat" + point at cookie). Interestingly, the age at which children first produce supplementary gesture–speech combinations of this sort reliably predicts the age at which they first produce two-word utterances. Gesture thus serves as a signal that a child will soon be ready to begin producing multi-word sentences. The question is what happens next. Gesture could continue to expand a child's communicative repertoire over development, combining with words to convey increasingly complex ideas. Alternatively, after serving as an opening wedge into language, gesture could cease its role as a forerunner of linguistic change. We addressed this question in a sample of 40 typically developing children, each observed at 14, 18, and 22 months. The number of supplementary gesture–speech combinations the children produced increased significantly from 14 to 22 months. More importantly, the types of supplementary combinations the children produced changed over time and presaged changes in their speech. Children produced three distinct constructions across the two modalities several months before these same constructions appeared entirely within speech. Gesture thus continues to be at the cutting edge of early language development, providing stepping-stones to increasingly complex linguistic constructions.

© 2005 Elsevier B.V. All rights reserved.
Keywords: Gesture; Two-word speech; Argument structure; Gesturespeech relation
Cognition 96 (2005) B101B113
www.elsevier.com/locate/COGNIT
0022-2860/$ – see front matter q 2005 Elsevier B.V. All rights reserved.

doi:10.1016/j.cognition.2005.01.001

* Corresponding author. Tel.: C1 773 834 1447; fax: C1 773 834 5261.

E-mail address: [emailprotected] (S. Ozcalskan).

http://www.elsevier.com/locate/COGNIT

S. Ozcalskan, S. Goldin-Meadow / Cognition 96 (2005) B101B113B102
1. Gestures role in early language-learning

At a certain stage in the process of learning language, children produce one word at a time. They have words that refer to objects and people and words that refer to actions and properties in their productive vocabularies (Nelson, 1973). However, they do not combine these words into sentence-like strings.

Interestingly, at the earliest stages of language learning, children also fail to combine their words with gesture. They use deictic gestures to point out objects, people, and places in the world, and iconic gestures to convey relational information as early as 10 months (Acredolo & Goodwyn, 1985, 1989; Bates, 1976; Bates, Benigni, Bretherton, Camaioni & Volterra, 1979; Greenfield & Smith, 1976; Iverson, Capirci, & Caselli, 1994). However, they do not combine these gestures with words, despite the fact that, during this same period, they are able to combine gestures with meaningless vocalizations (e.g. grunts, exclamations; Butcher & Goldin-Meadow, 2000). Producing meaningful words and gestures in a single combination thus appears to be a significant developmental step.

Children take the developmental step that allows them to combine words with gestures several months before they take the step that enables them to combine words with other words (Capirci, Iverson, Pizzuto, & Volterra, 1996; Goldin-Meadow & Butcher, 2003; Greenfield & Smith, 1976). For example, before a child produces two-word utterances, the child is able to point at a cup while saying the word "cup" or, more interestingly, point at a cup while saying the word "mommy." Note that this second type of gesture–speech combination provides children with a technique for conveying sentence-like information before they are able to convey that same information in words alone ("mommy cup"). Gesture–speech combinations of both types precede the onset of two-word utterances. The question we address in this paper is what role these gesture–speech combinations play in the development of children's first sentences.

There is, in fact, evidence that children's gesture–speech combinations are related to their first two-word utterances. The age at which children first produce gesture–speech combinations conveying sentence-like information (e.g. "mommy" + point at cup) is highly correlated with the age at which they begin to produce their first two-word utterances (Goldin-Meadow & Butcher, 2003; Iverson & Goldin-Meadow, in press). Importantly, the onset of combinations in which gesture is redundant with speech (e.g. "mommy" + point at mommy) does not predict the onset of two-word utterances. It is the relation between gesture and speech, and not the presence of gesture per se, that predicts when children will first produce multi-word combinations.

A child's ability to convey sentence-like meanings across gesture and speech is thus a signal that the child will soon be able to convey these meanings entirely within speech. But if there is truly a tight link between early gesture–speech combinations and later language development, we ought to be able to see precursors of particular sentence constructions in children's early gesture–speech combinations.

Children use deictic gestures to convey object information (e.g. point at mommy to refer to mommy) and iconic gestures to convey predicate information (e.g. fist pounding in the air to refer to the act of hitting).1 These gestures could be added to words to build more complex meanings. For example, a child could produce a point at a peg along with the word "mommy" to request mommy to act on the peg, thus conveying two arguments of a simple proposition (the agent mommy in speech, and the patient peg in gesture). Or, the child could produce an iconic hit gesture along with the word "mommy" to make the same request, this time conveying the predicate and argument of the proposition (the action hit in gesture, and the agent mommy in speech). If gesture–speech combinations are precursors to linguistic constructions, we might expect children to produce argument + argument and predicate + argument combinations across gesture and speech before they produce these combinations within speech ("mommy peg," "mommy hit").

More convincing still, children ought to be able to take the next step toward sentence construction, complex constructions containing two predicates, in gesture–speech combinations before taking this step in speech on its own. For example, a child who produces an iconic hit gesture along with the sentence "help me" has, in effect, produced a two-predicate construction, the help predicate in speech and the hit predicate in gesture. Do children produce predicate + predicate constructions in gesture–speech combinations before they produce them entirely in speech ("help me hit")? Does gesture continue to predict the child's next linguistic steps?

To examine the role that gesture–speech combinations play in early language learning, we observed 40 children as they progressed from one-word to multi-word speech. Our question was whether the types of gesture–speech combinations that the children produced presage oncoming changes in their speech and thus serve as a forerunner of linguistic advances.
2. Method
2.1. Sample and data collection

Forty children (21 girls, 19 boys) were videotaped in their homes at 14, 18, and 22 months while interacting with their primary caregivers. The children's families were representative of the population in the greater Chicago area in terms of ethnic composition and income distribution (see Table 1), and children were being raised as monolingual English speakers. Each session lasted 90 minutes, and caregivers were asked to interact with their children as they normally would and ignore the experimenter. Sessions typically
1 We followed Goldin-Meadow and Mylander (1984) in relying on gesture form (which, in our data, was primarily action-based and, only occasionally, attribute-based) in assigning meaning to iconic gestures. In most instances, the decision was bolstered by context. Take, for example, the hit gesture mentioned in the text. The child and the mother were playing with a toy containing pegs of different colors and a plastic hammer. Initially, the child was hammering the pegs while mother told him which pegs to hammer (e.g. "hammer the blue one," "hammer the purple one"). Later, the child handed the plastic hammer to mother and said "you" while producing an iconic hit gesture. The child seemed to be using his iconic gesture to tell his mother to hit the peg and was thus conveying an action meaning with his gesture.

Table 1
The sample of children classified according to ethnicity and family income

                      Parents' ethnicity
Family income         African-American  Asian  Caucasian  Hispanic  Mixed  Total
Less than $15,000            1            0        1          0       0      2
$15,000–$34,999              2            1        2          2       1      8
$35,000–$49,999              2            0        3          0       1      6
$50,000–$74,999              2            1        6          0       0      9
$75,000–$99,999              1            0        5          1       0      7
$100,000 or more             0            0        7          0       1      8
Total                        8            2       24          3       3     40

Note. Mixed = two or more ethnic groups.

involved free play with toys, book reading with the caregiver, and a meal or snack time, but also varied depending on the preferences of the caregiver.
2.2. Procedure for data analysis

All meaningful sounds and communicative gestures were transcribed. Hand movements were considered communicative gestures if they were used to convey information to a listener and did not involve direct manipulation of objects (e.g. banging a peg) or a ritualized game (e.g. itsy-bitsy spider). Sounds were