Before babies speak, they may as well be aliens. For the first year or so, these fresh arrivals to planet Earth can’t say a thing for themselves. Then comes that first word — mama, dada, ball, bottle, uppy. It’s a milestone celebrated across cultures and YouTube, because, suddenly, they’re one of us.
But as Sandra Waxman, a developmental psychologist at Northwestern University, explains to Science of Us, there are way more linguistic and cognitive things happening in the minds of babes than their inarticulate burbles suggest. It’s difficult for us adults to appreciate, since language and thought are so often conflated. If you’re “thinking through” a decision with a friend, for instance, you’re talking about your problems, and in doing so, discovering what you’ve overlooked. For another example, consider how stupid people look when they don’t speak or write well (or overuse big words). But just because it’s hard to relate to infants doesn’t mean that “there isn’t thinking at a preverbal level,” Waxman says. Though they present as pukey messes, infants attune to speech even before they’re born: New research finds that fetuses will respond to their mothers’ speech — for example, opening their mouths when mom says “la” in a singsong nursery rhyme — at just 25 weeks of gestation.
After birth, language starts linking up with thought quickly. Not only do three-month-olds like to hear human language more than any other sound, they’re also already being mentally stimulated by it. You can see it, Waxman says, in the nascent form of what linguists call “categorization,” or how people perceive their worlds to be structured. It’s how humans fit new objects into an ongoing taxonomy of what life throws at them, and it starts young. Early in childhood, kids are identifying commonalities among objects and classifying them into groups, with more or less success. For example, Waxman says, if you take a 12-month-old — the age range she studied earlier in her career — to the zoo for the first time, when her vocabulary is “dog” and maybe two other words, she doesn’t say “peas” or “uppy” when she sees an aardvark; she says “doggie doggie.” This inaccuracy, Waxman says, speaks to developmental, category-making brilliance: Having seen dogs of different shapes and sizes in a year of life, the toddler is placing four-legged, snouted creatures under the mental classification of dog. The toddler is saying the aardvark is a dog not because the kid’s an idiot, but because she doesn’t yet know the conceptual boundaries between dogs and other cute mammalian quadrupeds. It’s similar to how, by seven months old, babies are starting to draw analogies about the world — like there’s sameness between two Elmo dolls — as other Northwestern research has shown.
But to study the cognition of what Waxman calls “tinies” — three- and four-month-olds — you need to get creative. That’s why, since the 1950s, developmental psychologists have used looking time as a proxy for infant cognition — with the assumption that what a three-month-old is looking at is what it’s thinking about. In her more recent work, Waxman has found that human speech acts like a power-up for infant cognition. In a 2010 study, Waxman and her colleagues found that infants’ ability to classify objects into groups increases more when babies are exposed to human speech than when listening to sine wave tones (think the beeps and boops of R2D2 or other robots). Waxman’s team tracked how long 46 infants, held in their mothers’ laps, looked at different illustrations (in this case, dinosaurs and fish). After listening to speech, the tiny humans were more likely to show signs of categorization by the way they attended to the dinosaurs or fish. (Three-month-olds preferred new animals of a familiar category, while four-month-olds preferred animals of a novel category; developmental psychologists say this is because the four-month-olds have already been exposed to — read: are bored with — the familiar category, so they seek the novel one.) Before the babies could even roll over in their cribs or parse individual words from the stream of human language, just listening to language bumped up their cognitive abilities.
The link between listening and recognition, Waxman says, came to her after reading a study about Nile crocodiles: Baby crocodiles, it seems, make a screeching sound when they’re hatching. In the study that Waxman read about, biologists played a recording of the sound to eggs, and lo and behold, the babies hatched sooner and the mothers unearthed more of the eggs they had buried in the sand. That captured Waxman’s imagination: If one animal’s vocalization prompted behavior in its family members, then maybe the same is true of humans. Though crocodiles are distant evolutionary cousins of ours, Waxman’s working hypothesis is that something similar is happening in people: Speech piques the infant’s attention, and with that increased attention, she says, babies have the mental bandwidth to recognize categories.
In a 2013 follow-up to the tone study, Waxman and her colleagues found that for three-month-olds, vocalizations from both humans and non-human primates (in this case, lemurs) increased their ability to recognize categories, but by six months, it was just human speech that prompted the change — showing that in a handful of months, babies are homing in on human speech. Then, in 2015, Waxman found that if you “teach” babies that tones are meaningful — by showing them a video of two people talking, one speaking English and the other speaking in beeps — the babies get the same attention-breeds-cognition bonus from hearing tones as they do from speech. It’s a demonstration of how babies soak up human communication: If you show babies that robotic beeps and boops are something humans use to communicate, then tones, too, will capture their attention. “Babies are exquisite at inferring the communicative intention of others,” says Waxman. It’s an early sign of what might be humanity’s greatest asset: the profound flexibility of the mind.
The preverbal little ones have even more going for them than that: Research suggests that seven-month-olds can infer what other people’s perspectives of objects are — called “theory of mind” in the literature — and that they start distinguishing between animate and inanimate objects as early as 12 weeks old. Their sensitivity to people, their ability to form categories, their noticing of relations between objects — all of them are there in “a skeletal form” before babies can talk, she says. “It’s like if you ever took French in high school — you could understand a lot more than you could ever hope to say,” Waxman says. “You understand before you can produce.”