“The massive scale of daily-life input”

To identify the properties of visual input available to infants approximately three to 13 months old, the researchers placed head-mounted video cameras on 10 infants and 10 of their adult caregivers, collecting and analyzing 70 hours of video documenting at-home daily life. Clear differences emerge between the contents of the infants’ and the adults’ images, with a higher concentration of simple patterns and high-contrast edges in the infants’ views than in the adults’.

Smith infers that these views arise not only because infants turn their heads to look at the features of the world they can see, but because parents or caregivers are likely to put them in places where they like to look at things. “You have to think why they are where they are. There is probably some natural knowledge implicit on the part of parents to leave infants where they like to look at things. Mom’s not gonna bother you if you’re not fussing,” she observes.

Whether this small group of participants from Bloomington, Indiana, represents infants more broadly around the world is a question Smith’s lab and its collaborators are beginning to explore. They conducted the same experiment, for example, in a small, crowded fishing village in Chennai, India, where electricity is minimal and much of daily life occurs outdoors. And while the head-camera images of 6-month-olds and 12-month-olds in India look very different from those of their Bloomington counterparts, the youngest infants share a common “diet” of high-contrast edges and simple patterns in both Chennai and Bloomington.

Bigger pictures, past and future

Smith and her collaborators have also shown that the same kinds of images improve the training of AI visual systems. In a follow-up to the current study, for example, they found that an AI system trained first on images characteristic of early infancy learns to identify visual images more successfully than one fed images in a random developmental order or only images typical of an adult’s daily life. Training that followed the developmental sequence most closely produced the best results.
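For readers curious how such a “developmental curriculum” might look in practice, here is a minimal sketch of the general idea: order training images by the infant age their views resemble and feed the earliest views first. The names (ImageRecord, model.update, the 3-month bins) are illustrative assumptions, not the study’s actual models or data pipeline.

```python
# Sketch of curriculum-style training ordered by "developmental age".
# All names here are hypothetical; the article does not describe the
# researchers' actual models or training code.

from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class ImageRecord:
    path: str
    age_months: float  # age of the infant whose view this image resembles


def developmental_order(records: Iterable[ImageRecord]) -> List[ImageRecord]:
    """Sort images so views typical of the youngest infants come first."""
    return sorted(records, key=lambda r: r.age_months)


def train(model, records: List[ImageRecord], epochs_per_stage: int = 1) -> None:
    """Feed early-infancy images first, then progressively older views."""
    # Group the ordered data into coarse stages (3-month bins, purely illustrative).
    stages = {}
    for rec in developmental_order(records):
        stages.setdefault(int(rec.age_months // 3), []).append(rec)

    for stage in sorted(stages):
        for _ in range(epochs_per_stage):
            for rec in stages[stage]:
                model.update(rec.path)  # hypothetical per-image training step

# Baselines described in the article would shuffle `records` randomly
# (random developmental order) or train only on adult-view images.
```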

Their work also opens new avenues for evolutionary speculation. “One of the things I always used to ask as a grad student,” says Smith, “and maybe we’re getting a chance to answer it – is why do human babies have such slow motor development. They spend about three months just listening and looking and another six months with a little bit of posture and head control. Why are they so slow? Horses come out and run races.”

This research suggests that “over evolutionary time these slow, incremental and optimized biases work to build up a very smart visual and auditory system. That’s a story that could be told,” she says.


Other researchers include IU Bloomington professors Rowan Candy in the School of Optometry and Jason Gold in the Department of Psychological and Brain Sciences.

LIZ ROSDEITCHER
Science Writer