Researchers at Carnegie Mellon University and Stockholm University have used a video game featuring an "alien" language to recreate the challenges infants face when learning a language for the first time.
Lori Holt and Sung-Joo Lim of Carnegie Mellon University, along with Francisco Lacerda of Stockholm University, are using video game technology combined with a distorted, made-up "alien" language to study how the brain decodes spoken sounds.
The unintelligible, distorted speech was the game's only narration and provided the only instructions for players. After just two hours of play, participants could pick out recognizable word-length sounds from the garbled speech and decipher their meanings to advance through the game.
"Traditionally, when we study adult learning in the lab, it's nothing like how infants learn language," said Holt, who is a professor of psychology at Carnegie Mellon. "This video game models for adults the challenge language learning poses to infants. This presents the opportunity to study learning in ways that are just not feasible with infants."
The study presents a notable opportunity, since language acquisition is typically far more difficult for adults than for the more receptive brains of infants. The researchers also hope the results will help in understanding and treating conditions such as dyslexia, as well as in making second-language learning more efficient for adults. In the long term, Holt intends to use functional magnetic resonance imaging to observe participants' real-time brain responses as they decipher the alien language during play.
The researchers intend to present their findings at the Acoustical Society of America's annual meeting, which takes place May 23-27 in Seattle.