Motion Capture Technology Helps Researchers Understand Speech Disorders In Children
If you’ve played a recent video game or watched a recent blockbuster, you’ve seen motion capture in action. Behind-the-scenes features often show actors covered in small white markers; cameras track those markers so that computer-generated characters can be rendered with lifelike movement, as seen in games like Uncharted, The Last of Us, and NBA 2K15, as well as movies like Avatar and Planet of the Apes. Entertainment is fun and all, but what if this motion capture technology was used to help people with real-life problems?
A new study conducted by NYU’s Steinhardt School of Culture, Education, and Human Development used facial motion capture on children with childhood apraxia of speech and on those with other types of speech disorders to identify the differences between the two.
"In our study, we see evidence of a movement deficit in children with apraxia of speech, but more importantly, aspects of their speech movements look different from children with other speech disorders," said study author Maria Grigos, associate professor in the Department of Communicative Sciences and Disorders at NYU Steinhardt.
Childhood apraxia of speech is a speech impairment in which children have a hard time planning and making accurate movements to create speech sounds. These children are usually delayed in developing speech, make slow progress in speech therapy, and have atypical speech patterns.
Researchers placed tiny reflective markers on each child’s face and, using motion capture technology, measured facial movements by tracking how the lips and jaw moved. This adds a layer of understanding that isn’t apparent from listening to the speaker alone.
"This research enables us to look at the movement patterns used to produce a word in relation to the way that word is perceived. Including the perceptual component is key because as clinicians, we rely heavily on the judgments we make when listening to children speak," Grigos said. "One of our aims was to determine if we could identify differences in how the lips and jaw move even when speech is perceived to be accurate by the listener."
The researchers examined the lip and jaw movement of 33 children, ages 3 to 7. The children were separated into three groups: 11 had childhood apraxia of speech, 11 had other speech impairments, and 11 had no speech problems at all. The children were then asked to say one-, two-, and three-syllable words while the motion capture technology recorded the movement of the jaw, lower lip, and upper lip.
The motion capture technology allowed the researchers to pick up on subtle differences that they couldn’t hear. The children with apraxia of speech produced lip and jaw movements that differed from those of the other two groups. The timing of movement was also delayed in both groups with speech problems, meaning they took longer to produce words.
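To give a rough sense of what analyzing marker data involves, here is a minimal sketch of how a single marker’s trajectory might be summarized into movement measures like path length and duration. The data, sample rate, and function names are all hypothetical illustrations, not the study’s actual analysis pipeline.

```python
import numpy as np

def movement_summary(positions, sample_rate_hz):
    """Summarize one facial marker's trajectory for a single word production.

    positions: (n_samples, 3) array of x, y, z coordinates in millimeters,
               one row per camera frame (hypothetical data format).
    Returns total path length (mm) and movement duration (seconds).
    """
    # Frame-to-frame displacement vectors between consecutive samples
    steps = np.diff(positions, axis=0)
    # Total distance traveled: sum of the lengths of those displacements
    path_length = np.linalg.norm(steps, axis=1).sum()
    # Elapsed time covered by the recording
    duration = (len(positions) - 1) / sample_rate_hz
    return path_length, duration

# Made-up example: a jaw marker sampled at 100 Hz, opening steadily
# by 1 mm per frame for 50 frames.
jaw = np.zeros((51, 3))
jaw[:, 1] = -np.arange(51) * 1.0  # vertical (opening) movement only
length, secs = movement_summary(jaw, sample_rate_hz=100)
print(length, secs)  # 50.0 mm of travel over 0.5 s
```

Longer durations on a measure like this would correspond to the delayed movement timing the researchers observed in the two groups with speech problems.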
When the children were asked to pronounce the three-syllable words, the two groups with speech problems handled the words with markedly more difficulty, with the children with apraxia of speech struggling the most.
The study provided evidence that children with childhood apraxia of speech differ markedly from children with other speech problems, and that children respond differently to increasing linguistic demands depending on their specific speech disorder.
Source: Grigos, M., Moss, A., & Lu, Y. Oral Articulatory Control in Childhood Apraxia of Speech. Journal of Speech, Language, and Hearing Research. 2015.