Getting the word in

BU Deaf Studies researchers look for ways to prevent deaf children from being deprived of language

BU Experts
9 min read · Mar 22, 2017


A few years ago, Naomi Caselli, a Boston University Deaf Studies researcher, stumbled upon her father’s faded class picture from the 1960s. He stood in the back, a suited adolescent in a sea of elementary schoolchildren. Caselli assumed he was a teacher’s aide.

He wasn’t. Her father had been held back.

He had lagged behind his peers for a specific reason. Caselli’s father, Raymond Kenney, is profoundly deaf in both ears. He was in a class for deaf and hard-of-hearing children, but they didn’t teach sign language there. Instead, teachers spent years coaching him to speak using physical and visual cues. Over and over, they sounded out words like “ball,” repeating “bah-bah-bah” while holding his hand at their mouths to show him how to mimic the vibrations. “They really wanted him to speak,” says Caselli.

Raymond Kenney stands in the back row, far right, a bright adolescent in a class of elementary-age students. Kenney is deaf but had no access to sign language in school, so he had limited opportunities to advance. Photo courtesy of Naomi Caselli

By age eight, he knew a few basic words, but he could not speak in sentences and used made-up gestures to communicate with his family. He did learn to read, however, and excelled in math and science as a teen.

At age 19, at the National Technical Institute for the Deaf at the Rochester Institute of Technology, he got his first look at American Sign Language (ASL) in practice among other deaf students. “I was enthralled,” says Kenney, who immersed himself in learning the language. “A lightbulb went on.”

Today, he still struggles with reading and expressing himself in written English, frustrations that evoke his childhood, when his ability to communicate was so limited. “I’m still connecting to that anger,” says Kenney, who is now a Certified Deaf Interpreter.

“He’s a beautiful signer now,” says Caselli, who is hearing but learned ASL alongside spoken English from birth. “But that’s uncommon for people who’ve had a similar experience.”

Today, ASL is accepted as a full-fledged language with all of the complexity, structure, syntax, and storytelling found in spoken languages. Elementary and secondary school programs for the deaf now teach and assess ASL proficiency using accepted measurement tools, some of which were developed at BU.

But language deprivation remains a real problem for deaf children. “Kids today are still having much the same experience that my dad did,” says Caselli. “It is not a thing of the past.”

The concern now among researchers like Caselli and Amy Lieberman, assistant professor of deaf studies in the BU School of Education, is what happens before school starts. Approximately 90 to 95 percent of deaf children are born to hearing parents who often don’t know sign language and therefore will likely struggle to teach it before their children enter school. Even among school-aged deaf children, estimates based on a 2010 survey by Gallaudet University, which specializes in deaf education, suggest that at most 40 percent of families use sign language at home. Given this data, educators in the field worry that a majority of deaf children may be deprived of language.

Exposure to language from birth is essential for the development of thinking skills, according to a range of studies. Without access to language, children have a harder time in school. They also have more difficulty developing a sense of self and others. They even struggle with planning and time management. “It’s a constellation of challenges,” says Caselli.

So Caselli, Lieberman, and their collaborators are using the tools of linguistics, behavioral psychology, cognitive science, and education to understand how deaf children acquire language and, in turn, how best to teach them. “We desperately need data about this,” she says. “We are still at the beginning stages of identifying the most prevalent issues.”

Critical Moments

When Robert Hoffmeister founded the BU Deaf Studies program in 1980, ASL was on the fringes of acceptance. Students at BU could study it, but it did not count as a foreign language. “That’s the cloud we lived under for about 25 years,” says Hoffmeister, associate professor emeritus of deaf studies. “That cloud was more or less lifted by evidence-based research.”

Linguistic analyses showed that ASL is a language, not just a bunch of gestures. It uses space, coordinated handshapes and movements, facial expressions, and a unique syntax to build meaning. Together, these elements give this visual language all the structural features found in spoken languages. It also has its own literary traditions. “We have folklore passed down from generation to generation,” says Bruce Bucci, a BU Deaf Studies instructor who is among several deaf faculty members at BU and communicated through an interpreter for this story. “There is a visual tradition and culture connected with the language.”

Acceptance of ASL as a language was a fundamental first step toward preventing language deprivation, because it validated the teaching of ASL to deaf babies and children. The developing brain responds to language no matter how it is presented, so exposure to ASL is equivalent to exposure to a spoken language. “The same brain regions and mechanisms perceive and acquire language regardless of the modality,” says Lieberman.

At most, 40 percent of families with school-aged deaf children use sign language at home.

Researchers also learned that language deprivation delays the development of thinking skills. In 2007, Hoffmeister and colleagues studied deaf children’s development of “theory of mind,” the human ability to think about other people’s thoughts. They found that children exposed to sign language from birth develop theory of mind apace with hearing children. But children with delayed language exposure also had delays in theory of mind. “You need language to talk about the world,” says Hoffmeister, who is hearing but is a child of deaf parents. “Language was the crucial factor.”

Hoffmeister went on to develop ways to assess language acquisition in school-aged children. Now research in the Deaf Studies program is shifting the focus to younger children, from birth to age five. This age range is known as the critical period of language development. During those years, exposure to language triggers all kinds of development. Pull the trigger, and children associate words with things, ideas, and feelings. They form a sense of self and others, an understanding of time and planning, and an ability to pay attention and make connections.

On the flip side, without language exposure, children experience a cascade of deficits. “If babies don’t have stimulation with language during that critical time of development, then their cognitive development, their thinking skills, and their language development are all at stake,” says Nicole Salamy, a speech and language pathologist at Boston Children’s Hospital who is also a deaf studies instructor at BU.

Bucci puts it more directly: “If children are deprived of language, they will not thrive.”

The problem is that for deaf children, language is visual. It’s not passively absorbed as the sounds of life occur around them. “When deaf children have access to visual language, they can navigate their world right away,” Salamy says.

Language Barriers

Hearing parents of deaf children face all of the challenges of parenthood plus the need to learn a completely new language for communicating with their child. They also face conflicting advice from health providers, associations, and educators.

Some advocacy and professional groups counsel against introducing sign language, particularly when advising parents who are pursuing medical interventions such as cochlear implants. Parents are told that sign language will distract their child, or that it will take up space in the brain and leave no room for learning spoken languages.

These and other concerns have largely been debunked. In a recent review of research on the subject, Caselli and her colleagues, Matthew Hall from the University of Connecticut and Wyatte Hall from the University of Rochester Medical Center, show that learning ASL early supports learning a spoken language later, the same way learning one spoken language supports learning a second. “If you understand the structures of one language, you’ll be able to use and understand them in another,” she says.

There is also a notion that deaf children struggle with reading because they can’t sound out words. This connection between written language and sounds is called phonological coding. But according to research Lieberman did before she came to BU, this is also a misconception. “Many skilled deaf readers do not have access to phonological coding,” she says. “They clearly have alternate routes to reading, most likely having a foundation in sign language.”

Ultimately, there is no risk to introducing children to sign language. Research shows that the deaf child will only benefit, whether hearing and speech are introduced later or not. “You can do both,” says Caselli. “You can learn sign language and try to get spoken language.”

One of the biggest challenges for educators and researchers who want to improve deaf education is figuring out how to detect language deprivation. This would not only help researchers understand the scale of the problem but also help them guide deaf children and their parents to services that can smooth the way to introducing sign language.

90 to 95 percent of deaf children are born to hearing parents who often don’t know sign language.

A first step, being taken by Caselli, Lieberman, and Jennie Pyers, a visiting faculty member from Wellesley College, is to develop an ASL test for children under five. With new funding from the National Institute on Deafness and Other Communication Disorders, part of the National Institutes of Health (NIH), they plan to work initially with deaf children who have deaf parents. “We want to sort out what vocabulary acquisition looks like under ideal conditions,” says Caselli.

ASL-LEX is an online visual database containing nearly 1,000 ASL signs. Caselli hopes it will become a repository for data on how and when children acquire sign language.

From there, they will study deaf children with hearing parents, who likely face bigger challenges and potential delays as parents learn to sign. “Our goal is to determine where children fall behind and where they don’t, so that we can focus interventions,” she says.

To support this effort, Caselli developed an online visual lexical database for ASL called ASL-LEX. The tool, which won the People’s Choice Award (Interactive Category) in the “Vizzies” Visualization Challenge sponsored by the National Science Foundation and Popular Science, documents nearly 1,000 ASL signs, along with information about frequency of use, grammar, and hand movements. The database will also become a repository for information about milestones, such as the age at which children learn different signs. This information, in turn, can become a source for building assessment tools.

Attention-Getters

For parents of deaf children, job one — aside from learning the language itself — is getting the child’s attention. “It seems simple, but parents need to learn how to manage their child’s gaze,” says Lieberman.

Deaf babies who learn sign language from their parents can manage their attention by the time they reach preschool, according to earlier research by Lieberman. “They look up to see a sign and down to connect the sign to an object,” she says. “They do so in meaningful and purposeful ways.”

Bruce Bucci holds his daughter Sophia (age one at the time) and teaches his daughter Isabella to sign “birthday” while celebrating her third birthday. Photo courtesy of Bruce Bucci

Since eye movements reveal a lot about how deaf children process and learn language, Lieberman developed a set of studies using techniques that track eye movements and is continuing this research with a grant from the NIH. She and her research team, which includes both deaf and hearing researchers, are focused on deaf children between 18 months and five years old to understand how and when they learn words.

The study will include both deaf children with deaf parents and deaf children with hearing parents. “We want to look at the full spectrum of deaf children, looking carefully at the quantity and quality of language exposure they’re receiving,” says Lieberman. “How do those two measures correlate with the ability to develop visual attention skills and new words?”

Not only will this research help develop milestones for detecting language deprivation, it will also help develop interventions for children who are falling behind. An outcome could be an educational program, or tips for parents that help them manage their child’s gaze. “Without looking, there’s no language,” says Lieberman.

This story on ASL language acquisition was originally posted at bu.edu.

For additional commentary by Boston University experts, follow us on Twitter at @BUexperts and on Instagram at @buexperts.
