This research study investigates how hand gestures can support the language comprehension and communication skills of hearing individuals with Autism Spectrum Disorder (ASD) who are speaking, non-speaking, and/or minimally verbal, and who are especially disadvantaged by the lack of accessible services in their rural communities. Individuals with other cognitive profiles, including Developmental Language Disorder (DLD), ADHD, dyslexia, and others, are welcome too. The study uses eye tracking and recording of brain activity to understand how hand gestures adapted from American Sign Language signs, such as [cry], can promote successful understanding of words like "cry". The overarching goal is to help families effectively use gestures to support communication with their children.
The investigators are researching how hand gestures and signs from American Sign Language influence language comprehension, and whether gestures/signs can support communication in children and adults with social and language impairments. The investigators use non-invasive (skin-contact only) brain imaging methods. One is EEG (electroencephalography), which records electrical brain activity on a scale of milliseconds. The other is fNIRS (functional near-infrared spectroscopy), which shines infrared light through the scalp and measures blood flow to the brain areas that are more active during a task. The researchers employ a protocol that uses both neuroimaging methods at the same time. This involves wearing a cloth helmet with many wires attached to its surface; the EEG wires require gel to be applied to the hair to pick up the brain signal.

For psychological measurements, there are three rounds of tasks. The first set of tasks, about language and gestures, is passive viewing of a computer screen showing pictures of common objects and videos of a person gesturing and speaking; for example, a participant might see a picture of a dog, hear the word "dog", and see the gesture for "dog". No response is required. The second set of tasks uses picture books to evaluate language comprehension and puzzle-solving (IQ); it is administered only if the participant is able to engage with the picture books, and is otherwise optional. The third task consists of 3-5 questionnaires for adults/caregivers covering demographics, diagnoses (if any), experience with services, and autism, ADHD, and communication. Altogether, the study takes at least 2-3 hours of research time per participant, plus breaks. Both children and adults are welcome to participate.
Study Type
INTERVENTIONAL
Allocation
NA
Purpose
BASIC_SCIENCE
Masking
NONE
Enrollment
50
This is a pretest-test-posttest evaluation of language comprehension with and without gestures, to determine whether gestures can improve language comprehension.
Montana State University
Bozeman, Montana, United States
RECRUITING
EEG/ERP brain response
In a pretest-test-posttest setup, the researchers first measure participants' EEG/ERP (electroencephalography/event-related potential) signals in microvolts prior to any exposure to gestures, using only visual animations and auditory words; then the investigators expose participants to visual gestures paired with auditory words; finally, participants view the animations with auditory words again, to see whether the gestures contributed to any changes in the EEG/ERP (see the analysis sketch below).
Time frame: from enrollment to completion of the study protocol, within 3 months.
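As context for how such a pretest/posttest ERP change could be quantified, here is a minimal sketch using the open-source MNE-Python library. It is illustrative only: the file names, event code, filter settings, and the posttest-minus-pretest contrast are assumptions, not the investigators' actual pipeline.

```python
# Minimal, hypothetical sketch of a pretest/posttest ERP comparison with MNE-Python.
# File names, the event code, and the settings below are illustrative assumptions.
import mne

def word_erp(raw_fname):
    """Average EEG epochs time-locked to spoken-word onset; return the evoked ERP."""
    raw = mne.io.read_raw_fif(raw_fname, preload=True)   # load one EEG recording
    raw.filter(0.1, 30.0)                                # band-pass typical for ERPs
    events = mne.find_events(raw)                        # triggers from the stim channel
    epochs = mne.Epochs(raw, events, event_id={"word": 1},
                        tmin=-0.2, tmax=0.8, baseline=(None, 0))
    return epochs.average()                              # evoked response, plotted in microvolts

pre = word_erp("sub01_pretest_raw.fif")                  # before exposure to gestures
post = word_erp("sub01_posttest_raw.fif")                # after exposure to gestures
diff = mne.combine_evoked([post, pre], weights=[1, -1])  # posttest minus pretest
diff.plot()                                              # any gesture-related ERP change
```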
fNIRS brain response
In a pretest-test-posttest setup, the researchers measure participants' brain activation via the blood-flow response using fNIRS (functional near-infrared spectroscopy). The investigators identify whether language or non-language brain areas are activated during animations, speech, or gestures (see the sketch below).
Time frame: from enrollment to completion of the protocol, within 3 months.
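For comparison, here is a minimal, hypothetical sketch of the fNIRS condition contrast using MNE-Python's NIRS support. The SNIRF file name, annotation labels, and epoch window are assumptions, not the study's actual pipeline.

```python
# Minimal, hypothetical sketch of an fNIRS condition contrast with MNE-Python.
# The SNIRF file, annotation labels, and window below are illustrative assumptions.
import mne
from mne.preprocessing.nirs import optical_density, beer_lambert_law

raw = mne.io.read_raw_snirf("sub01_task.snirf", preload=True)  # raw light-intensity data
haemo = beer_lambert_law(optical_density(raw))                 # convert to HbO/HbR concentration

events, event_id = mne.events_from_annotations(haemo)          # e.g., animation/speech/gesture
epochs = mne.Epochs(haemo, events, event_id=event_id,
                    tmin=-2.0, tmax=15.0, baseline=(None, 0))  # slow hemodynamic window

# Average the oxygenated-hemoglobin (HbO) response per condition; comparing channels
# over language vs. non-language cortex shows which areas were more active.
for condition in event_id:
    evoked = epochs[condition].average(picks="hbo")
    print(condition, float(evoked.data.mean()))                # crude activation summary
```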