1superslp

Speech Pathology in New York City

Tablets to become key teaching tools for autism

Preview: Apps for autism
(CBS News) Ten-year-old Nuno Timoteo, an autistic child who does not speak, was thought to have the intelligence and attention span of a two-year-old until teachers put an iPad in his hands and learned he loved opera and classical music. Joshua Hood, 27, also non-verbal and autistic, was thought to understand much of his world, but his lack of speech frustrated him and everyone around him until he began communicating freely with a touch-screen tablet computer.

Nuno, Joshua and others whose autism prevents normal speech have made these breakthroughs with the help of tablet computers and special applications that allow them to communicate, some for the first time. Lesley Stahl reports on this new tool for understanding autism for a “60 Minutes” segment to be broadcast on Sunday, Oct. 23 at 7 p.m. ET/PT.

Ian Stuart, a special education teacher at the Beverley School in Toronto, works with Nuno and participated in a University of Toronto study of how effective tablet computers can be with autistic students. He believes the touch-screen technology is fast becoming a crucial tool. Using a vocabulary app on the iPad, Stuart prompted Nuno with images and learned, to his surprise, that the child knew them by name: Nuno could point to the soldier, the saxophone, and the wind chime when prompted with the words for them. Stuart says he “had no idea to the extent of his vocabulary.” But Nuno was even smarter than that. Shown a group of images with an apple and a few sweet treats and asked to point to the healthy snack, the child picked the apple.

Stuart says not every student takes to the device, but for others like Nuno, it’s almost a miracle. “Not all of them are going to be engaged by it the same way, but the ones who are engaged by it, it’s really…amazing,” he tells Stahl.

Another Beverley teacher, Sabrina Morey, says teachers sense there is more going on in their autistic students’ minds than the students are able to communicate. “[Tablet computers] are giving us a tool to really prove that there is more happening.”

There was never a doubt that Hood knew many things, but his lack of speech made him dependent on others, who often did not know what he wanted or was thinking. With an iPad in his hands and the right applications, Stahl watches him order food in a restaurant, tell her his feelings toward his brother, and say he’s happy to be featured on “60 Minutes.” Says his mother, Nancy Hood, “The day he started using [the iPad], it blew me away…I wouldn’t have known he preferred Coke to Pepsi. He’s part of the community…communication is the essence of being human and here he is communicating fully now,” she tells Stahl.

Stahl’s story also features an interview with a University of Pittsburgh neuroscientist who is delving into the mystery of why more than 30 percent of autistic people cannot speak.


Toddlers and Technology: Teaching the Techniques

by Lisa Luna DeCurtis & Dawn Ferrer
As speech-language pathologists specializing in the communication development of children from birth to age 5, we are accustomed to adapting materials and traditional educational tools to help children maximize their communication skills. We teach parents how to turn toys into language-stimulation opportunities, and how the toy itself acts as a conduit for interaction. With the current wave of popular, accessible new technology, such as smartphones and easy-to-use tablet computers, we are once again presented with an opportunity to introduce a tool as a viable and exciting resource for language and learning.

These tools are equipped with touch-screen technology and are extremely durable, making them enticing to younger and less dexterous users. They provide immediate access to a world of learning, entertainment, and creativity for young children, including those with more specialized needs. SLPs working in early intervention and preschool settings therefore have the opportunity to learn, understand, and use these tools, applying evidence-based, appropriate methods that focus on the interaction that occurs when building communication. Most importantly, SLPs have the responsibility to teach parents and families that it is not the technology that builds communication, but rather the techniques used to stimulate and create connections, which become the foundation for communication.

Apps for Everything
SLPs working in early intervention can play a vital role in helping family and team members gain the maximum benefit from these newer teaching tools. The specific applications, known as “apps,” created for mobile devices such as Apple’s iPod Touch, iPhone, iPad, and iPad 2, and Google’s Android devices, have introduced immediately accessible activities for use in treatment and at home. SLPs are rapidly learning how to use this new wave of tablet technology most effectively, whether through self-motivation or because of encouragement from parents who hand a device to the therapist expecting him or her to know how to use it to enhance their child’s growth in communication, cognition, social skills, and motor development. Because children of all ages and skill levels are drawn to a tablet and its educational apps without any specific training, it is crucial for parents and team members to understand how to address therapeutic goals and not let the tablet computer become mainly a source of entertainment.

Although not originally intended as a tool for children with special needs, the tablet computer is being compared to other augmentative and alternative communication (AAC) devices in various media reports. The iPad was highlighted in the Wall Street Journal as an effective and more affordable communication tool for a 2-year-old with cerebral palsy (Valentino-Devries, 2010), and it was hailed in the San Francisco Weekly as enhancing a child’s communication and cognition in ways unexpected by his parents and teachers (Harrell, 2010). The mobile devices seem intuitive to children, who can pick one up, press a button, and begin manipulating an app with very little direct teaching or modeling.

Sandra Calvert, a professor of psychology and director of the Children’s Digital Media Center at Georgetown University, explained that the tablet computer interface maps onto how young children already think and perform tasks, including their early action-based learning and iconic and symbolic representation skills (Baute, 2010).

Child psychologist Jean Piaget (1998) described four stages of cognitive development that explain how children understand and assimilate new information. These stages help explain why a tablet computer appears intuitive to young children. The first, the sensorimotor stage, ranging from birth to age 2, emphasizes learning primarily through a child’s senses and motor skills. Today, children are visually drawn to the apps’ colorful illustrations and engaging musical accompaniment, and they explore the apps through their sense of touch. Successful navigation of the apps involves gross- and fine-motor actions, including pressing a button to activate it, isolating one or two fingers to point to or tap on the screen, holding one or more fingers down to drag items around, using two fingers to zoom in and out, or sweeping a finger across the screen. Children also learn to manipulate an app by rotating, tilting, and shaking the device itself. These cumulative motor movements provide access to engaging activities that offer immediate sensory and cognitive stimulation.

Piaget’s preoperational stage, from ages 2 to 7, describes a child’s ability to use symbols to represent objects, personify objects, and think about things and events that aren’t immediately present. This allows young children to interact with an app and experience a virtual world while understanding and distinguishing this new style of imagination and pretend play. Because sensory input is a primary form of learning for toddlers, the interactive pairing of a tablet and an SLP may offer a highly interesting and motivating visual and tactile experience that promotes learning readiness. The communication-building skills for which we have used the tablet include the following:

Joint attention (e.g., looking for a bee that pops out from different places in the Kezza Bee Peekaboo app).
Visual scanning (e.g., following the colored dots that can be popped in the Color Dots app).
Vocal imitation (e.g., the Talking Gina or Talking Tom apps, which imitate your every sound or word, or Singing Fingers, which records your voice as you draw on the screen).
Taking turns (e.g., apps such as Wipe & Learn or Build-It-Up allow each child to wipe to reveal a photo or virtually stack a toy).
Following directions (e.g., any of the Cake Doodle or Cookie Doodle-type apps gives ample opportunities to follow directions for both preliterate and literate youngsters).
Picture association (e.g., the Baby Touch and Hear Lite app allows the child to select a picture and then hear and see its name and the associated sound).
Sound association (e.g., I Hear Ewe presents a field of 6-8 pictures you can tap to hear a sound; keep the screen facing away from the child, then turn it toward the child so he or she can select the picture that matches the sound).
Vocabulary building (e.g., The Wheels on the Bus app, like the song, is a favorite for teaching early lexicons of nouns, verbs, and prepositions, as is the Verbs With Milo app).
Increasing expressive language length and complexity (e.g., the Sequencing With Milo and Prepositions With Milo apps are great for expanding sentence length and complexity as the child watches Milo the mouse perform different activities).
Stimulating spontaneous and novel utterances (e.g., Toca Tea Party and Toca Doctor are two apps that let the child experience activities that stimulate questions and comments).
Reinforcing and generalizing cognitive concepts such as sequencing and categorizing (e.g., the Monkey Preschool Lunchbox app includes matching for color and size, putting together simple puzzles, 1:1 counting, and categorizing objects).
Teaching pre-literacy and early math skills (e.g., the Dr. Seuss book apps allow the child to tap on a word or picture and hear and see the word as it is highlighted).
There are thousands of apps that can be used with toddlers to target individual therapeutic goals or to simply enhance language learning.

Implementation
Implementing apps as a teaching tool with young children from ages 1 to 5 has been effective in building speech, language, and social skills across a wide array of our clients’ communication disorders, including autism spectrum disorders (ASD), Down syndrome, specific language impairment, auditory processing disorders, and apraxia.

The key is that in each session we incorporate a tablet computer and selected apps while continuing to integrate other communication-building methods, such as Greenspan’s Floortime™, the Hanen® language development programs, and Gutstein’s Relationship Development Intervention (RDI). These programs all focus on parents and family members as language facilitators, an approach that allows families to carry over the techniques to reach the child’s goals with their own tech tools. They also focus on taking cues from the child by observing his or her expression, waiting for a verbal or nonverbal response, and then building on the interaction to extend the activity to its next natural step. All of the programs treat the interaction between family member and child as the main teaching tool, not the toy or the tablet computer.

More specifically, the Hanen® program (Pepper & Weitzman, 2004) teaches parents to let children lead interactions, adapt to “share the moment,” and add language and experience to improve their child’s communication skills. When using a language-stimulation app, such as “Monkey Preschool Lunchbox” or “Wheels on the Bus,” the parent can apply the tenets of the Hanen® program by allowing the child to lead (e.g., he or she taps the screen on the object of interest) while the parent supplies language by naming the objects and the actions of the child and those on the screen. Furthermore, the parent can expand the child’s utterances and comment on the child’s experience while engaging in a dynamic and novel activity.

The interactive nature of many apps also can be illustrated by singing songs together, imitating each other’s drawings, or building entertaining stories. An adult can get down on the floor with a toddler while exploring a simple but visually stimulating app (e.g., “Tesla Toy”), have the toddler sit facing the adult while playing an imitation game (e.g., “Talking Tom”), or sit side-by-side while developing fine motor skills (e.g., “We Doodle”). Apps such as “Barney the Dog” and “I Close My Eyes” can be used to encourage description and storytelling. Apps that record a child’s voice (e.g., “Talking Tom,” “Wheels on the Bus”) can be used to encourage children to vocalize more, and apps with sound-letter correspondence can be used to teach letter-matching and sound production (e.g., “First Words Shapes”). Because many children with ASDs tune in to visual stimuli, apps may become a tool to encourage engagement and can serve as a start-up or back-up activity (e.g., the SLP can preload a tablet with apps the child finds interesting). Even better, if an SLP has wireless access to the Internet, she or he can immediately download additional apps that suit the needs of that session.

Some Pointers
From our anecdotal experience, the most positive treatment outcomes have come from considering “The 7 Ps of Using Mobile Technology in Therapy” (DeCurtis & Ferrer, 2011):

Preparation: What is the rationale for integrating a mobile device with a child versus traditional toys alone?
Participants: What is the child’s age and developmental level and should this device be used individually or in a group?
Parameters: How much time will be spent integrating the device and which environments will yield the best results?
Purpose: What is the advertised purpose of the app and how can it meet your client’s individual goals?
Positioning: What are the effects of sitting side-by-side versus face-to-face and would the child prefer to be at the table or on the floor or on a lap?
Playtime: How will you incorporate the child’s preferred style of play with the device and how will you experience shared enjoyment?
Potential: How will you extend and expand the learning gained from using an app to real-life experiences? Where will you and the family anchor the knowledge gained from the app to what the child already knows?
After a year of integrating a tablet into therapy with young children and guiding their families, here are some beneficial strategies for integrating mobile technology that focus on the therapist’s and parent’s technique:

Introduce the tablet by positioning it toward you to gain the child’s auditory attention before turning it to the child and adding the visual stimulation. Hold the device up by your face to gain the child’s attention; because a tablet computer is similar in size to a face, the child can quickly reference back and forth between the device and the adult.
Hold the tablet below your face and in front of you when showing the child how an app works, modeling sequenced steps for later imitation.
Although a child’s natural tendency is to touch the tablet, don’t let the child touch it when introducing an app so the child can truly focus on observing and processing the adult’s actions.
As the child receives and processes auditory input from the app’s sounds, listen to the child’s expressive communication and observe how the child shows interest. Build on the child’s initiation.
Look for ways to extend interactions by a variety of means, such as introducing a real toy associated with a virtual character, imitating a character’s movements, or adding a direction the app itself doesn’t offer, such as story retelling.
Connect, Direct, Reflect: Make a meaningful social connection with the child first, followed by subtly directing the interaction based on the child’s initiations with the mobile device, and conclude with both participants reflecting on what they learned with the app or carrying the skill over to another activity.
Not Always the Answer
Working with apps is not meant to replace the feel and sound of book pages turning, the sensation of applying a crayon to paper, or the satisfying crash of a tower of blocks smashing to the floor. Nor is it meant to substitute for face-to-face interaction with young children. Dr. Sally Rogers, a MIND Institute researcher at U.C. Davis Medical Center, explained that experiences shape babies’ brains in a very physical way; if a baby focuses on objects more than on faces, the baby can lose the ability to learn the emotional cues normally taught by watching facial expressions (Dembosky, 2010). Mobile technology, particularly tablet computers, is best used as a way to enhance early therapeutic intervention methods. Tablets offer convenience and portability while allowing immediate and inexpensive (or even free) access to a variety of engaging activities.

Also, SLPs can stress to parents that it is the quality, not the quantity, of time that is powerful. The American Academy of Pediatrics (Glassy & Romano, 2007) encourages parents to limit video and computer game use so that total screen time, including television and computer use, is less than one to two hours per day. Children younger than 5 years should play computer or video games only if the games are developmentally appropriate, and they should be accompanied by a parent or caregiver for maximum benefit. Warren Buckleitner (n.d.) describes healthy ways to embrace technology by bringing balance into a child’s media diet with a few sensible strategies, such as keeping devices out of a child’s bedroom and setting “media-free” time, especially during meals.

Given the rapid and ongoing sales of mobile technology, specifically tablet computers, the media’s current focus on the educational benefits of apps, and the increasing use of iPads in schools (Malone, 2011), SLPs who work with toddlers and families can lend their expertise as facilitators who model tablet use. However, Sandra Calvert noted that many people “are rushing to get content and it hasn’t really been empirically tested…What we see is a lot of promise, and informal observations to suggest kids are very engaged” (Malone, 2011). Therefore, proceed with caution and wisdom, remembering that person-to-person interaction, individualized treatment goals, and tried-and-true therapy techniques will always be more important than the latest tool on the market.

Lisa Luna DeCurtis, MA, CCC-SLP, owns a private practice in the San Francisco Bay Area, coaching families to improve social communication skills and focusing on bilingual development. She is co-owner of Morning2Moon Productions. Contact her at lldecurtis@speakeasy.net.

Dawn Ferrer, MS, SLP, owns a private practice in the San Francisco Bay Area and is clinical coordinator at Abilities United. She works with young children and their families to improve communication skills. She is co-owner of Morning2Moon Productions. Contact her at dawnferrer@sbcglobal.net.

DeCurtis, L. L. & Ferrer, D. (2011, September 20). Toddlers and Technology: Teaching the Techniques. The ASHA Leader.

New Speech Therapy Clinic Coming to Harlem NY!!

These are exciting times for Innovative Therapy Solutions!! We will be expanding to open a speech therapy clinic in Harlem, NY, in fall 2011!! Please stay tuned for Grand Opening information. Tell a friend to tell a friend :).

Hearing Bilingual: How Babies Sort Out Language

By PERRI KLASS, M.D.
Published: October 10, 2011

Once, experts feared that young children exposed to more than one language would suffer “language confusion,” which might delay their speech development. Today, parents often are urged to capitalize on that early knack for acquiring language. Upscale schools market themselves with promises of deep immersion in Spanish — or Mandarin — for everyone, starting in kindergarten or even before.
[Illustration: Joyce Hesselberth]
Yet while many parents recognize the utility of a second language, families bringing up children in non-English-speaking households, or trying to juggle two languages at home, are often desperate for information. And while the study of bilingual development has refuted those early fears about confusion and delay, there aren’t many research-based guidelines about the very early years and the best strategies for producing a happily bilingual child.

But there is more and more research to draw on, reaching back to infancy and even to the womb. As the relatively new science of bilingualism pushes back to the origins of speech and language, scientists are teasing out the earliest differences between brains exposed to one language and brains exposed to two.

Researchers have found ways to analyze infant behavior — where babies turn their gazes, how long they pay attention — to help figure out infant perceptions of sounds and words and languages, of what is familiar and what is unfamiliar to them. Now, analyzing the neurologic activity of babies’ brains as they hear language, and then comparing those early responses with the words that those children learn as they get older, is helping explain not just how the early brain listens to language, but how listening shapes the early brain.

Recently, researchers at the University of Washington used measures of electrical brain responses to compare so-called monolingual infants, from homes in which one language was spoken, to bilingual infants exposed to two languages. Of course, since the subjects of the study, adorable in their infant-size EEG caps, ranged from 6 months to 12 months of age, they weren’t producing many words in any language.

Still, the researchers found that at 6 months, the monolingual infants could discriminate between phonetic sounds, whether they were uttered in the language they were used to hearing or in another language not spoken in their homes. By 10 to 12 months, however, monolingual babies were no longer detecting sounds in the second language, only in the language they usually heard.

The researchers suggested that this represents a process of “neural commitment,” in which the infant brain wires itself to understand one language and its sounds.

In contrast, the bilingual infants followed a different developmental trajectory. At 6 to 9 months, they did not detect differences in phonetic sounds in either language, but when they were older — 10 to 12 months — they were able to discriminate sounds in both.

“What the study demonstrates is that the variability in bilingual babies’ experience keeps them open,” said Dr. Patricia Kuhl, co-director of the Institute for Learning and Brain Sciences at the University of Washington and one of the authors of the study. “They do not show the perceptual narrowing as soon as monolingual babies do. It’s another piece of evidence that what you experience shapes the brain.”

The learning of language — and the effects on the brain of the language we hear — may begin even earlier than 6 months of age.

Janet Werker, a professor of psychology at the University of British Columbia, studies how babies perceive language and how that shapes their learning. Even in the womb, she said, babies are exposed to the rhythms and sounds of language, and newborns have been shown to prefer languages rhythmically similar to the one they’ve heard during fetal development.

In one recent study, Dr. Werker and her collaborators showed that babies born to bilingual mothers not only prefer both of those languages over others — but are also able to register that the two languages are different.

In addition to this ability to use rhythmic sound to discriminate between languages, Dr. Werker has studied other strategies that infants use as they grow, showing how their brains use different kinds of perception to learn languages, and also to keep them separate.

In a study in which infants were shown silent videotapes of adults speaking, 4-month-olds could distinguish different languages visually by watching mouth and facial motions and responded with interest when the language changed. By 8 months, though, the monolingual infants were no longer responding to the difference in languages in these silent movies, while the bilingual infants continued to be engaged.

“For a baby who’s growing up bilingual, it’s like, ‘Hey, this is important information,’ ” Dr. Werker said.

Over the past decade, Ellen Bialystok, a distinguished research professor of psychology at York University in Toronto, has shown that bilingual children develop crucial skills in addition to their double vocabularies, learning different ways to solve logic problems or to handle multitasking, skills that are often considered part of the brain’s so-called executive function.

These higher-level cognitive abilities are localized to the frontal and prefrontal cortex in the brain. “Overwhelmingly, children who are bilingual from early on have precocious development of executive function,” Dr. Bialystok said.

Dr. Kuhl calls bilingual babies “more cognitively flexible” than monolingual infants. Her research group is examining infant brains with an even newer imaging device, magnetoencephalography, or MEG, which combines an M.R.I. scan with a recording of magnetic field changes as the brain transmits information.

Dr. Kuhl describes the device as looking like a “hair dryer from Mars,” and she hopes that it will help explore the question of why babies learn language from people, but not from screens.

Previous research by her group showed that exposing English-language infants in Seattle to someone speaking to them in Mandarin helped those babies preserve the ability to discriminate Chinese language sounds, but when the same “dose” of Mandarin was delivered by a television program or an audiotape, the babies learned nothing.

“This special mapping that babies seem to do with language happens in a social setting,” Dr. Kuhl said. “They need to be face to face, interacting with other people. The brain is turned on in a unique way.”

A version of this article appeared in print on October 11, 2011, on page D5 of the New York edition with the headline: Hearing Bilingual: How Babies Sort Out Language.

Professor’s Response to a Stutterer – Don’t Speak – NYTimes.com

Professor’s Response to a Stutterer – Don’t Speak – NYTimes.com.

Technology in Spl Education » FREE Apps/Resources

I get asked constantly what apps I am using during therapy. I LOVE to share free, helpful apps that I hear about from various outlets. By far, Momswithapps is my number one resource for free educational apps; Technology in Spl Education is a very close second. I love when great resources are shared! Most of all, I love when they are FREE! 🙂 Happy downloading.

via Technology in Spl Education » FREE Apps/Resources.

Ask not what your child can learn from you, but what you can learn from your child!

Ask not what your child can learn from you, but what you can learn from your child!

by MOMS WITH APPS on OCTOBER 3, 2011

Our feature this week is written by the founders of Talking Wizard – two speech-language pathologists with a combined total of 20 years of experience working with children. Together they created Splingo’s Language Universe, which was featured previously on our App Friday program. Eleanor talks about tuning into our children, observing what interests them, and factoring their cues into learning experiences that will be fun and engaging for teacher and student.

Let them teach you

There are days when I actually can’t believe someone pays me to do my job. Don’t get me wrong, there are also days when I think there must be easier ways to earn money, but generally the former is true. These are the days when I’ve spent time messing about with shaving foam, Jell-o and bubbles – that beats an office job any day! As a children’s Speech and Language Pathologist I have found the key to teaching any skill is to let the child teach you how to teach them. Quite literally in the case of one of my students, whose favourite ‘reward’ in therapy was to turn the tables and do therapy right back to me! It certainly helped me to reflect on my practice!

Look for their interests

With most children the lesson isn’t quite so obvious. Having worked with the whole range of children from typically developing kids with minor speech difficulties, to non-verbal children with learning difficulties and Autism, there’s always something to be learned if you look closely. It might be the way they gradually disappear under the table limbo-style, as though to say “I really need you to teach me in a comfier chair, so I don’t have to work so hard to sit still”. It might be the way their eyes are magnetized to a toy on the shelf behind you, telling you “that’s what I really want to be doing right now; that’s the thing that motivates me”. Or it could be the way they bounce, spin, squeal or flap which says “this is what I’m interested in; this is the key to helping me learn”. Children are actually very good at teaching us what they need in order to help them learn.

Become familiar with new educational tools

This was the inspiration behind Talking Wizard, a company formed together with another SLP. Our mission is to provide high-quality, motivating speech and language resources. We want to make use of all we’ve learned from the children in our lives, both in and outside of work, to develop the kinds of resources for learning that kids would invent themselves. The biggest lesson we have learned from children is how important technology is to them. Even something as simple as using a calculator function on a basic model cell phone to teach number recognition is so much more enticing than the traditional means of learning.

Direct them to learning by listening first

We started at the beginning with the most fundamental skill upon which learning is built: the ability to listen and understand language. Our first app, Splingo’s Language Universe, has allowed us to combine our clinical knowledge of how children develop listening and language skills with the insights we have gained into how children like to learn. We’ve consulted our ‘teachers’ at every stage of the process, and the feedback from the whole spectrum of children in our acquaintance has been immensely useful and at times very entertaining!

Our moments of triumph have ranged from being patted down by students in search of “the alien game”, to recent feedback from a fellow SLP about how a non-verbal child with Autism and motor difficulties spent 40 minutes engaged with the game. Using a child’s natural fascinations as the vehicle for the learning process means that learning happens naturally as part of the fun. (Take our friend the Jell-o fan. He loves Jell-o, but math…not so much. He wasn’t the least bit interested in his learning objective to match numbers, but hide the numbers inside the Jell-o and bingo! Quite literally.)

As Plato put it: “Do not train a child to learn by force or harshness; but direct them to it by what amuses their minds, so that you may be better able to discover with accuracy the peculiar bent of the genius of each.” Though I’m not sure he had Jell-O in mind at the time.
