This is where I note that the bulk of my reading and learning is about AAC, and I'm not a literacy expert.
Here's one area of conflict: with regard to literacy, there is research indicating that pairing words with icons (as in early reader books where there's a little picture of Dora directly above the word "Dora" in the sentence) is not beneficial, and may actually be detrimental. This makes sense, as icons would distract the reader from the words being read and pull their eyes out of the left-to-right flow of the sentence.
The goal of AAC, though, is not to teach reading but rather to facilitate communication (although immersion in text-via-AAC seemed to accelerate Maya's reading ability). I imagine that there are apps/systems that display only text in their sentence strip, and others that display one or more icons along with the words. I assume that "should icons be displayed with the text in AAC?" is a question that has been discussed and debated, particularly among those who develop these systems.
From what I've seen of AAC using/learning, it seems to me that having the icons displayed in the sentence strip along with the words being spoken makes AAC learning/using easier, particularly for young AAC users or for AAC users who are learning their system. (Side note: I, as a part-time user, am perpetually learning the system. I can't imagine when it will be effortless for me to use it, even with the automatic motor planning element. Even as I become more fluent with frequently-used words, we are constantly adding new words to the vocabulary.) While I certainly can't make any grand claims on behalf of all AAC users, I want to share what I've seen with my kids, who present as an interesting case study.
Background: In Speak for Yourself, a word can take one or two taps to say (no word takes more than two taps). When a word is selected, it is spoken aloud and moves to the sentence strip at the top of the screen. The one or two icons that were tapped to select the word are displayed under the word, as shown below.
"I" is a 1-hit word, and has 1 icon displayed. "corn" is a 2-hit word and has 2 icons displayed. First the user selects the initial icon, and that takes them to a secondary screen where they can find the second icon.
Will (2.5 years) and Maya (6.75 years): Maya has been using SFY for the past 3 years; Will has been using it for a little over a year. Both were obviously pre-literate when they started using the app. It's difficult for me to remember much of Maya's early AAC use (because I'm old, my memory is spotty, and I was just so excited that it was working that I wasn't scrutinizing much), but now I am watching Will become an elective AAC user through an increasingly academic eye. So here are my take-aways, with a few minutes of video of a small experiment (taken yesterday).
First, a video of Will. For this experiment I selected two words that I knew he had never seen in SFY. I selected one (while the screen was out of his view) and then placed the talker in front of him to see if he was able to properly follow the icon path to select the word without help.
The big take-aways: Will understands the left-to-right flow of (icon) language and is able to follow it independently. Also, having the icons displayed in the sentence strip allows him to practice and copy words that he would otherwise be unable to find.
Broken down:
1. For Will, use of the app has solidified the concept that text (or a sequence of icons) reads from left to right. I don't know whether he initially learned this concept from reading stories at home or from studying the order of the icons in the app, but he gets it. He doesn't hesitate when he sees the 2-button path to Tyrannosaurus Rex; he knows immediately that the button on the left is pressed first, followed by the one on the right.
2. Will is now, I suspect, fully able to "read" Speak for Yourself. The best "Oh yeah? Prove it!" experiment I can come up with would be to print out a sentence of icons without the text and see whether he can recreate the sentence. My suspicion is that this would be easy for him.
Next, a video (in two clips) of Maya. First, to include her, I asked about Tyrannosaurus Rex and escalator, but she already knew where they were (her vocabulary knowledge of SFY outpaces mine). After that I picked a word that I was 100% certain she had never seen: "Trackball" (I don't know what a Trackball is; it's a pre-programmed word in the app). Then something interesting happened: after she viewed the word, I accidentally erased it, so she no longer had the icon path on screen to follow. That leaves two possibilities for how she found the word: a) she memorized both icons in the sequence, or b) she memorized the first icon and then scanned the page for a word with the text features of "trackball." I think she did the latter, since she would easily recognize "ball" and she has long since mastered starting sounds.
The big take-away: Maya uses the icon path to help her navigate toward the target word, then either uses an icon or reads the text to locate the target. (I guess I could test whether the latter is correct by showing her a novel target with only the first icon, not the final target icon, and seeing whether she was able to use decoding to find the word.)
Maya is an early reader, and her reading has been loosely assessed as at or above grade level. I assume that she also read the icons in SFY, and that following the icons made it easier for her to practice words in the app or to copy words modeled by other people (Will is currently doing both of those things). However, learning from the icons appears not to have negatively impacted her ability to attend to text. From what I have seen, she studies the icons in order to locate words, but she also notes the text. For commonly used words she doesn't seem to notice the icons in the sentence strip at all, but if she's in a therapy session where new words are being modeled, she will lean over to closely examine the screen of the therapist's iPad, and then select the same icons to produce the word on Mini.
Conclusion: Ha! There's certainly nothing to be "concluded" here, from two specific kids with one specific app in one specific home. But I found it interesting to see Will already following the icon language, and to see that Maya still uses it now for new or less frequently used words. Also, when I use a novel word I find myself staring at the icons in the sentence strip, trying to memorize the path to that word. It seems valuable to have icons displayed to facilitate and solidify AAC use/learning.
Disclaimer: As always, I'm not a professional (nor do I play one on the internet). Comments, critiques, links to research, and other thoughts are welcome below or on our FB page!
I think this offers great insight into literacy development. I have been using a literacy app with my kids called Letter Sounds from Reading Doctor. The app uses visual picture cues and gradually fades the pictures out, leaving only the word. I think you've just added additional validation to this app.
I just want to say that you are ABSOLUTELY AMAZING! I'm so glad that I can learn so much from you and share it with other parents who may not have anyone to really talk to about AAC. Thank you for sharing and teaching me so much. I'm beyond excited for Disability Awareness this year. I have so much more to contribute to the language table. We show a picture of an OLD AAC device, but now I can up the experience for the 4th graders with all this new knowledge.