Tuesday, October 8, 2013

Motor Planning & Language Learning (with video)

If you're just here because you like to see Maya, stick with me through the first few (more informative than usual) paragraphs. There's a cute new video at the end, and Maya's having a great time telling jokes about being sick.


Prior to the spring of 2009, I had never heard of motor planning. That spring ushered Maya into Early Intervention and introduced us to physical therapy (well, we were introduced to all-of-the-therapies, but physical is the one relevant to this story). Our therapist taught me about motor planning, which is basically the way that you execute any movement---your brain knows/figures out what to do, the message is sent to your muscles, your muscles execute the plan, and success! If you're not sure what I'm talking about, try this---touch your nose. If you just touched your nose, that means that your brain sent a message to the muscles in your arm and hand, telling you to lift your arm, bend at the elbow, curl all of your fingers except for one, move towards your face (at a speed that is fast but not too fast) and touch (but not hit or stab or tickle or pick) your nose.

This probably seems like nothing (and let's not talk about how it pains my science teacher self to generalize down to "your brain sent a message"), but it's a big deal. If you've worked with people with disabilities, or with stroke victims, you know that when something is amiss with a person's ability to motor plan, life becomes very difficult. Motor planning issues are the reason that Maya still can't step up onto a curb without holding a hand (or bending over and holding the curb, which she does if she gets impatient waiting for a hand)---it's too complicated to balance, lift a foot, balance, lean forward, put the first foot down, lift the other foot, balance, put the second foot down next to the first foot, and do it all without falling over.

When I began searching for communication apps and/or devices for Maya, I was surprised to find that none of them took motor planning seriously (except, notably, for PRC devices---but their small device was too small for us, and the language system in the larger device was still a bit too big for then-3-year-old Maya). The big communication apps on the market had words that moved around. Let's say that I started Maya off with 9 words, with a screen like this:


When she was ready to move to 12 words, everything moved. The buttons would shrink to fit more words on the screen, and she had to relearn it all. Imagine what would happen if we jumped from 9 words to 30 words---everything moves to new places.

It doesn't make sense.  No child should have to re-learn how to say a word, ever. Once they have the motor plan to tap-tap and produce a word, the sequence should always be the same.
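If you like to think in code, here's a tiny sketch of the problem (totally made up by me: pretend words, pretend grid sizes, not code from any actual app). When a grid reflows, a button's spot is calculated from how many words are on the screen, so growing the vocabulary shuffles everything:

# A made-up example of a reflowing grid: each word's spot is computed
# from its place in the current vocabulary and the current grid width,
# so adding words (and widening the grid) moves the buttons around.

def reflow_position(word, vocabulary, columns):
    """Pack the words in alphabetical order, left to right, top to bottom."""
    index = sorted(vocabulary).index(word)
    return divmod(index, columns)  # (row, column)

nine_words = ["milk", "more", "help", "eat", "drink", "mom", "dad", "go", "stop"]
twelve_words = nine_words + ["play", "book", "up"]

print(reflow_position("mom", nine_words, columns=3))    # (2, 0)
print(reflow_position("mom", twelve_words, columns=4))  # (1, 3)  "mom" moved!

Every time the grid grows, the child's old motor plans point at the wrong buttons.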

You use buttons (in the form of a computer keyboard) every day . . . and you have the motor plan to access the letters quickly. What would happen if, every night when you went to bed, all of the keys on your keyboard hopped around and changed position?

Not cool, keyboard. Not cool at all.


Suddenly using the computer would be overly complicated, because you would be learning a new motor plan, instead of letting the ones that you have already learned run automatically. 

When we found Speak for Yourself, the communication app that Maya uses, its incorporation of motor planning was groundbreaking.* We started using the app with only 6 or 7 words "turned on," but as we added new words the original 6 or 7 never moved. The fixed positioning of the words meant that she never had to relearn anything---once she knew how to say "milk" she would always know how to say "milk."
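For the programmer types, here's the other approach in another tiny sketch (again, my own made-up illustration with pretend words and positions, not Speak for Yourself's actual code): every word owns a permanent spot in the full grid, and "adding" a word just makes it visible.

# A made-up example of fixed positioning: every word has a permanent
# (row, column) cell, and turning vocabulary on or off only changes
# which buttons are shown, never where they are.

FULL_LAYOUT = {
    "milk": (0, 0), "help": (0, 5), "eat": (1, 2),
    "mom": (2, 0), "medicine": (3, 4),
    # ...and so on, one permanent cell per word, all the way up to 119
}

def render(visible):
    """Show only the turned-on words, each at its permanent cell."""
    return {word: FULL_LAYOUT[word] for word in visible}

visible = {"milk", "help", "eat"}
print(render(visible)["milk"])        # (0, 0)

visible |= {"mom", "medicine"}        # turn on more vocabulary
print(render(visible)["milk"])        # (0, 0)  same spot, same motor plan

The screen fills in over time, but the map in the child's fingers never has to change.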

Look where the HELP icon is, in the top right-ish area.


Now look how HELP is in the same spot, even with all 119 words lit up.


This is kind of an abstract concept until you get a chance to see it in action. About a month ago I was asking Maya questions about the previous week (she had just gotten over a nasty virus) and recorded this. I thought the word "medicine" was turned on (lit up) but it wasn't---watch as her finger traces perfectly along the empty row where the button for medicine should be, before she lights up all of the vocabulary and hits the button. (With the keyguard on, you can really see that she is tracing the exact horizontal row where "medicine" is located.)





See? Motor planning! The same way your fingers know how to type your name without looking, Maya's fingers know how to say medicine with two (very specifically placed) taps. 


In April we had Maya's routine annual eye check-up. The ophthalmologist, who is very nice, was intrigued by the talker and asked several questions about it after the eye exam (which included dilating the pupils) was completed. Maya used the talker to answer a few questions, about friends at school and such. The doctor turned to me and said "You know that she can't see those icons with her pupils dilated, right? She's doing that all from memory."

Not memory, doc. Motor planning. 





*The "LAMP: Words for Life" app, which came on the market last year, also works in accordance with motor planning principles. According to some AAC folks, you can also heavily reprogram some other communication apps to get them to work in accordance with motor planning principles, but that's a little out of my league yet :)  


5 comments:

Run Amy Run said...

This was, by far, the best explanation of motor planning! Fantastic post!

Anonymous said...

Great post. I never thought about that... ok, so now what do I do with my GoTalk NOW app? It does exactly what you said: moves the words when you add more. Maybe start with a full screen and leave spots blank... then I think one would not have to do any reprogramming.


PaytonC said...

I love her sense of humor!

Beth said...

What I noticed also is that some of those words are getting really clear now! I know that wasn't the point of the post, but it was cool nonetheless.

Alana said...

This makes so much sense.