The We Do Listen Foundation has produced some great books that are available both as hardcovers and as free animated versions on their website. The stories feature Howard B. Wigglebottom, who often commits a series of social errors and learns from them, thus providing a good context for teaching story grammar. In particular, "Howard B. Wigglebottom Learns to Listen" is a helpful additional context for teaching Whole Body Listening (a term originally coined by Susanne Truesdale and also discussed in Nita Everly's Can You Listen With Your Eyes and Kristen Wilson and Elizabeth Sautter's Whole Body Listening Larry books). I also like that the story provides an opportunity to discuss perspective taking, as others notice and are affected by Howard's difficulties in listening. To further explore these concepts, see the work of Michelle Garcia Winner at Social Thinking®. The site's playable animated books and displayable/printable posters are also iPad-friendly.
Tuesday, May 28, 2013
Tuesday, May 21, 2013
Wikiweb Shows Connections Between Topics
While Wikipedia is far from the gold standard of research sources, it gives a good general overview of topics and, in my experience, is quite well written. It is therefore a helpful resource for developing reading comprehension and background knowledge, and for practicing strategies for vocabulary and breaking down expository text.
I discovered Wikiweb, which displays Wikipedia articles as a semantic web of connections between ideas, because it was free at Starbucks, but at its regular price of $2.99 I think it's still fairly priced. It has a beautiful look and feel, and the added feature of displaying visual connections between topics is potentially very useful for therapy. The articles as displayed can be selected in order to activate Speak Selection, so they can be read aloud as well.
This video from Wikiweb is a bit strange, and makes it seem like you should use the app just so you don't become Claire Danes in Homeland, but it gives you more of an idea of how the app works.
Thursday, May 16, 2013
Explore Nature with Pepi Tree
Pepi Tree (free Lite version available, full version $1.99) is a fun little app presenting "mini-games" at different levels of a tree, all illustrating in some way the life and "work" of animals in the forest. Mini-games are an interesting concept for therapy, as they can be used as different contexts for concepts, language structure and vocabulary development within the same app. For example, within this app, the caterpillar mini-game can be used to target sequential words (the caterpillar eats a leaf, then goes into a cocoon, and finally becomes a butterfly) as well as some/all (he needs to eat all of the leaf before going into the cocoon). There is also a fox-feeding activity that can be used to address negative words (the foxes don't like all the food) and one emphasizing the curriculum concepts around what plants need to grow. It's definitely worth grabbing the Lite version for your young students and deciding if you'd like all the games in the full version.
Tuesday, May 14, 2013
Enjoy Warmer Weather with Ice Cream Truck
Apps that go beyond a simple screen and involve people around a play space have great potential for interactivity and "speechieness," or using language in the context of the app. Ice Cream Truck ($1.99) is one of those apps- it's somewhat in the vein of Toca Store but it has additional contexts. With Ice Cream Truck, your young students can "drive" the iPad around a space (incorporating augmented reality through the camera) and decide where to park for their customers, using the horn and music to signal it's time to buy ice cream!
There are several modes that interact with each other, almost like spaces within the truck- stack ice cream scoops on one screen, mix yogurt on another, and bring it all to the cash register screen. Overall, a great context for descriptive language, requesting, and all sorts of language structures, as well as building play skills.
Common Core Connection:
CCSS.ELA-Literacy.SL.K.4 Describe familiar people, places, things, and events and, with prompting and support, provide additional detail.
Labels: apps, Common Core, concepts, description, early elementary, life skills, math, play skills, sequencing
Thursday, May 9, 2013
Summer Gigs
Although it is nice to be slowing down a bit as the school year winds down, I am looking forward to some summer events I would like to mention. Hope to see some of you at one of these!
July 12-13, 2013- Long Beach, CA: ASHA Schools Conference- Presenting two sessions, "One Digital Story at a Time: Apps to Target Narrative and Expository Language" and "'Out of the Box': Apps through a Language Lens." I will also be facilitating a roundtable discussion during Friday's event.
Click for More Information
July 22, 2013- Boston MA: EdCamp BLC at Building Learning Communities- I am helping to organize this free unconference in which the agenda is built and executed by participants. Now a veteran of 7 Edcamps, I will say again that I learn more at these events than I do from traditional PD!
Click for More Information and Registration- now open!
August 19, 2013- Fredericton, New Brunswick, Canada: Exploring the iPad for Language-Based Teaching and Interventions- In this day-long workshop, participants will learn "top tech tricks" for utilizing the iPad as a teaching and learning tool, including accessing photo/video production and organizational strategies, accessibility functions, and other native apps. A wide variety of apps in various categories that support students with language difficulties and other learning needs will be demonstrated, along with an evaluation framework for choosing apps for intervention and special education. Attendees will also choose from a selection of free apps to create a project to use with students, and access a variety of information resources to continue learning about technology integration.
Click for Registration
Tuesday, May 7, 2013
Celebrate Speech with a "Silent Film"
Another way to promote awareness of the importance of speech during Better Speech and Hearing Month (or at other times) is to explore the idea of silent film. Many kids are unaware that there were ever films with no sound, and any language-neutral visual can be a great context for having kids generate language. I found this treasure, "Unspoken Content: Silent Film in the ESL Classroom," in a quick search on the topic. The article describes how "The Painted Lady," available, like many silent films, on YouTube (or via PlayTube to cache it if YouTube is blocked), can be used to target narrative and metalinguistic awareness.
I mention all this primarily because Google has recently unveiled a cool new resource: The Peanut Gallery. This website (not accessible on iPad, and it only works in Google Chrome) allows you to dictate language that will appear as "silent film" titles over any of a selection of more than a dozen old movie clips. The site uses Google's Web Speech API and is remarkably accurate: just speak, and it converts your speech to text within titles over the movie clip, which is then saved and shareable.
You can see one of my attempts at it here.
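If you're curious about the technology behind it, the dictation piece is something any webpage running in Chrome can tap into. Below is a minimal sketch of how a page might use Chrome's prefixed Web Speech API constructor to turn spoken language into a title card; the #title-card element and the setup around it are my own illustration, not how Google actually built The Peanut Gallery.

    // Minimal sketch (assumes Chrome's prefixed constructor and a hypothetical #title-card element).
    const Recognition = (window as any).webkitSpeechRecognition;
    const recognition = new Recognition();
    recognition.lang = "en-US";
    recognition.interimResults = false; // wait for finished phrases rather than partial guesses
    recognition.maxAlternatives = 1;

    recognition.onresult = (event: any) => {
      // Whatever was spoken becomes the text of the "silent film" title card.
      const transcript = event.results[0][0].transcript;
      const card = document.querySelector("#title-card");
      if (card) card.textContent = transcript;
    };

    recognition.onerror = (event: any) => {
      console.warn("Speech recognition error:", event.error);
    };

    // Start listening when the student is ready to narrate.
    recognition.start();

The site, of course, handles all of this for you; the point is just that the speech-to-text piece is standard browser technology, so there is nothing to install before using it with students.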
The Language Lens on this site, then, is that it provides you with many contexts to have students analyze situations (characters, settings, ongoing events) and generate narration related to this, which employs the interpretation of body language and emotions, as well as, potentially, metalinguistics such as sarcasm and understatement.
You will need to insert test dialogue (e.g. "Action" or "Oh no!") just to make the film proceed at first, so that kids get the context and can plan dialogue for a second or third try (or more), as improvising may be too difficult without your scaffolding.
Common Core Connection
CCSS.ELA-Literacy.RL.4.2 Determine a theme of a story, drama, or poem from details in the text; summarize the text.
Wednesday, May 1, 2013
Open Better Speech and Hearing Month with Sound Uncovered
Sound Uncovered is a gorgeous free app for iPad that you can use as a context to introduce Better Speech and Hearing Month to your students. The app explores the science of sound through a variety of interactives that you can use as a bridge to discussions about hearing conservation. In addition, the app itself provides a lot of content that you can use for mapping expository text, as it presents a great deal of information about how sound works. Sound is also a key curriculum area in "energy, light and sound" science units.
A sampling of the activities:
Which Car Would You Buy- Presents sounds produced by cars and car parts and links these to purchasing decisions.
What's Making This Sound? and Sounds Like?- Have you listen to sounds or people's descriptions of a sound in order to guess what they are talking about (inferences!).
Eyes vs. Ears- Talk about "listening with your eyes" while exploring how visual input helps us understand sounds.
Stop Me If You've Heard This One- Demonstrates that you can't talk and listen at the same time! I have a lot of students who could benefit from that one...
Ultimately, Sound Uncovered is probably best for older (upper elementary, MS/HS) and high-functioning students, but the interactives could be adapted for young students. Exploratorium, the interactive museum of science, art and perception in San Francisco, offers a similar open-ended app called Color Uncovered, which also looks to be a good context for eliciting language and description.
Common Core Connection:
CCSS.ELA-Literacy.SL.6.1d Review the key ideas expressed and demonstrate understanding of multiple perspectives through reflection and paraphrasing.
Monday, April 29, 2013
Skitch it up!
Skitch has long been a favorite of mine from the Evernote suite of tools. A super-easy, cross-platform, free and powerful photo annotation tool, it was originally developed by the makers of Comic Life, and you'll see why this makes sense.
So much of language involves breaking a whole into its parts, labeling, or describing, and Skitch gives you an engaging and beautiful way to do this with students. Skitch allows you to build a diagram or graphic organizer from scratch (its tools include arrows, shapes, text, highlights or drawn lines) or apply all of these annotations to a snapped or saved photo.
Skitch also allows you to annotate a snap of a website or map, as I have done above, and recently added the ability to annotate PDFs, though this is a premium feature.
For another example, see how a client of mine (in his own way) analyzed a confusing-to-navigate Bruegger's Bagels location using Sarah Ward's executive functioning concept of "zones" (as part of the Space/Time/Objects/People "STOP" strategy, and not to be confused with Leah Kuypers' equally awesome Zones of Regulation).
"Grabbage" being the self-serve area.
Skitch naturally integrates with Evernote to save your work, but you can use the app without an account and save to the Camera Roll; you just won't be able to edit your work later.
Think of Skitch as a context for:
-Describing facial expressions on snapped pictures within a group or with role-play.
-Integrating with picture books and creating a diagram from concept-rich pages that you snapshot with the Camera.
-Saving and describing images related to the curriculum.
-Young and old...check out Kindergarten teacher Matt B. Gomez's post about how he uses Skitch with his youngsters.
What else? Share your ideas in the comments!
Wednesday, April 24, 2013
Penultimate: a Visual Take on Notes
All students, but particularly ours, benefit from seeing visual examples and connections between ideas.
This is the main function of Penultimate, another free tool from the folks at Evernote. Penultimate allows you to create sketch-based journals that then sync with your Evernote account.
You can write on the page (I recommend having a cheap stylus available for this- get a Targus or iHome one from Amazon, Marshall's or TJ MAXX) or insert images:
The AMAZING thing about this is that your printed text is then searchable in Evernote!
Nuts!
So, grab a stylus and get visual! Here are my thoughts on Penultimate:
What are your thoughts?
Monday, April 22, 2013
Take a "Peek!"
A good chunk of our work as SLPs is based on questioning students in some manner. What does that word mean? What are the parts of a ____? What goes with a ____? What are some examples of ____?
Not to mention all the how or why questions.
It's definitely a plus when we can up the engagement and fun on these types of interactions with students.
In my last post, I started looking at Evernote through that eponymous app, and there are a number of ancillary Evernote apps that work with the overall ecosystem. Today, let's look at Evernote Peek, an app that allows you to make fun little quizzes with an ingenious use of the iPad.
Peek gained attention as the first (and, I think, still the only) app to interact with the iPad Smart Cover.
Better shown than described. In short, Peek lets you set up word or question lists, along with definitions or answers, and then interact with them using a Smart Cover. Now, don't stop reading if you don't have a Smart Cover, because the app features a "virtual" one!
Here's how you do it:
First, of course, download Evernote Peek and Evernote, and create an account in Evernote. In Evernote, create a notebook for your "Peek set."
Each item in your set needs to have a note within that notebook. Create notes in which the trigger word (a vocabulary word or question) is in the title of the note. This is what will appear under the first flap as shown above. Write the definition or answer in the body of the note. This will appear as you continue to lift up the cover.
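For most of us, creating these notes by hand in Evernote is all it takes, but the structure (trigger word in the title, answer in the body) also maps neatly onto Evernote's developer SDK if you or a tech-savvy colleague ever wanted to generate a large Peek set automatically. The sketch below uses the Evernote SDK for JavaScript as I recall it; the token, notebook GUID, and helper name are placeholders, so check Evernote's current developer documentation before relying on the details.

    // Rough sketch only: one Peek item = one Evernote note.
    // Placeholders: EVERNOTE_TOKEN, PEEK_NOTEBOOK_GUID; verify SDK names against Evernote's docs.
    import * as Evernote from "evernote";

    const EVERNOTE_TOKEN = "your-developer-token";
    const PEEK_NOTEBOOK_GUID = "guid-of-the-notebook-you-add-in-Peek";

    const client = new Evernote.Client({ token: EVERNOTE_TOKEN, sandbox: true });
    const noteStore = client.getNoteStore();

    function addPeekItem(trigger: string, answer: string) {
      const note = new Evernote.Types.Note();
      note.title = trigger;                   // appears under the first flap of the Smart Cover
      note.notebookGuid = PEEK_NOTEBOOK_GUID;
      // Note bodies are ENML, Evernote's XHTML dialect (real code would escape XML in the answer).
      note.content =
        '<?xml version="1.0" encoding="UTF-8"?>' +
        '<!DOCTYPE en-note SYSTEM "http://xml.evernote.com/pub/enml2.dtd">' +
        "<en-note>" + answer + "</en-note>";  // revealed as the cover lifts further
      return noteStore.createNote(note);
    }

    addPeekItem("arid", "Very dry; getting little or no rain");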
Once you have created your notes, open Evernote Peek. Tap the Plus to add a notebook, tap My Notebooks, and select the notebook you created in Evernote.
Select the notebook from the Peek menu and start your quiz! Here you can see what the virtual Smart Cover looks like...
A few notes.
-Some of the other content you can add to Peek is worth exploring. In particular, the Test Prep notebooks in the Education section provide good models of mnemonics for vocabulary words.
-My Smart Cover no longer interacts with Peek. I have no idea why. You can turn on the virtual Smart Covers in Peek Settings.
-Peek is a great app to use in consultation- teach students how to make Peek sets. Mine thought it was super cool!
How do you see yourself using Peek?