
Thursday, March 28, 2013

mARch: A Visit to the App Store and the Moon

mARch has been a lot of fun! Hopefully it has opened up an awareness of augmented reality and what's available through a handful of simple (or, in Aurasma's case, not quite so simple), cheap apps.

A lesson I will share, however: searching the App Store for augmented reality apps produced hundreds of results, 90% of which were less than useful. Many of these apps are free but serve a single purpose: marketing. An app that makes a Happy Meal box or a Walmart poster come alive is not a great fit for us, and is another example of why a free app is sometimes not worth it.

I wanted to point you also in the direction of fellow enthusiast Jeremy Legaspi, who at The Speech Guy covered some great AR apps such as The Amazing Spider Man AR, Zooburst, Kids Vehicles 1: Interactive Fire Truck, and had his own take on Aurasma.

My two last examples for the month take us into space! Spacecraft 3D (free) can harness kids' interest in space vehicles to target descriptive language. Using this app from NASA and a printed marker, you can view a wide variety of vehicles, including satellites, and their animations. This app is somewhat poignant as NASA's terrific efforts in producing educational technology have just been halted by the budget sequestration.


Also check out MoonWalking ($.99). This app is "positional" AR, similar to TourWrist, except animations of the moon landing are embedded in the view. As the app description says: 
You can walk around the Lunar Module as it lands. Step up to examine the flag. Take a picture of friends and family posing with Armstrong as he places his boot on the moon. At any moment, toggle between the augmented-reality view and the virtual moonscape.

Great sequencing and storytelling possibilities with this one!


Tuesday, March 26, 2013

mARch: Out of Bubbles? Try BubblesAR

Bubbles have long been a go-to speech and language therapy tool. They're great; I am not suggesting you stop using them to target /b/ and /p/ sounds, "wet," and so on.

But, if you run out of bubbles, or want to try something new to engage kids in a joint attention activity, BubblesAR (free) is simple and cool. Tap the circle to blow a bubble in your "room" (the camera is activated). Tap longer and the bubble will be bigger. Whatever the circle is positioned over will be "reflected" in the bubble as it floats, giving you the opportunity to have the child name items in the room.  Tap the bubble to "pop" it for a cool confetti effect.  That's it!


Monday, March 25, 2013

mARch: Create Rooms with SnapShop

SnapShop Showroom (free) is a fun little app that allows you to create and save rooms using furniture from many well-known retailers such as IKEA and CB2. You can use their limited background library, a saved picture from the web (as I did below), or a picture from your camera- the fact that you can snap your current surroundings and then put furniture in them is what makes this app augmented reality:


This video shows an earlier version of the app in action. The above shot shows it works great on iPad as well.

Language Lens:
This would be a nice app to use with older students or adults as well as young ones. For starters, you can develop the category of furniture with this app, along with descriptive attributes such as size, shape, color and function. Additionally, since the app is essentially for shopping (don't worry, you won't accidentally purchase a sofa during a therapy session), pricing information is provided, which would make a great context for a life skills activity around budgeting.

Tuesday, March 19, 2013

mARch: Build Categorization Skills with Adorable AR Animal Flashcards

AR Flashcards-Animal Alphabet ($.99, iPhone/iPad) is another fun marker-based AR visual. After downloading the app, navigate to the website provided to download the animal alphabet flashcards at no additional charge. When printed, they show a 3D animal when viewed within the app.


The animals provided here could be used to target letter-sound relationships or categories- sort by farm, pet, forest animal or whatever! As a follow-up, you can make alphabet cards for other categories using Aurasma!

Common Core Connection:
CCSS.ELA-Literacy.L.1.5a Sort words into categories (e.g., colors, clothing) to gain a sense of the concepts the categories represent.

Monday, March 18, 2013

mARch: Play with Plants in Cactus AR

Cactus AR is a simple, free app for iPhone and iPad (camera required) that provides you with a context to engage students in "taking care of" an augmented reality plant. After downloading the free app, you can email yourself and print a "marker," a piece of paper that functions like a QR Code. Many AR technologies function this way, by scanning a marker that brings up an image- it's very cool and will give you a "wow!" factor with your students!

In addition to providing an animation (a funky-cute little cactus), Cactus AR is interactive! Each day you can check in on your cactus and give it a water mist and fertilizer- just don't overdo it! A meter shows you how much to provide:


Ultimately, Cactus AR is a simple use of augmented reality that replaces the messy process of having kids take care of real plants. In either case, they have an interactive and visual experience that reinforces the categorical (sun, plants, water), causal and quantitative concepts around plant life, a key aspect of the science curriculum. A good, though small, example of FIVES- Fairly Priced, Interactive, Visual, Educationally Relevant, and Speechie.

Common Core Connection:
CCSS.ELA-Literacy.L.1.1g Use frequently occurring conjunctions (e.g., and, but, or, so, because).

Wednesday, March 13, 2013

mARch: Take Virtual Field Trips with TourWrist

Moving into other applications of augmented reality (AR) in interventions, today I am going to talk about TourWrist (free for iPhone, iPod, iPad). TourWrist is one of the most dazzling, yet simple to use, AR apps, transporting you to geographic locations where professional photographers, businesses, and your average Joe have recorded and uploaded "tours." Once you access a tour, you interact with it by moving the iPad right, left, up and down, and turning around as your viewpoint changes, giving you in many cases a 360° view of a place. The tours are naturally still photos, and are not in real time (probably a good thing), but are nonetheless very cool kid-pleasers.

This is therefore a different use of AR than what we saw with Aurasma- instead of scanning a visual material or marker to view digital information, the gyroscope in the device layers a different viewpoint according to your position, making your reality "augmented." Note that this app does work on iPad 1, but that you have to tap/drag to change the position and viewpoint, so it's not quite AR on that device.

A tip about tours: as this is essentially a marketing app, there are views of less salient stuff such as the inside of hotels, etc.  Use the menu to navigate to Points of Interest or Featured locations for better results.  From the displayed map, you can also view the label of the tour, which gives you an idea of the content. Just hit the arrow button to access the tour.


Applying the Language Lens to this app:
-Take virtual field trips to a particular location and elicit descriptive language about the viewpoint.
-Use a setting map/graphic organizer to build knowledge of story grammar elements.
-Align with classroom curriculum by accessing tours that relate to content in the classroom (50 states, landmarks, etc)
-Take a screenshot and use as a stimulus for "I was there..." writing.

And the Common Core Connection (note, this is a great app for older students):
CCSS.ELA-Literacy.SL.9-10.5 Make strategic use of digital media (e.g., textual, graphical, audio, visual, and interactive elements) in presentations to enhance understanding of findings, reasoning, and evidence and to add interest.


Tuesday, March 12, 2013

mARch: Augmenting with Aurasma, Part 3- Using Text and Sharing your Work.

The last several posts here focused on using the Aurasma app to "augment," or layer discoverable visual information, over an image, specifically a book page. These same steps can be used to augment other materials- flash cards, posters, bulletin boards, printed images or student-created art. Part one showed how to use Aurasma's library of images and animations, and part 2 gave steps for using your own images and video as "auras."

In my previous series, I showed how QR codes could be used to display text for language stimulation.  This can be done with Aurasma, as well. However, while you can easily generate a QR code that displays text (I need to update this as I now think other QR generators are easier to use than Kaywa), Aurasma is image-based. So, you have to make your text into an image!  This is easy enough, as you can use a drawing app to write text and save that as an image to the camera roll, or use another app and take a screenshot of the text.

Here's how you do it:


1. Use a drawing app such as Doodle Buddy to write single words to be displayed as images. For example, you can use a conjunction such as "after" to promote complex sentence formulation in context. You could also use vocabulary words. Doodle Buddy lets you save the image, but if you want to write longer text, you could just use an app such as Notes, and take a screenshot.


2. Follow the steps in previous posts to make the text an aura.
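If you prefer to prepare materials on a computer rather than in a drawing app, the same text-to-image trick can be sketched in a few lines of Python. This is just an illustration, assuming the third-party Pillow imaging library (pip install Pillow); the word, canvas size, and file name are hypothetical examples, not anything Aurasma requires:

```python
# Hypothetical sketch: render a single word (e.g., a conjunction) onto a
# white canvas and save it as a PNG, so it can be used as an image overlay.
# Assumes the third-party Pillow library; names and sizes are examples only.
from PIL import Image, ImageDraw

def word_to_image(word, path, size=(400, 150)):
    """Draw one word on a white background and save it as a PNG."""
    img = Image.new("RGB", size, "white")   # blank white canvas
    draw = ImageDraw.Draw(img)
    # Uses Pillow's default bitmap font; load a TrueType font for bigger text.
    draw.text((20, 60), word, fill="black")
    img.save(path)

word_to_image("after", "after.png")
```

Once saved, the PNG can be transferred to your device's camera roll and used as an overlay like any other image.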

As stated in the opening posts, when you make an aura it is available only within Aurasma on that device. Auras can be shared between devices by emailing them as a link, however. These steps are a little complicated and were made more so in the newer version of Aurasma, but I thought I would share them anyway:



You would want to keep auras private on your own password-protected device, rather than sharing, if they contain images and video of students.

That's it for Aurasma! I look forward to sharing a few other apps this month to show you how augmented reality can be useful in your practice, but first, a Common Core Connection related to this post: 
CCSS.ELA-Literacy.SL.3.6 Speak in complete sentences when appropriate to task and situation in order to provide requested detail or clarification.

Monday, March 11, 2013

Upcoming Presentations

Hi Folks-

Just wanted to give you an idea of where I will be the next few months:

March 23, 2013- EdCamp Access, organizing and facilitating a session or two. If you are in the Boston area and can come to this free unconference dedicated to ideas and issues in special education, please do. Click through to find out more about the EdCamp model and to register. I can attest that I learn a TON from innovative PD events such as EdCamps.

March 29, 2013- FREE Webinar for Advance for SLPs and Audiologists. "Pairing Picture Books with Apps to Contextually Address Language Objectives." This is a repeat of my ASHA 2012 session which was a lot of fun!

April 5-6, 2013- Indiana Speech-Language Hearing Association Convention- four sessions covering a variety of topics.

April 12, 2013- Manitoba Speech and Hearing Association Annual Conference-“'Out of the Box' iPad for SLPs- Apps Through a Language Lens!"

April 24, 2013- Social Thinking® Providers' Conference, Chicago, IL- "Social APPtivities- Adapting iPad Apps for Social Thinking"

July 13-14, 2013- ASHA Schools Conference- So excited to be presenting two sessions in Long Beach, CA! "One Digital Story at a Time: Apps to Target Narrative and Expository Language" and "'Out of the Box': Apps through a Language Lens"

Hope to see you around!

Thursday, March 7, 2013

mARch: Augmenting with Aurasma, Part 2- Making it Your Own!

In the last post, we looked at how to use Aurasma's own library of images, animations and 3D models to create an "aura"- an image overlay that appears when you scan a visual material (usually, another image).

Like any great "Speechie" app, Aurasma allows you to use your own images or even videos as auras. As always, this equals limitless contexts for applying the app!

In this post, we will look at how to use materials from your Photos app (aka camera roll and photo album) in the context of augmenting a visual material such as a book. It will be important that you have read Part 1, as I am not going to go through each step; I will just note how creating an aura from your own photos or videos differs.


If you are not sure how to do this step, see this post about Saving Images to iPad.


OR, another option is to create your own images or video using the camera. If you want to augment a material with kids' own drawing or writing, shoot a picture of it!


Note that, as stated, an extra step is involved when using your own images or video- you have to name the image/video file (the overlay), and the aura (I usually keep the names the same). Also, if using a photo or video of a child, keep the file private, not public, when you create the aura (see last post's step 6).


The rest of the steps work the same as in Part 1!

If you are creating a video aura of speaking about a book connection, as I modeled above, a Common Core Connection for you:
CCSS.ELA-Literacy.SL.4.4 Report on a topic or text, tell a story, or recount an experience in an organized manner, using appropriate facts and relevant, descriptive details to support main ideas or themes; speak clearly at an understandable pace.

Next post, in wrapping up this look at Aurasma, we'll be looking at how to add an aura that displays text (since auras are images, can you guess how?) and how to share auras to other devices.

Tuesday, March 5, 2013

mARch: Augmenting with Aurasma, Part 1.

In yesterday's post, I introduced the topic of augmented reality (AR), which can be used to add visuals such as animation, images, video and text to many contexts. We'll be looking at some stand-alone apps that do their specific AR thing, but I wanted to start with Aurasma, the app that was featured in yesterday's video.  This video showed Aurasma applied in schools to give kids a way to link, say, a bulletin board to related images and video.  Aurasma is actually pretty easy to use!

Note: Aurasma is available for free for iPhone, iPod and iPad 2 and above. This is because a camera is essential to the function of this app and many of the others this month. My apologies to readers with an iPad 1. This app is also available for Android, but I can't attest to how it works.

So, first, a context. Let's say you read a picture book with your kids, which I hope you do occasionally because there are so many skills you can build around picture books. What if you could then (after reading) make the picture book an interactive experience with the students, allowing them to scan the book to view, discuss, and respond to images, text, or video associated with the book? What if they could record videos and make these other visuals pop up themselves? They can.

In this post, we are first going to see the steps of creating just one "aura" with the picture book The Big Orange Splot, by Daniel Manus Pinkwater. The steps flow really easily once you see how it works. The Big Orange Splot is a great story about individuality. A man lives on a "neat street" where everything is the same. One day, a seagull drops a can of orange paint on his roof. Instead of just cleaning it off, he allows it to inspire him to make all kinds of interesting changes to his house. His neighbors are at first outraged, then experience the same inspiration.

In this series of steps, you will see how to make an image aura from Aurasma's own library of images "float" above a book page when the page is scanned. Specifically, through these steps a seagull "aura" is accessed when the book's cover is scanned. What could you do with that? It sure is a fun way to prompt retelling and understanding of an initiating event in a narrative. You can think of doing the same type of thing for another visual material, such as a printed or drawn picture, a poster, a flashcard, etc...

When you open Aurasma the camera will be activated:









Give it a try! These 8 steps seem like a lot at first, but you'll see they are a quick, logical series after practicing a few times. 

See my other posts detailing other features of Aurasma:

Oh, and by the way, there's a Common Core Connection:
CCSS.ELA-Literacy.SL.2.2 Recount or describe key ideas or details from a text read aloud or information presented orally or through other media.

Monday, March 4, 2013

mARch- What is Augmented Reality (AR)?

I have been wanting for some time to dedicate some posts to augmented reality (AR) and its potential in speech-language and other interventions. As you can tell, I also wanted to wait until a month that had "AR" in it, and March is my last opportunity until next JanuARy, so here I go.

First of all, the phrase "augmented reality" is probably quite complicated-sounding and scary, so I will assure you that I will try to make it simple and relevant to your work. We may remember "Virtual Reality" as a similar-sounding technology, in which a person could navigate a digital space, often with the use of masks or other complex gear, like in this silly scene from Disclosure.

So, what we will be talking about is not that. Much simpler.

Essentially, augmented reality is any technology that displays digital information in our real world, usually through a laptop or smartphone camera and with the help of "tags" (e.g., a person walks to a place that has a location-marked geotag, and information is displayed) or markers, kind of like a QR code. QR and AR are quite related, so if you never saw my series on the usefulness of that technology, you might want to check it out, as it is likely to give you ideas about how to use AR as well.

AR has been in the news a lot recently because of Google's "Glass" project. Google Glass is a new mobile device being piloted that functions like a smartphone to give you all kinds of information through a set of glasses (email subscribers, be sure to click through to the post to see the 2 video examples):




Believe me, I am not rushing to embrace this, either. I feel like I have enough information in my daily life without flooding my vision with it.  Others have mentioned the potential pitfalls of this technology. Plus, you look like a tool wearing those things.

And that is not what we are talking about either. Instead, this month will cover a few simple apps that you can use on the iPad in order to provide context, visual support, and engagement for our students. For a great example of what is really easily done with this technology in an educational (or clinical) setting, I am going to share one other video. This one has to do with the free and easy-to-use Aurasma app (for iPod, iPhone, iPad 2 and up, and Android) that will be the focus of the next few posts.



So, it's food for thought, isn't it? What can kids do, with your support, with this extra "layer" of information in their environment? More coming soon in mARch.