What is Say Say Oh Playmate?

Say Say Oh Playmate combines kids' interest in music with proven literacy instruction techniques to teach word recognition skills in an innovative and fun way. This cross-platform program builds on knowledge students already have, the lyrics of popular songs, to motivate reading and to support beginning literacy skills.

Reading tasks are situated in the familiar context of playground clap routines. With the help of a friendly character, the student's goal is to teach neighborhood girls how to sing and clap traditional routines. Our research has shown that young girls are especially motivated by this task and improve their reading skills when they use the software.


Screen shots and descriptions
Instructional strategies
Evaluation results
Download software

Is the software architecture scalable?

Underneath the Say Say Oh Playmate software is a generalizable architecture that can be used to create different instructional contexts with identical instructional methods. For example, instead of drawing on knowledge of playground songs, another reading program could draw on knowledge of Disney songs. A storyline could be crafted in which a group of Disney characters preparing to enter a sing-along contest need to learn some new songs, and the student's task would be to help them. Imagine Ariel from The Little Mermaid wanting to learn the song Hakuna Matata from The Lion King!

Could we make a series of similar reading programs?

Yes, multiple versions of this software are entirely possible. The architecture on which Say Say Oh Playmate is built is ready to be used as a template through the Lyric Reader authoring program. It allows designers to create their own scenario, using a particular type of music and new characters, on top of a sophisticated engine that implements the instructional techniques.
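To illustrate the idea of one instructional engine shared by many scenarios, here is a minimal sketch in Python. All class and field names (`Song`, `Scenario`, `InstructionEngine`, `word_recognition_tasks`) are hypothetical illustrations of the engine/template separation, not the actual Lyric Reader API.

```python
from dataclasses import dataclass

# Hypothetical sketch: the engine implements the instructional method once;
# each Scenario supplies only the content (songs, characters, theme).

@dataclass
class Song:
    title: str
    lyrics: list        # one line of lyrics per list entry

@dataclass
class Scenario:
    theme: str          # e.g. "playground clap routines" or "Disney songs"
    characters: list    # friendly characters who frame the tasks
    songs: list

class InstructionEngine:
    """Turns any scenario's lyrics into word-recognition tasks,
    regardless of the scenario's theme."""

    def word_recognition_tasks(self, scenario):
        # The same instructional technique for every theme:
        # present each lyric line and ask the student to read its words.
        for song in scenario.songs:
            for line in song.lyrics:
                yield {"song": song.title,
                       "prompt": line,
                       "words": line.lower().split()}

# One engine, interchangeable content: only the scenario differs.
playground = Scenario(
    theme="playground clap routines",
    characters=["neighborhood girls"],
    songs=[Song("Say Say Oh Playmate",
                ["Say say oh playmate", "Come out and play with me"])],
)

engine = InstructionEngine()
tasks = list(engine.word_recognition_tasks(playground))
print(tasks[0]["words"])   # ['say', 'say', 'oh', 'playmate']
```

Swapping in a Disney-themed `Scenario` would reuse the engine unchanged, which is the sense in which the architecture is a template for a whole series of reading programs.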

Read more about Lyric Reader | See more examples of Lyric Reader scenarios



This material is based upon work supported by the National Science Foundation under Grant No. 9984429.

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.