Look Meta-Tool

Copyright © Tatar 2006

In our handheld math, science, and architecture projects, we have been building coordinated activities for distributed learning. One challenge that has arisen in this work is allowing people to see what others are looking at: handheld screens are small and prone to glare. People do sometimes show others what they are doing, for example by handing their handheld to someone else, but this assumes that the participants are already engaged in focused interaction. Furthermore, they must be extremely close to one another to continue working together.

My student Kibum Kim and I have been building different versions of a meta-tool, "Look," designed to give peripheral participants (latecomers and overhearers) entry into conversations about work or other coordinated activity already happening on the handhelds.

This project pushes at once against the technical limitations of the handhelds and against the limits of our social science knowledge. The question is whether the aspects of social and technical design that we control can overcome the intrinsic constraints of many different small machines, or the particular limitations of standardized machine protocols, well enough to allow the kind of sharing that human interaction and learning require.

Although people rarely complain explicitly when they cannot see what others are doing, that will not stop them from "voting with their feet" about the utility of the technology.

We currently have two papers on this topic:

Kim, K., Tatar, D., and Harrison, S. (2006). Handheld-Mediated Communication to Support the Effective Sharing of Meaning in Joint Activity. In Proceedings of WMUTE 2006, the 4th IEEE International Workshop on Wireless, Mobile and Ubiquitous Technology in Education, November 17-18, Athens, Greece.

Kim, K., and Tatar, D. (2005). Weak Guidance with "Look" Functionality in Handheld-based Classroom Activities. In Proceedings of the 2005 Conference on Computer-Supported Collaborative Learning (CSCL 2005), Taipei, Taiwan, May 29-June 4, 2005.
