Joint Attention

Team Project

Work presented in:
Computational Behavioral Science Expedition 2014
College of Medicine Research Forum 2014
Designing Interactive Systems 2016

Roles:
Interaction Designer
Visual Designer
User Study Researcher

Collaborators:
John Lee (HCI PhD Candidate)
Hidy Kong (HCI PhD Candidate)
Karrie Karahalios (Project Advisor)

Time:
Fall 2014 - Spring 2015

Paper accepted at DIS 2016
EnGaze: Designing Behavior Visualizations with and for Behavioral Scientists

Design Challenge

Behavior is difficult to quantify, especially for young children who are still developing their social skills. Recently, researchers have been exploring quantitative computational behavior analysis to supplement qualitative evaluations of child behavior. However, it is difficult to build accurate behavioral models from data in its current text form, which makes comparison and analysis difficult. As the designer on the team, my role was to design an interactive visualization that could reveal unique patterns of gaze behavior that cannot be observed first hand.

Visualizing joint gaze patterns in an early screening test for autism in children.

Project Background

This project is part of a larger research initiative. Our group focuses on techniques for behavioral imaging of social and communicative behaviors and on creating graphical visualizations of those datasets. The Joint Attention Visualization is based on behavioral data from annotations of the Rapid Attention Back and Forth Communication Test (RABC), conducted by our collaborators at Emory University and the Georgia Institute of Technology with children aged 9–30 months to collect data about a child's social and communicative behavior as a pre-screener for Autism Spectrum Disorders (ASD). The RABC is a structured experimental social play protocol between a child and an examiner.

Sketch

Exploring different components of joint attention

The definition of joint attention differs from paper to paper. Therefore, I chose to focus on exploring the different components of joint attention and how to visualize them to convey the whole picture effectively, while leaving room for personalization.

I started with hand-drawn designs that visualized the relationship between the child, the examiner, and the two objects involved in the study (the ball and the book). Some designs focused solely on the child's gaze behavior patterns, while others tried to show an aggregated view.

Prototype

Prototyping and testing with real data

Possible solutions were prototyped and tested with real data sets. We used several sets of extreme data (a typically developing child and a child diagnosed with autism) to test whether the visualization design could show the difference at a glance. When tested with real data, the patterns were sometimes quite different from what we had expected. As the work progressed, I gained a deeper understanding of the general patterns and learned how to apply that understanding to the design.
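To ground this, below is a minimal, hypothetical sketch in Python (not the actual prototype code) of the kind of comparison we were after: two children's gaze-to-examiner intervals drawn as parallel timelines so the difference can be judged at a glance. The interval values and the (start, duration) format are invented for illustration, not taken from the RABC annotations.

# Hypothetical sketch: compare two gaze timelines at a glance.
# The intervals below are invented for illustration only.
import matplotlib.pyplot as plt

# (start_s, duration_s) gaze-to-examiner intervals for each child.
child_a = [(2, 1.5), (6, 2.0), (11, 1.0)]   # frequent, sustained glances
child_b = [(4, 0.3), (14, 0.2)]             # rare, very brief glances

fig, ax = plt.subplots(figsize=(8, 2))
ax.broken_barh(child_a, (10, 6), facecolors="tab:blue")
ax.broken_barh(child_b, (0, 6), facecolors="tab:orange")
ax.set_yticks([3, 13])
ax.set_yticklabels(["Child B", "Child A"])
ax.set_xlabel("Session time (s)")
plt.tight_layout()
plt.show()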

Design Version-1

This visualization highlights the quick glances from the child that can be interpreted as bids for attention, which would be an indication of joint attention. Users can also allow gaze events separated by a few milliseconds to be interpreted as joint attention, effectively redefining joint attention while exploring the visualization.
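A minimal sketch of that interaction logic, assuming a simple list of (start_ms, end_ms) gaze events rather than the real annotation schema: gaps shorter than a user-chosen threshold are merged into one continuous episode, which is effectively how adjusting the allowed separation redefines joint attention.

# Hypothetical sketch: merge gaze events separated by short gaps.
# The (start_ms, end_ms) event format is an assumption for illustration.
def merge_gaze_events(events, max_gap_ms=100):
    # Merge events whose gap is at most max_gap_ms, mirroring the
    # user-adjustable threshold in the visualization.
    episodes = []
    for start, end in sorted(events):
        if episodes and start - episodes[-1][1] <= max_gap_ms:
            # Short gap: extend the current episode.
            episodes[-1][1] = max(episodes[-1][1], end)
        else:
            episodes.append([start, end])
    return [tuple(e) for e in episodes]

# Three quick glances; with a 100 ms tolerance the first two merge.
print(merge_gaze_events([(0, 400), (450, 900), (2000, 2300)]))
# [(0, 900), (2000, 2300)]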


Design Version-2

Joint attention has been defined in various ways in the past. Thus, rather than presenting one finalized view of joint attention, we provide separate views for its different behavioral components. In that way, users can define joint attention for themselves as they explore the gaze data and discover the patterns of what they consider to be 'joint attention.'
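As a rough illustration of what separate component views can look like in data terms, the sketch below derives two hypothetical tracks, shared attention on an object and mutual gaze, from per-target gaze intervals. The field names and data format are assumptions, not the real annotation schema.

# Hypothetical sketch: derive separate behavioral-component tracks
# from two annotated gaze streams. Field names are illustrative only.
def intersect(track_a, track_b):
    # Return the time intervals (ms) where the two tracks overlap.
    overlaps = []
    for a_start, a_end in track_a:
        for b_start, b_end in track_b:
            start, end = max(a_start, b_start), min(a_end, b_end)
            if start < end:
                overlaps.append((start, end))
    return overlaps

# Per-target gaze intervals for each participant (assumed format).
child = {"ball": [(0, 1200), (3000, 4000)], "examiner_face": [(1300, 1600)]}
examiner = {"ball": [(500, 2500)], "child_face": [(1200, 1800)]}

# Each behavioral component gets its own view instead of one merged signal.
views = {
    "shared attention (ball)": intersect(child["ball"], examiner["ball"]),
    "mutual gaze": intersect(child["examiner_face"], examiner["child_face"]),
}
for name, intervals in views.items():
    print(name, intervals)
# shared attention (ball) [(500, 1200)]
# mutual gaze [(1300, 1600)]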


Feedback and Progress

We have presented our work to 11 clinicians and researchers who work with children with ASD. The interactive web tool has allowed them to form their own unique strategies for exploring the visualizations and to independently identify children in need of further evaluation. Feedback from users has shown that our visualizations have the potential to be integrated into existing behavioral evaluation processes and to aid in detecting developmental delays in young children.