Individual Submission Summary
Getting To Know Your Data: Visualization and Quantification of Temporal Dynamics of Interaction

Thu, March 21, 9:30 to 11:00am, Hilton Baltimore, Floor: Level 2, Key 4

Integrative Statement

Learners’ real-time interactions in a dynamic ecosystem shape their outcomes across domains. These interactions are complex and notoriously difficult to study (Baumeister, Vohs, & Funder, 2007): behavior unfolds across many modalities or dimensions, from physiology and affect to gaze and embodied activity, and across many timescales, from seconds (e.g., a contingent gaze shift) to minutes and hours (e.g., an argument) to years (e.g., a healthy relationship).

New technology has dramatically increased our ability to collect multimodal data at a fine temporal scale, both in and out of the laboratory. Such data have historically included annotations of video or audio recordings, and increasingly include markers of behavior automatically detected by wearable sensors such as the LENA© system. Leveraging these massive new datasets to characterize the complex processes of development is an outstanding challenge for research in developmental science. Theories are often underspecified as to the exact nature of unfolding interactions, and there are few if any “off the shelf” analytical tools that can characterize the dense, multimodal dynamics of development using emerging datasets (Gnisci, Bakeman, & Quera, 2008).

This presentation will focus on the use of visualizations as a key tool for making sense of and analyzing high-density datasets of early social interactions. We will illustrate the use of visualizations across the research process. Early on, visualizations are critical for gaining insight into the structure of interaction data and for suggesting the most relevant quantitative techniques for further analyses. Later, visualizations can ensure the validity and quality of operationalizations of the phenomena of interest and help interpret observed results. Rather than focusing on a single computational technique or statistical method, this presentation will highlight the potential to combine insights gleaned from visualizations with theory in order to adapt and combine quantitative techniques and effectively characterize various aspects of interactions. These points will be illustrated with examples drawn from diverse datasets, including multimodal video annotations of free-flowing social interaction as well as continuous physiology and gaze data from a controlled lab setting.
