Occasionally one comes across a new bit of kit or a new technique which looks immediately enticing and exciting. This is certainly one: Saul Albert reports on his recent Drawing Interactions project that aims to create new graphical techniques and tools for the transcription, analysis and presentation of interaction research.
Conversation analysts usually show their work using Jeffersonian transcripts with traced outlines or video stills in a ‘film strip’ style. These kinds of graphical transcripts present research for finished publications. But what about the exploratory phases of interaction research such as transcription and collaborative data sessions?
The Drawing Interactions project set out to develop tools and methods for the transcription and analysis of interaction based on artistic drawing techniques and video studies of analysts’ work practices. The project emerged from conversations between Saul Albert, Toby Harris, Pat Healey, Claude Heath, and Sophie Skach after The Fine Art of Conversation workshop, which explored artistic methods for depicting interaction. After seeing how much researchers enjoyed drawing their data, the team planned a hack-session to build software tools to support graphical approaches to transcription.
Developing the prototype
We studied video of analysts’ work practices from the Learning How to Look and Listen project, including a data session by the much-missed Charles Goodwin with John Haviland. Chuck and John often use gestural reenactments alongside descriptions, and they repeatedly mention needing tools to transcribe bodily actions (you can see the kind of thing I mean here). That was a great help in showing us what we wanted the drawing application to be able to do. Based on our analysis of many such moments in Chuck and John’s discussion, we designed three main features into our prototype: letting everyone draw on the data; drawing on a paused moment; and highlighting a moving one.
In my illustrative guide below, I’m drawing onto a clip of a group of designers sitting round a table and discussing a set of plans.
1. Letting everyone have a go
We noticed that whoever controls the laptop tends to direct analytic attention to specific features of the interaction. Others struggle to participate as much. This asymmetry is normal when the ‘owner’ of the data directs the data session towards specific issues; but it’s not always necessary or desirable. So we designed an interface to provide multiple analysts with access to the timeline using a movable ‘film-strip’ style overlay, which analysts could tap to pause, scrub, or review the video.
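To make the idea concrete, here is a minimal sketch of how a film-strip overlay might map a tap to a position on the video timeline. The class and field names are my own illustrative assumptions, not the project's actual code:

```python
# Hypothetical sketch: map a tap on a film-strip overlay to a video timestamp.
from dataclasses import dataclass

@dataclass
class FilmStrip:
    x: float          # left edge of the strip on screen, in pixels
    width: float      # width of the strip, in pixels
    duration: float   # length of the video clip, in seconds

    def tap_to_time(self, tap_x: float) -> float:
        """Convert a horizontal tap position into a seek time."""
        # Clamp the tap to the strip so edge taps seek to the start/end.
        frac = min(max((tap_x - self.x) / self.width, 0.0), 1.0)
        return frac * self.duration

strip = FilmStrip(x=100, width=400, duration=60.0)
print(strip.tap_to_time(300))  # tap at the strip's midpoint -> 30.0 seconds
```

Because every analyst can reach the overlay, any of them can pause, scrub, or review without asking whoever holds the laptop.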
2. Drawing onto the paused timeline
This way you can trace over the video to highlight specific aspects of embodiment. Drawing onto the paused video creates a mark on the little ‘film strip’, which can be touched to return to that moment in the video.
3. Drawing highlights onto the moving timeline
We adapted the ‘telestrator’ feature from TV sports to capture analysts’ own drawn gestures. Moving the timeline while drawing with the stylus creates a moving ‘spotlight’, highlighting a particular feature which you can play back and forth as much as you like.
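One plausible way to implement such a moving spotlight is to record the stylus position against video time while drawing, then interpolate between those keyframes on replay. A minimal sketch under that assumption (the class design is mine, not the project's):

```python
# Hypothetical sketch of a telestrator-style 'spotlight': record stylus
# positions against video time while drawing, then interpolate during replay.
import bisect

class Spotlight:
    def __init__(self):
        self.keyframes = []  # list of (video_time, (x, y)), in time order

    def record(self, t: float, pos: tuple) -> None:
        """Capture the stylus position at video time t while drawing."""
        self.keyframes.append((t, pos))

    def position_at(self, t: float) -> tuple:
        """Linearly interpolate the spotlight position at playback time t."""
        times = [k[0] for k in self.keyframes]
        i = bisect.bisect_right(times, t)
        if i == 0:
            return self.keyframes[0][1]     # before the first keyframe
        if i == len(self.keyframes):
            return self.keyframes[-1][1]    # after the last keyframe
        (t0, (x0, y0)), (t1, (x1, y1)) = self.keyframes[i - 1], self.keyframes[i]
        f = (t - t0) / (t1 - t0)
        return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))

spot = Spotlight()
spot.record(0.0, (0, 0))
spot.record(2.0, (100, 50))
print(spot.position_at(1.0))  # halfway between the two recorded positions
```

Interpolating rather than replaying raw samples is what lets the highlight be scrubbed back and forth smoothly at any playback speed.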
We showcased the prototype at the New Developments in Ethnomethodology workshop organized by Michael Mair at Liverpool University’s London campus.
Sophie Skach explained figure drawing methods, such as using lines and circles to capture postural configurations, bodily volumes and gestural dynamics with a few quick pencil marks. She showed us how to use rapid sketching to do quick and dirty ‘field notes’ rather than transcriptions as such. This technique uses our embodied knowledge of anatomy and motion to infer how bodies are working, what they are doing, and what they might (projectably) do next. Claude Heath demonstrated how to trace on top of video to reveal the negative spaces that groups of people create through interaction. His field inscriptions method involves layering acetate onto a laptop screen, pausing the video and tracing lines of sight, bodily orientations and shared fields of movement.
Finally, Toby Harris demonstrated the prototype and discussed extending the app to use motion capture data to review and annotate scenes from multi-point perspective. Pat Healey also showed his and Sophie’s work collecting and visualizing interactional data from a pair of trousers embedded with sensors to show the non-visible postural shifts of seated interlocutors.
More info and future possibilities
We are now collecting feature requests for another round of development, so please leave your ideas and comments on the project site or email me.