Quickly Scanning First-Person Videos with Egocentric Elastic Timelines

May 12, 2017

Keita Higuchi, Ryo Yonetani, and Yoichi Sato. 2017. “EgoScanning: Quickly Scanning First-Person Videos with Egocentric Elastic Timelines.” In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17). ACM, New York, NY, USA, 5180-5190.

This work presents EgoScanning, a video fast-forwarding interface that helps users find important events in lengthy first-person videos recorded continuously with wearable cameras. The interface features an elastic timeline that adaptively changes playback speed and, using computer-vision techniques, emphasizes egocentric cues specific to first-person videos, such as hand manipulation, moving around, and conversations with people. Users can also specify which of these cues are relevant to the events they are looking for. A user study confirmed that the EgoScanning interface lets users find events of interest quickly in first-person videos, for three reasons: 1) adaptively changing playback speeds makes fast-forwarded video easier to watch; 2) the emphasized parts of a video serve as candidates for events actually significant to users; and 3) users can select the egocentric cues relevant to the events they seek.
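To make the elastic-timeline idea concrete, here is a minimal sketch of how per-frame playback speeds could be derived from cue detections and a user's cue selection. Everything here is an illustrative assumption rather than the paper's actual implementation: the cue labels, the `slow_speed`/`fast_speed` values, and the 0.5 threshold are all hypothetical.

```python
import numpy as np

def elastic_playback_speeds(cue_scores, selected_cues,
                            slow_speed=1.0, fast_speed=10.0,
                            threshold=0.5):
    """Sketch of an elastic timeline: frames where any user-selected
    egocentric cue is detected play near normal speed; all other
    frames are fast-forwarded.

    cue_scores: dict mapping cue name -> np.ndarray of per-frame
                confidences in [0, 1] (e.g., from per-cue detectors).
    selected_cues: iterable of cue names the user marked as relevant.
    """
    # Combine the selected cues with a per-frame maximum, so a frame
    # is emphasized if any chosen cue fires on it.
    combined = np.max([cue_scores[c] for c in selected_cues], axis=0)

    # Emphasized frames get slow playback; the rest are skimmed.
    return np.where(combined >= threshold, slow_speed, fast_speed)

# Hypothetical detector outputs for a 6-frame clip.
scores = {
    "hand_manipulation": np.array([0.1, 0.9, 0.8, 0.2, 0.1, 0.0]),
    "conversation":      np.array([0.0, 0.0, 0.1, 0.7, 0.9, 0.2]),
}
print(elastic_playback_speeds(scores, ["hand_manipulation"]))
# -> [10.  1.  1. 10. 10. 10.]
```

In this sketch, switching the selected cue from "hand_manipulation" to "conversation" would shift the slow, emphasized segments from the middle of the clip to its second half, mirroring how the interface adapts when users choose different cues for different target events.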