Interaction Techniques for Comparing Video

Published: 13 May 2024, Last Modified: 28 May 2024. Venue: GI 2024. License: CC BY 4.0
Letter Of Changes: We thank the reviewers for their valuable comments and suggestions. We have revised the paper to address all of the issues raised in the reviews, and we believe that this has resulted in a much stronger paper. The summary below is organized around the main comments made by the meta-reviewer and the other reviewers.
- Framing of the user evaluation: We have changed the presentation of the user evaluation throughout the paper to emphasize that it is a "preliminary evaluation"; this has resulted in several changes (e.g., we have revised our statement of contributions to reduce the weight of the evaluation). We have also provided our rationale for the choice of evaluation (i.e., that it is difficult to carry out a standard A/B comparison when several of the comparison tasks we would be testing are not possible in current tools). We have also revised the Discussion to give a more specific indication of the limitations that arise from our evaluation choice, and have added text to the discussion of future work stating that further and more detailed evaluations are important next steps, including additional tests comparing against common free-use video editing and comparison software such as AviSynth and VideoCompare.
- Missing detail in the user evaluation: We have revised the description of the evaluation based on reviewer comments: we added a description of pitch break and why it is desirable; we removed participant gender counts and included standard deviations; we added a description of our scaling scheme to the captions of Tables 2 and 3; and we included additional think-aloud comments from the live recordings.
- Origin of the user questions and comparison tasks: We have revised our framework section to add detail about the origin of the user questions and tasks (i.e., that they are derived from previous work on comparison primitives in information visualization, and are also informed by our work with domain experts who need to carry out comparison tasks in plant science).
- Lack of clarity in some parts of the paper: We have done a full editing pass to revise unclear sections and streamline the flow of the paper; we added a description of how task solutions in the analytical evaluation were chosen; and we changed the description of the questionnaire questions from Likert-style to semantic-anchor. The 7-point and 5-point scales for the usability and utility questionnaires, respectively, were kept because they fit the semantics of each question.
- Name of toolkit: We have changed "Visual Comparison Toolkit" to "Video Comparison Toolkit" throughout the paper, and ensured that capitalization is consistent.
- Figure placement: We have moved Figure 1 to the top of the page so that it appears before Figures 2 and 3, to help with legibility.
Thanks again to the reviewers for their valuable comments and suggestions.
Keywords: video comparison, visual comparison, interaction techniques
Abstract: Comparison is a well-studied task in visual analytics, but there is still little support for comparing temporal streams such as video. A wide range of tasks involve video comparison, yet very few systems or techniques support this kind of analysis. To help address this problem, we have developed new interaction techniques that explicitly support video comparison. We provide techniques for equalizing the reference frame of videos to be compared, juxtaposition techniques for enhancing side-by-side and small-multiples comparisons, superposition techniques for comparing overlaid videos, explicit-encoding techniques that visualize differences between extracted points, and temporal-to-linear techniques that translate between a temporal sequence of frames and a 1D timeline. We built a demonstration system with five different datasets and evaluated our interaction techniques in two ways: an analysis of steps to show their efficiency, and a preliminary user study to explore learnability, utility, and usability.
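To make two of the technique categories named in the abstract more concrete, the following is a minimal sketch (not code from the paper or its toolkit) of superposition via alpha blending of two aligned videos, and a temporal-to-linear mapping from a frame index to a position on a 1D timeline. It assumes OpenCV and NumPy, same-resolution input videos, and hypothetical file names "a.mp4" and "b.mp4".

```python
# Hypothetical sketch: superposition (overlay by alpha blending) and a
# temporal-to-linear mapping of frame index onto a 1D timeline position.
import cv2
import numpy as np

def superpose(frame_a: np.ndarray, frame_b: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Blend two same-sized frames; alpha controls the weight of frame_a."""
    return cv2.addWeighted(frame_a, alpha, frame_b, 1.0 - alpha, 0.0)

def frame_to_timeline_x(frame_idx: int, total_frames: int, timeline_px: int) -> int:
    """Map a temporal frame index to a pixel position along a 1D timeline."""
    return round(frame_idx / max(total_frames - 1, 1) * (timeline_px - 1))

cap_a, cap_b = cv2.VideoCapture("a.mp4"), cv2.VideoCapture("b.mp4")
total = int(min(cap_a.get(cv2.CAP_PROP_FRAME_COUNT), cap_b.get(cv2.CAP_PROP_FRAME_COUNT)))
for i in range(total):
    ok_a, fa = cap_a.read()
    ok_b, fb = cap_b.read()
    if not (ok_a and ok_b):
        break
    blended = superpose(fa, fb)                       # overlaid comparison frame
    x = frame_to_timeline_x(i, total, timeline_px=800)  # cursor position on an 800 px timeline
    cv2.imshow("superposed comparison", blended)
    if cv2.waitKey(1) == 27:                          # Esc to quit
        break
cap_a.release(); cap_b.release(); cv2.destroyAllWindows()
```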
Supplementary Material: zip
Video: zip
Submission Number: 26