TeacherActivityNet: A Novel Dataset for Monitoring Faculty Activities in Office Settings

ICLR 2025 Conference Submission 13836 Authors

28 Sept 2024 (modified: 28 Nov 2024) · ICLR 2025 Conference Submission · CC BY 4.0
Keywords: Workplace Monitoring, Activity Tracking, YOLOTAN, TeacherActivityNet, Computer Vision
Abstract: In this paper, we introduce a novel dataset for monitoring the activities of faculty members in academic office environments. Advances in computer vision have enabled the automation of workplace monitoring, particularly in educational institutions, where tracking faculty activities presents significant challenges and ethical considerations. Traditional methods of manual supervision are labor-intensive and prone to human error, underscoring the potential of automated video analysis as a more efficient solution. While substantial progress has been made in Human Activity Recognition (HAR) across various domains, research specifically focused on monitoring faculty activities in office settings is limited. Most existing studies concentrate on classroom and student monitoring, leaving a critical gap in faculty activity monitoring. This paper addresses that gap by introducing TeacherActivityNet, a novel video dataset designed for recognizing teachers' activities in academic offices, encompassing nine distinct action classes. We modify the YOLOv8n architecture to propose our model, Teacher Activity Net (YOLOTAN), which is fine-tuned on our dataset and achieves an average precision of 74.9%, significantly outperforming benchmark models. A comparative analysis of our dataset and methods against existing solutions highlights the potential of TeacherActivityNet to improve automated faculty monitoring systems. The dataset, trained models, and accompanying code are available at https://tinyurl.com/4ub94phh
Primary Area: applications to computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 13836