Abstract: Diagnosing aesthetic perception plays a crucial role in deepening our understanding of student creativity, emotional expression, and the pursuit of lifelong learning within art education. This task encompasses the evaluation and analysis of students' sensitivity, preference, and capacity to perceive and appreciate beauty across different sensory domains. Currently, this assessment frequently relies on subjective evaluation of student artworks, which suffers from two limitations: 1) the diagnosis may be affected by instructors' biases, and 2) comprehensive assessment imposes a heavy workload on instructors. These limitations motivate us to ask: Can we conduct aesthetic perception diagnosis automatically and objectively? To this end, we propose an innovative deep hybrid framework, AestheNet, which automatically evaluates aesthetic perception by analyzing a large collection of student paintings. More specifically, we first utilize convolutional neural networks to extract salient features from the student artworks. Then, we employ a transformer model to capture the intricate relationships among multiple aesthetic perception dimensions for objective diagnosis. Finally, we validate the effectiveness of the framework on a new dataset consisting of 2153 paintings drawn by 675 students, annotated by human experts along 77 dimensions grounded in domain expertise. Extensive experiments demonstrate the effectiveness of AestheNet in aesthetic perception diagnosis. By overcoming the subjectivity inherent in traditional assessment methods, AestheNet provides a new, quantifiable, and standardized way to evaluate aesthetic perception. This research not only offers new perspectives on understanding students' aesthetic development during art education but also explores the use of artificial intelligence technologies in the assessment of art education.
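To make the hybrid CNN-plus-transformer design concrete, the following is a minimal sketch of one possible instantiation. The abstract does not specify the backbone, the number of attention heads or layers, or how the 77 aesthetic-perception dimensions are represented, so every concrete choice here (the ResNet-18 backbone, per-dimension query tokens, the class and parameter names such as HybridAestheticScorer and num_dims) is an illustrative assumption rather than the authors' actual implementation.

```python
import torch
import torch.nn as nn
from torchvision import models


class HybridAestheticScorer(nn.Module):
    """A CNN backbone extracts painting features; a transformer encoder then
    relates per-dimension query tokens to the image tokens and to each other,
    producing one score per aesthetic-perception dimension."""

    def __init__(self, num_dims=77, d_model=256, num_heads=4, num_layers=2):
        super().__init__()
        # Pretrained CNN feature extractor (classification head removed).
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.cnn = nn.Sequential(*list(backbone.children())[:-2])   # -> (B, 512, H', W')
        self.proj = nn.Conv2d(512, d_model, kernel_size=1)

        # One learnable query token per aesthetic-perception dimension.
        self.dim_queries = nn.Parameter(torch.randn(num_dims, d_model))

        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=num_heads, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.score_head = nn.Linear(d_model, 1)

    def forward(self, images):
        # images: (B, 3, 224, 224)
        feats = self.proj(self.cnn(images))             # (B, d_model, H', W')
        tokens = feats.flatten(2).transpose(1, 2)       # (B, H'*W', d_model)
        queries = self.dim_queries.unsqueeze(0).expand(images.size(0), -1, -1)
        # Self-attention over [dimension queries; image tokens] lets each
        # dimension attend to the painting and to the other dimensions.
        x = self.transformer(torch.cat([queries, tokens], dim=1))
        dim_states = x[:, : self.dim_queries.size(0)]   # keep only the query slots
        return self.score_head(dim_states).squeeze(-1)  # (B, num_dims) scores


if __name__ == "__main__":
    model = HybridAestheticScorer()
    scores = model(torch.randn(2, 3, 224, 224))
    print(scores.shape)  # torch.Size([2, 77])
```

Under these assumptions, the per-dimension scores could be trained against the expert annotations with a regression or multi-label objective; the actual loss and training setup used by AestheNet are not described in the abstract.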