Gabor Filters as Initializers for Convolutional Neural Networks: A Study on Inductive Bias and Performance on Image Classification

Published: 03 Jul 2023, Last Modified: 03 Jul 2023
Venue: LXAI @ ICML 2023, Regular Deadline Poster
Keywords: convolutional neural networks, machine learning, deep learning, computer vision
TL;DR: This study demonstrates that integrating Gabor filters into the receptive layer of CNNs improves image classification performance and reduces training time compared to traditional random initialization techniques.
Abstract: This study explores the impact of Gabor filters on the performance of Convolutional Neural Networks (CNNs) in image classification tasks. Prior research has indicated that the receptive filters of CNNs often come to resemble Gabor filters, suggesting their potential as initial receptive filters. We conducted an extensive analysis on several general object datasets, demonstrating that integrating Gabor filters into the receptive layer enhances CNN performance, as evidenced by improved accuracy, higher Area Under the Curve (AUC), and reduced loss. Furthermore, our findings suggest that CNNs equipped with Gabor filters in the receptive layer can achieve better performance in a shorter training period than networks using traditional random initialization techniques.
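The abstract describes initializing the receptive (first) convolutional layer with Gabor filters rather than random weights. Below is a minimal sketch of one way to do this in PyTorch; the paper does not specify the filter-bank hyperparameters, so the orientation/wavelength sweep and the helper names (`gabor_kernel`, `init_first_layer_with_gabor`) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): initialize the first conv layer
# of a CNN with a Gabor filter bank instead of random weights.
import math
import torch
import torch.nn as nn

def gabor_kernel(size, theta, lam, sigma, gamma=0.5, psi=0.0):
    """Return a size x size Gabor kernel with orientation theta and wavelength lam."""
    half = size // 2  # assumes an odd kernel size
    ys, xs = torch.meshgrid(
        torch.arange(-half, half + 1, dtype=torch.float32),
        torch.arange(-half, half + 1, dtype=torch.float32),
        indexing="ij",
    )
    x_t = xs * math.cos(theta) + ys * math.sin(theta)
    y_t = -xs * math.sin(theta) + ys * math.cos(theta)
    envelope = torch.exp(-(x_t ** 2 + (gamma * y_t) ** 2) / (2 * sigma ** 2))
    carrier = torch.cos(2 * math.pi * x_t / lam + psi)
    return envelope * carrier

def init_first_layer_with_gabor(conv: nn.Conv2d):
    """Overwrite the weights of the receptive layer with a Gabor filter bank."""
    out_c, in_c, k, _ = conv.weight.shape
    with torch.no_grad():
        for i in range(out_c):
            # Sweep orientation and wavelength across the bank
            # (these hyperparameter ranges are illustrative assumptions).
            theta = math.pi * i / out_c
            lam = 2.0 + (i % 4)
            kernel = gabor_kernel(k, theta, lam, sigma=k / 4)
            conv.weight[i] = kernel.expand(in_c, k, k)

# Usage: replace the default random initialization of the first layer only;
# deeper layers keep their standard initialization and train as usual.
model = nn.Sequential(nn.Conv2d(3, 32, kernel_size=7, padding=3), nn.ReLU())
init_first_layer_with_gabor(model[0])
```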
Submission Type: Non-Archival
Submission Number: 23