STAF: Sinusoidal Trainable Activation Functions for Implicit Neural Representation

Published: 27 Sept 2024 (modified: 23 Jan 2025) · ICLR 2025 Conference Withdrawn Submission · License: CC BY 4.0
Keywords: Implicit Neural Representation, Activation Functions, Parametric Activation Functions, Neural Tangent Kernel
TL;DR: STAF introduces sinusoidal trainable activation functions that overcome the spectral bias of ReLU networks, improving the reconstruction of high-frequency signal details. STAF networks learn faster, achieve higher accuracy, and outperform methods such as KAN and SIREN in PSNR.
Abstract: Implicit Neural Representation (INR) has emerged as a promising method for characterizing continuous signals. This paper addresses the spectral bias exhibited by conventional ReLU networks, which hampers their ability to reconstruct fine details in target signals. We introduce Sinusoidal Trainable Activation Functions (STAF), designed to model and reconstruct diverse complex signals with high precision. STAF mitigates spectral bias, enabling faster learning of high-frequency details compared to ReLU networks. We demonstrate STAF's superiority over state-of-the-art networks such as KAN, WIRE, SIREN, and Fourier features, achieving higher accuracy and faster convergence with superior Peak Signal-to-Noise Ratio (PSNR). Our extensive experimental evaluation establishes STAF's effectiveness in improving the reconstruction quality and training efficiency of continuous signals, making it valuable for various applications in computer graphics and related fields.
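The abstract describes activation functions built from sinusoids with learnable parameters. The exact STAF parameterization is not given on this page, so the following is only an illustrative sketch: a hypothetical activation of the form phi(x) = sum_i a_i * sin(omega_i * x + b_i), where the amplitudes a_i, frequencies omega_i, and phase shifts b_i would be trained alongside the network weights. The class name, term count, and initialization ranges are assumptions, not the paper's method.

```python
import numpy as np

class SinusoidalActivation:
    """Hypothetical trainable sinusoidal activation (illustrative only):
    phi(x) = sum_i a_i * sin(omega_i * x + b_i).
    In an INR, a_i, omega_i, b_i would be learnable parameters updated by
    gradient descent; here we only implement the forward evaluation.
    """

    def __init__(self, num_terms=4, seed=0):
        rng = np.random.default_rng(seed)
        # Assumed initialization: small random amplitudes, a spread of
        # frequencies so high-frequency content is representable early.
        self.amp = rng.normal(size=num_terms) / num_terms     # a_i
        self.freq = rng.uniform(1.0, 30.0, size=num_terms)    # omega_i
        self.phase = np.zeros(num_terms)                      # b_i

    def __call__(self, x):
        # Broadcast x against the term axis and sum the sinusoids.
        x = np.asarray(x, dtype=float)[..., None]
        return np.sum(self.amp * np.sin(self.freq * x + self.phase), axis=-1)
```

Applied elementwise to a hidden layer's pre-activations, such a function can pass frequencies that a ReLU would have to approximate piecewise-linearly, which is the intuition behind mitigating spectral bias.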
Supplementary Material: zip
Primary Area: applications to computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9114