Predicting the Encoding Error of Implicit Neural Representations

23 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Coordinate networks, Implicit neural representations, SIREN, compression
TL;DR: Given SIREN network hyperparameters and a target image, we predict the encoding error that the SIREN will reach on that image.
Abstract: Implicit Neural Representations (INRs), which encode signals such as images, videos, and 3D shapes in the weights of neural networks, are becoming increasingly popular. Among their many applications is signal compression, for which there is great interest in achieving the highest possible fidelity to the original signal subject to constraints such as neural network size and training (encoding) and inference (decoding) time. Yet training INRs can be computationally expensive, making it costly to know whether one has made the best possible tradeoff under such constraints. Toward this end, we propose a novel problem: predicting the encoding error (training loss) that an INR will reach on a given training signal. We present a method that predicts the encoding loss a popular INR network (SIREN) will reach, given its network hyperparameters and the signal to encode. Our predictive method demonstrates the tractability of this regression problem and allows users to anticipate the encoding error a SIREN will reach in milliseconds instead of minutes or longer. We also offer insights into SIREN behavior, such as why narrow SIRENs can have very high random variation in encoding loss, and how the performance of SIRENs relates to JPEG compression.
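To make the predicted quantity concrete, the sketch below (not the authors' code) shows how an encoding error is obtained the slow way: a small SIREN is fit to a single image by gradient descent, and the final reconstruction MSE is the "encoding error" that the proposed predictor would instead estimate from the hyperparameters and the image alone. The width, depth, omega_0, learning rate, and step count here are illustrative assumptions, not the settings studied in the submission.

```python
# Minimal sketch: fit a SIREN to one image and report its encoding error (MSE).
# Hyperparameter values are illustrative assumptions, not the paper's settings.
import torch
import torch.nn as nn


class SineLayer(nn.Module):
    """Linear layer followed by sin(omega_0 * x), with SIREN-style init."""

    def __init__(self, in_f, out_f, omega_0=30.0, is_first=False):
        super().__init__()
        self.omega_0 = omega_0
        self.linear = nn.Linear(in_f, out_f)
        with torch.no_grad():
            # First layer: U(-1/in, 1/in); later layers: U(-sqrt(6/in)/omega, ...)
            bound = 1.0 / in_f if is_first else (6.0 / in_f) ** 0.5 / omega_0
            self.linear.weight.uniform_(-bound, bound)

    def forward(self, x):
        return torch.sin(self.omega_0 * self.linear(x))


class SIREN(nn.Module):
    """Maps 2D pixel coordinates to RGB values."""

    def __init__(self, hidden=64, layers=3, omega_0=30.0):
        super().__init__()
        net = [SineLayer(2, hidden, omega_0, is_first=True)]
        net += [SineLayer(hidden, hidden, omega_0) for _ in range(layers - 1)]
        net += [nn.Linear(hidden, 3)]
        self.net = nn.Sequential(*net)

    def forward(self, coords):
        return self.net(coords)


def encoding_error(image, steps=500, lr=1e-4, **siren_kwargs):
    """Train a SIREN on `image` (H, W, 3 tensor in [0, 1]); return final MSE."""
    h, w, _ = image.shape
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, h),
                            torch.linspace(-1, 1, w), indexing="ij")
    coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
    target = image.reshape(-1, 3)

    model = SIREN(**siren_kwargs)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = ((model(coords) - target) ** 2).mean()
        loss.backward()
        opt.step()

    with torch.no_grad():
        return ((model(coords) - target) ** 2).mean().item()


if __name__ == "__main__":
    img = torch.rand(32, 32, 3)  # stand-in for a real target image
    print("encoding error (MSE):", encoding_error(img, hidden=64, layers=3))
```

The point of the paper's predictor is to skip the optimization loop above: the full fit takes minutes per image and hyperparameter setting, whereas a regression from (hyperparameters, image) to the final loss answers the same question in milliseconds.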
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7833