Benchmarking neural lossless compression algorithms on multi-purpose astronomical image data

Published: 09 Oct 2024 · Last Modified: 19 Nov 2024 · Compression Workshop @ NeurIPS 2024 · CC BY 4.0
Keywords: astronomy, physics, astrophysics, compression, neural compression, computer vision, remote sensing
TL;DR: A large astrophysical raw imaging dataset curated for compression benchmarks, with an initial evaluation of neural and classical lossless compression methods.
Abstract: The site conditions that make astronomical observatories in space and on the ground so desirable (cold and dark) demand a physical remoteness that limits data transmission capabilities. These transmission limits directly bottleneck the amount of data that can be acquired, so improving data compression directly benefits the scientific impact of observatories. Traditional methods for compressing astrophysical data are manually designed. Neural data compression, by contrast, holds the promise of learning compression algorithms end-to-end from data while exploiting the spatial, temporal, and wavelength structure of astronomical images. This paper introduces [AstroCompress](https://huggingface.co/AnonAstroData): a neural compression challenge for astrophysics data, featuring four new datasets (and one legacy dataset) of 16-bit unsigned integer imaging data across several modes: space-based, ground-based, multi-wavelength, and time-series imaging. We provide code for easily accessing the data and benchmark seven lossless compression methods (three neural and four non-neural, including all practical state-of-the-art algorithms). Our results indicate that neural compression techniques can enhance data collection at observatories, and we offer guidance on adopting neural compression in scientific applications.
Submission Number: 103
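
Since the datasets are hosted on Hugging Face, accessing a frame and scoring a classical lossless baseline might look like the sketch below. This is illustrative only, not the paper's released code: the repo id, split name, and `image` column are assumptions; check the [AnonAstroData](https://huggingface.co/AnonAstroData) hub for the actual identifiers.

```python
# A minimal sketch, assuming the AstroCompress datasets load via the
# Hugging Face `datasets` library. Repo id, split, and "image" column
# are hypothetical placeholders; substitute the real names from the hub.
import zlib

import numpy as np
from datasets import load_dataset

# Hypothetical repo id under the AnonAstroData organization.
ds = load_dataset("AnonAstroData/SBI-16-2D", split="test", streaming=True)

example = next(iter(ds))
# Astronomical frames are 16-bit unsigned integers; a Hugging Face Image
# feature typically decodes to a PIL image that numpy can wrap directly.
frame = np.asarray(example["image"], dtype=np.uint16)

raw = frame.tobytes()                 # uncompressed: 2 bytes per pixel
packed = zlib.compress(raw, level=9)  # generic lossless baseline (DEFLATE)

# Compression ratio: higher is better. Codecs tuned to astronomical noise
# statistics (or learned neural codecs) should beat this generic baseline.
print(f"shape={frame.shape}  ratio={len(raw) / len(packed):.2f}x")
```

A generic byte-stream codec like DEFLATE is a sensible floor here because it ignores the 2D structure and photon-noise statistics that both the hand-designed astronomy codecs and the neural methods benchmarked in the paper are built to exploit.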