Keywords: Tensor Decomposition; Off-the-grid Sampling; Implicit Neural Representation (INR)
TL;DR: A continuous Tensor Ring network with implicit neural representation for efficient large-scale off-the-grid 5D data reconstruction.
Abstract: Due to cost and sampling constraints, real-world scientific data are often incomplete, posing significant challenges for data interpretation.
As a result, interpolation becomes a fundamental task in scientific data processing.
However, these constraints also lead to off-the-grid observations, which prevent the direct application of conventional methods.
Gridding strategies are therefore commonly used to map irregular samples onto a regular grid, but this process inevitably introduces approximation errors.
Meanwhile, the rapid growth of data volume in large-scale surveys places heavy demands on computational efficiency.
To address these issues, we propose a Continuous Tensor Ring Network (CTR-Net), which integrates Tensor Ring (TR) decomposition with Implicit Neural Representation (INR) for large-scale off-the-grid data reconstruction.
By combining continuous coordinate modeling with low-rank priors, CTR-Net effectively captures the intrinsic structure of high-dimensional data.
A LoRA-based progressive learning scheme enables scalable reconstruction over large survey areas.
To account for the distinct sampling characteristics along the temporal and spatial dimensions, we introduce a strategy that decouples temporal and spatial modeling, further improving computational efficiency.
Experimental results show that CTR-Net achieves a favorable balance between reconstruction accuracy and computational efficiency for large-scale 5D seismic data reconstruction.
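The core idea of a continuous tensor-ring representation can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: each TR core is parameterized by a small sinusoidal random-feature network over a continuous coordinate (an INR-style parameterization), so the represented tensor can be queried at arbitrary off-grid positions; the rank, feature count, and network form are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N_DIMS, RANK = 3, 4  # 3-way tensor with TR-rank 4 (illustrative values)

def make_core(rank, n_feat=16):
    """One continuous TR core: maps a scalar coordinate t to a rank x rank matrix
    via fixed sinusoidal random features (a toy stand-in for an INR)."""
    W = rng.normal(size=n_feat)                      # random frequencies
    b = rng.uniform(0, 2 * np.pi, size=n_feat)       # random phases
    V = rng.normal(size=(rank * rank, n_feat)) / np.sqrt(n_feat)
    def core(t):
        feats = np.sin(W * t + b)                    # (n_feat,) continuous features
        return (V @ feats).reshape(rank, rank)
    return core

cores = [make_core(RANK) for _ in range(N_DIMS)]

def ctr_value(coords):
    """Evaluate the continuous TR tensor at one off-grid point:
    f(t_1, ..., t_N) = Tr( G_1(t_1) @ ... @ G_N(t_N) )."""
    M = np.eye(RANK)
    for core, t in zip(cores, coords):
        M = M @ core(t)
    return float(np.trace(M))

value = ctr_value([0.13, 0.58, 0.91])  # query at an arbitrary off-grid coordinate
```

In a learned model the core networks would be trained against the observed off-grid samples; here the weights are random, and the point is only that evaluation requires no gridding step.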
Submission Number: 65