Confirmation: I have read and agree with the workshop's policy on behalf of myself and my co-authors.
Track: tiny / short paper (2-4 pages excluding references; extended abstract format)
Keywords: single-cell, scRNA-seq, sparsity, batch effect, highly variable gene
TL;DR: This work uses sparse connections and parameterized sub-modules to learn biologically meaningful scRNA-seq representations and streamline the data processing workflow.
Abstract: scRNA-seq presents opportunities to investigate cellular activities at the single-cell level, but data acquisition and the data's unique characteristics pose distinct challenges for processing. Traditional tools typically follow a multi-stage workflow involving a series of statistical adjustments and trade-offs. While straightforward, this approach lacks end-to-end optimization as a single tool, and biologists often obtain inconsistent results from different tools. This work introduces scVAE, a sparsely connected variational autoencoder for single-cell data processing: an integrated, multi-task tool for scRNA-seq data that leverages self-supervised deep learning to tackle these challenges. The scVAE model includes two key components: a sparsely connected layer that uses sparse representations to operate on the full gene space without gene selection, and a batchified module that learns batch-specific variations with parameterized sub-modules and computes correction vectors. By reducing data processing overhead, scVAE improves overall efficiency with a more streamlined workflow. Experiments and evaluations on real datasets show that scVAE produces interpretable and biologically meaningful results.
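The sparsely connected layer described above can be illustrated as a masked linear map: a fixed binary mask disables most gene-to-unit connections, letting the layer consume the full gene space without a prior gene-selection step. This is a minimal sketch under stated assumptions; the function name `sparse_linear`, the block-sparse mask pattern, and the toy dimensions are illustrative, not the paper's implementation.

```python
import numpy as np

def sparse_linear(x, weights, mask):
    """Sparsely connected layer: element-wise masking of the weight
    matrix zeroes out disallowed gene-to-unit connections, so the
    layer operates on all genes while staying sparse."""
    return x @ (weights * mask)

# Toy example (assumed sizes): 6 genes -> 2 hidden units,
# each unit wired to a disjoint block of 3 genes.
rng = np.random.default_rng(0)
x = rng.random((1, 6))            # one cell, 6 genes
weights = rng.random((6, 2))
mask = np.array([[1, 0],
                 [1, 0],
                 [1, 0],
                 [0, 1],
                 [0, 1],
                 [0, 1]])         # block-sparse connectivity
h = sparse_linear(x, weights, mask)
```

Because masked entries contribute nothing, perturbing a gene only changes the hidden units it is wired to, which is what makes the learned units attributable to specific gene sets.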
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Presenter: ~Tianyi_Liu5
Format: Yes, the presenting author will attend in person if this work is accepted to the workshop.
Funding: No, the presenting author of this submission does *not* fall under ICLR’s funding aims, or has sufficient alternate funding.
Submission Number: 30