Attention to Bayes: Self-Attention and Bayesian Methods for Predicting 3D Chromatin Folding From DNA Sequences


02 Feb 2022 (modified: 05 May 2023) · Submitted to ProjectX2021
Keywords: transformer, bioinformatics
Abstract: We present a novel model and approach, combining a Transformer with a VAE-like model that builds upon the established Akita/Basenji CNN architecture, to enable new possibilities for predicting 3D chromatin folding from DNA sequences. We establish the motivation for using self-attention and present a simple yet feasible Bayesian extension, derived from the usual posterior predictive distribution. Finally, we analyse the model, demonstrate the benefits and novelty of our approach, and discuss meaningful next steps.