• Mashup Score: 2

    Transformer-based models such as GPT-3 and DALL-E have achieved unprecedented breakthroughs in natural language processing and computer vision. The inherent similarities between natural language and biological sequences have prompted a new wave of work on inferring the grammatical rules that underlie biological sequences. In genomic studies, it is worth noting that DNA sequences alone…

    Tweets with this article
    • EpiGePT: a Pretrained Transformer model for epigenomics https://t.co/cfNpGmP20S #bioRxiv
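
    The abstract above draws on the similarity between natural language and biological sequences. A minimal sketch of that idea follows, assuming a simple overlapping k-mer tokenization; it is a generic illustration of turning DNA into transformer-ready token ids, not EpiGePT's actual preprocessing, and the k-mer length and vocabulary are made-up choices.

    ```python
    from itertools import product

    K = 3  # k-mer length; an arbitrary choice for illustration, not from the paper
    VOCAB = {"".join(kmer): i for i, kmer in enumerate(product("ACGT", repeat=K))}

    def tokenize(seq, k=K):
        """Encode a DNA sequence as overlapping k-mer token ids (ambiguous bases are skipped)."""
        seq = seq.upper()
        return [VOCAB[seq[i:i + k]] for i in range(len(seq) - k + 1) if seq[i:i + k] in VOCAB]

    # Seven overlapping 3-mers -> seven integer ids that a transformer embedding layer could consume.
    print(tokenize("ACGTACGGT"))
    ```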

  • Mashup Score: 1

    Deep neural networks display impressive performance but suffer from limited interpretability. Biology-inspired deep learning, in which the architecture of the computational graph is derived from biological knowledge, enables a unique form of interpretability: real-world concepts are encoded in hidden nodes, which can be ranked by importance and thereby interpreted. In such models trained on single-cell…

    Tweets with this article
    • Reliable interpretability of biology-inspired deep neural networks https://t.co/gksTQAcexr #bioRxiv
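
    The abstract above describes hidden nodes that correspond to biological concepts and can be ranked by importance. A minimal sketch of that architectural idea follows, assuming a PyTorch layer whose weights are masked by a binary gene-to-pathway membership matrix; the gene and pathway names, the mask, and the activation-based importance score are all illustrative assumptions, not the paper's model.

    ```python
    import torch
    import torch.nn as nn

    class PathwayLayer(nn.Module):
        """Dense layer whose connectivity is restricted by a gene-to-pathway mask,
        so each hidden node corresponds to one named pathway."""

        def __init__(self, mask):
            super().__init__()
            self.register_buffer("mask", mask.float())          # (n_genes, n_pathways), binary
            self.weight = nn.Parameter(0.01 * torch.randn_like(self.mask))
            self.bias = nn.Parameter(torch.zeros(mask.shape[1]))

        def forward(self, x):
            # Zero out connections between genes and pathways they do not belong to.
            return torch.relu(x @ (self.weight * self.mask) + self.bias)

    # Toy membership matrix: 6 genes, 2 hypothetical pathway nodes (illustrative only).
    genes = ["TP53", "MDM2", "CDKN1A", "MYC", "MAX", "E2F1"]
    pathways = ["p53_signaling", "MYC_targets"]
    mask = torch.tensor([[1, 0], [1, 0], [1, 0], [0, 1], [0, 1], [0, 1]])

    layer = PathwayLayer(mask)
    expression = torch.randn(8, len(genes))                     # 8 toy cells x 6 genes
    hidden = layer(expression)

    # Crude importance proxy: mean absolute activation of each pathway node,
    # used here only to show how named hidden nodes can be ranked and read off.
    importance = hidden.abs().mean(dim=0)
    for name, score in sorted(zip(pathways, importance.tolist()), key=lambda t: -t[1]):
        print(f"{name}: {score:.3f}")
    ```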