1. Introduction
2. Antonie van Leeuwenhoek
3. CGSI YouTube
4. Presentation Preparation
5. Recent History
6. Large Language Models
7. Foundation Models
8. Transformer
9. Self-Attention
10. Attention
11. Transformer Architecture
12. Pretraining
13. Applications
14. Models
15. Tokenization
16. Masking
17. DNABERT-2
18. Nucleotide Transformer
19. scBERT Model
20. scGPT
21. Generative Pretraining
22. scFoundation Model
23. Open Questions
24. Summary
Description:
Explore the intersection of large language models and computational biology in this 49-minute lecture by Jian Ma at the Computational Genomics Summer Institute (CGSI). Delve into the recent history of large language models and foundation models, understanding their architecture and applications in genomics. Learn about the Transformer model, self-attention mechanisms, and tokenization techniques specific to biological sequences. Discover specialized models such as DNABERT-2, Nucleotide Transformer, scBERT, and scGPT, designed for genomic data analysis. Examine the concept of generative pretraining and its relevance to computational biology. Conclude with a discussion of open questions and a summary of the potential impact of these technologies on genomic research.

Large Language Models for Computational Biology - A Primer

Computational Genomics Summer Institute CGSI