
DL2022: Transformer (Part 2)


Course: "Deep Learning" (Глубокое обучение); course page: ; course author: Alexander Dyakonov (). This lecture covers... BERT = Bidirectional Encoder Representations from Transformers. RoBERTa: A Robustly Optimized BERT Pretraining Approach. SpanBERT. ALBERT = A Lite BERT. T5: Text-To-Text Transfer Transformer. ELECTRA = Efficiently Learning an Encoder that Classifies Token Replacements Accurately.
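The models listed above all build on BERT's masked-language-model pretraining objective: roughly 15% of input tokens are selected for prediction, and of those, 80% are replaced with a [MASK] token, 10% with a random vocabulary token, and 10% are left unchanged. A minimal sketch of that corruption scheme, using a toy vocabulary and standard-library randomness only (the function and variable names here are illustrative, not from the lecture):

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, rng=None):
    """BERT-style masked-language-model corruption.

    Each token is selected for prediction with probability mask_prob.
    A selected token is replaced by "[MASK]" 80% of the time, by a
    random vocabulary token 10% of the time, and kept as-is 10% of
    the time. Returns the corrupted sequence and the selected positions
    (the positions the model would be trained to predict).
    """
    rng = rng or random.Random()
    corrupted = list(tokens)
    targets = []
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets.append(i)
            r = rng.random()
            if r < 0.8:
                corrupted[i] = "[MASK]"      # 80%: mask out
            elif r < 0.9:
                corrupted[i] = rng.choice(vocab)  # 10%: random token
            # else: 10% keep the original token unchanged

    return corrupted, targets

vocab = ["the", "cat", "sat", "on", "mat", "dog", "ran"]
sentence = ["the", "cat", "sat", "on", "the", "mat"]
out, positions = mask_tokens(sentence, vocab, rng=random.Random(0))
print(out, positions)
```

ELECTRA, by contrast, replaces this objective with replaced-token detection: a small generator fills in masked positions and a discriminator learns to classify every token as original or replaced, so the loss covers all positions rather than the ~15% that were masked.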
