BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

Publication type:

Conference Proceedings

Source:

NAACL-HLT 2019, Minneapolis, Minnesota, pp. 4171–4186 (2019)