[1810.04805] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

amarashar's bookmarks 2018-10-12

Summary:

BERT is conceptually simple and empirically powerful. It obtains new state-of-the-art results on eleven natural language processing tasks, including pushing the GLUE benchmark to 80.4% (7.6% absolute improvement), MultiNLI accuracy to 86.7% (5.6% absolute improvement), and the SQuAD v1.1 question answering Test F1 to 93.2 (1.5% absolute improvement), outperforming human performance by 2.0%.

Link:

https://arxiv.org/abs/1810.04805

From feeds:

Ethics/Gov of AI » amarashar's bookmarks

Tags:

Date tagged:

10/12/2018, 14:30

Date published:

10/12/2018, 10:30