CMU & Google XLNet Tops BERT; Achieves SOTA Results on 18 NLP Tasks | Synced

Source: Synced | AI Technology & Industry Review

A team of researchers from Carnegie Mellon University and Google Brain has proposed XLNet, a new language model that outperforms BERT on 20 language tasks, including SQuAD, GLUE, and RACE, and achieves state-of-the-art (SOTA) results on 18 of those tasks.