RoBERTa: A Robustly Optimized BERT Pretraining Approach
arXiv.org, July 2019. Preprint. License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/. EISSN: 2331-8422. DOI: 10.48550/arxiv.1907.11692