Concept information
Preferred term
BioMed-RoBERTa
Definition(s)
- A RoBERTa language model pretrained on biomedical texts.
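As an illustration of the definition above, a minimal sketch of querying such a model for masked-token prediction, assuming the checkpoint accompanying the citation below is published on the Hugging Face Hub under the ID allenai/biomed_roberta_base (an assumption, as the record does not name a distribution channel) and that the transformers and torch libraries are installed:

```python
# Minimal sketch: fill-mask probe of a biomedical RoBERTa checkpoint.
# The model ID "allenai/biomed_roberta_base" is assumed, not stated in this record.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("allenai/biomed_roberta_base")
model = AutoModelForMaskedLM.from_pretrained("allenai/biomed_roberta_base")

# Build a biomedical sentence containing the tokenizer's mask token.
text = f"The patient was treated with {tokenizer.mask_token} for hypertension."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and print the five most likely fillers.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
top_ids = logits[0, mask_index].topk(5).indices
print([tokenizer.decode(i).strip() for i in top_ids])
```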
Broader concept(s)
Bibliographic citation(s)
- Gururangan, S., Marasović, A., Swayamdipta, S., Lo, K., Beltagy, I., Downey, D., & Smith, N. A. (2020). Don’t Stop Pretraining: Adapt Language Models to Domains and Tasks. arXiv:2004.10964 [cs]. http://arxiv.org/abs/2004.10964
based on
has design country
- United States
has for input language
implements
is encoded in
is executed in
In other languages
- BioMed-RoBERTa (French)
URI
http://data.loterre.fr/ark:/67375/LTK-TNGSLL0L-5