Concept information
Preferred term
PolishBERT
Definition(s)
- RoBERTa models for Polish (see the loading sketch after the entry).
Broader concept(s)
Bibliographic citation(s)
- Dadas, S., Perełkiewicz, M., & Poświata, R. (2020). Pre-training Polish transformer-based language models at scale. arXiv:2006.04229 [cs]. http://arxiv.org/abs/2006.04229
based on
has application field
has design country
- Poland
has for input language
has repository
implements
is encoded in
is executed in
has for license
In other languages
- French
URI
http://data.loterre.fr/ark:/67375/LTK-NNRZL6PQ-4
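
Since the entry defines PolishBERT as RoBERTa models for Polish, a minimal usage sketch may help. The snippet below loads such a checkpoint with the Hugging Face transformers library and fills a masked token in a Polish sentence. The model identifier "sdadas/polish-roberta-base-v1" is an assumption based on the first author's published checkpoints, not something stated in this entry; substitute the repository ID you actually use.

# A minimal sketch, assuming a Polish RoBERTa checkpoint is available
# on the Hugging Face Hub under the (assumed) ID below.
from transformers import AutoTokenizer, AutoModelForMaskedLM
import torch

MODEL_ID = "sdadas/polish-roberta-base-v1"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)

# Fill a masked token: "Stolica Polski to <mask>." = "The capital of Poland is <mask>."
text = f"Stolica Polski to {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and take the highest-scoring vocabulary item.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(top_id))  # expected: a city name such as "Warszawa"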