Concept information

Preferred term

PolishBERT  

Definition(s)

  • RoBERTa models for Polish.
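
As an illustration only, a minimal sketch of querying such a model through the Hugging Face transformers library; the checkpoint id "sdadas/polish-roberta-base-v1" is an assumption made for this sketch, not part of this record:

  from transformers import AutoTokenizer, AutoModelForMaskedLM
  import torch

  # Assumed Hub id for one of the Dadas et al. (2020) checkpoints.
  model_id = "sdadas/polish-roberta-base-v1"
  tokenizer = AutoTokenizer.from_pretrained(model_id)
  model = AutoModelForMaskedLM.from_pretrained(model_id)

  # Fill in the masked token of a Polish sentence
  # ("Warszawa jest stolicą <mask>." = "Warsaw is the capital of <mask>.").
  text = f"Warszawa jest stolicą {tokenizer.mask_token}."
  inputs = tokenizer(text, return_tensors="pt")
  with torch.no_grad():
      logits = model(**inputs).logits

  # Report the five most likely completions for the masked position.
  mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
  top_ids = logits[0, mask_pos].topk(5, dim=-1).indices[0]
  print([tokenizer.decode(i).strip() for i in top_ids])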

Bibliographic citation(s)

  • Dadas, S., Perełkiewicz, M., & Poświata, R. (2020). Pre-training Polish transformer-based language models at scale. arXiv:2006.04229 [cs]. http://arxiv.org/abs/2006.04229

has design country

  • Poland

has for input language

  • Polish

URI

http://data.loterre.fr/ark:/67375/LTK-NNRZL6PQ-4

Download this concept: RDF/XML, TURTLE, JSON-LD

Last modified: 6/20/24
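
For programmatic access, a minimal sketch of downloading this concept as RDF; it assumes the Loterre endpoint honors HTTP content negotiation for the formats listed above (Turtle here; application/rdf+xml or application/ld+json would select the other two):

  import requests

  # Concept URI taken from the record above.
  uri = "http://data.loterre.fr/ark:/67375/LTK-NNRZL6PQ-4"
  # Assumption: the server returns Turtle when asked for it via the Accept header.
  resp = requests.get(uri, headers={"Accept": "text/turtle"}, timeout=30)
  resp.raise_for_status()
  print(resp.text)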