LOTERRE

Concept information

Preferred term

ruBERT  

Definition(s)

  • A BERT-based pre-trained language model for the Russian language.

Broader concept(s)

Bibliographic citation(s)

  • Kuratov, Y., & Arkhipov, M. (2019). Adaptation of deep bidirectional multilingual transformers for Russian language. arXiv:1905.07213 [cs]. http://arxiv.org/abs/1905.07213
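The model described above can be used like any other BERT checkpoint. A minimal sketch with the Hugging Face `transformers` library, assuming the checkpoint is published on the Hub under the id `DeepPavlov/rubert-base-cased` (an assumption; confirm the exact model id before use):

```python
# Hypothetical usage sketch for ruBERT via Hugging Face transformers.
# MODEL_ID is an assumption, not confirmed by this record.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "DeepPavlov/rubert-base-cased"  # assumed Hub id for ruBERT

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

text = "Привет, мир!"  # Russian for "Hello, world!"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# [CLS]-token embedding: one hidden-size vector per input sentence
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)
```

The [CLS] vector is a common sentence-level representation for downstream classification; token-level vectors are available in `outputs.last_hidden_state` as well.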

based on

has application field

has design country

  • Russia

has for input language

implements

is executed in

In other languages

URI

http://data.loterre.fr/ark:/67375/LTK-R4K2SPTG-R

Last modified 6/20/24