Concept information
Preferred term
ELECTRA
Definition(s)
- Pre-trained language model trained with a replaced-token-detection objective: the input is corrupted "by replacing some tokens with plausible alternatives sampled from a small generator network," and a discriminator network learns to predict, for every token, whether it is original or a replacement (Clark et al., 2020).
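The definition above can be illustrated with a minimal toy sketch of the replaced-token-detection setup. In the actual model the replacements come from a small masked language model (the generator); here random sampling stands in for it, purely for illustration (the function name `corrupt`, the vocabulary, and the probability are assumptions, not part of the source).

```python
import random

def corrupt(tokens, vocab, replace_prob=0.15, rng=None):
    """Toy stand-in for ELECTRA's corruption step.

    Replaces ~replace_prob of the tokens with alternatives sampled
    from `vocab` (a real generator would propose plausible tokens)
    and records per-token labels: 1 = replaced, 0 = original.
    These labels are the discriminator's training targets.
    """
    rng = rng or random.Random(0)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < replace_prob:
            # Sample a replacement different from the original token.
            corrupted.append(rng.choice([v for v in vocab if v != tok]))
            labels.append(1)
        else:
            corrupted.append(tok)
            labels.append(0)
    return corrupted, labels
```

The discriminator is then trained to recover `labels` from `corrupted`; because every token position carries a training signal (not only masked positions, as in BERT), pre-training is more sample-efficient.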
Broader concept(s)
Synonym(s)
- Efficiently Learning an Encoder that Classifies Token Replacements Accurately
Bibliographic citation(s)
- Clark, K., Luong, M.-T., Le, Q. V., & Manning, C. D. (2020). ELECTRA: Pre-training text encoders as discriminators rather than generators. arXiv:2003.10555 [cs]. http://arxiv.org/abs/2003.10555
base of
has application field
has design country
- United States
has for input language
has repository
implements
is encoded in
is executed in
has for license
In other languages
- French
URI
http://data.loterre.fr/ark:/67375/LTK-QB1X83XF-5