LOTERRE

Concept information

Preferred term

DeBERTa  

Definition(s)

  • "Transformer-based neural language model DeBERTa (Decoding-enhanced BERT with disentangled attention), which improves previous state-of-the-art PLMs [Pre-trained Language Models] using two novel techniques: a disentangled attention mechanism, and an enhanced mask decoder." (He et al., 2020).

Synonym(s)

  • Decoding-enhanced BERT with disentangled attention

Bibliographic citation(s)

  • He, P., Liu, X., Gao, J., & Chen, W. (2020). DeBERTa: Decoding-enhanced BERT with disentangled attention. arXiv:2006.03654.

URI

http://data.loterre.fr/ark:/67375/LTK-G64XHKJT-9
