LOTERRE
Concept information

Preferred term

StructBERT  

Definition(s)

  • "a new type of contextual representation, StructBERT, which incorporates language structures into BERT pre-training by proposing two novel linearization strategies. Specifically, in addition to the existing masking strategy, StructBERT extends BERT by leveraging the structural information: word-level ordering and sentence-level ordering." (Wang et al., 2019).

Bibliographic citation(s)

  • Wang, W., Bi, B., Yan, M., Wu, C., Bao, Z., Xia, J., Peng, L., & Si, L. (2019). StructBERT: Incorporating language structures into pre-training for deep language understanding. https://arxiv.org/abs/1908.04577v3

has design country

  • China

URI

http://data.loterre.fr/ark:/67375/LTK-N4MFP3CQ-3

Last modified: 6/20/24