4 datasets found

Active filters — FundingReference: info:eu-repo/grantAgreement/EC/H2020/825153; Keywords: BERT

  • Slovenian RoBERTa contextual embeddings model: SloBERTa 1.0

    The monolingual Slovene RoBERTa (A Robustly Optimized Bidirectional Encoder Representations from Transformers) model is a state-of-the-art model representing words/tokens as...
  • CroSloEngual BERT

    Trilingual BERT (Bidirectional Encoder Representations from Transformers) model, trained on Croatian, Slovenian, and English data. State-of-the-art tool representing...
  • Slovenian RoBERTa contextual embeddings model: SloBERTa 2.0

    The monolingual Slovene RoBERTa (A Robustly Optimized Bidirectional Encoder Representations from Transformers) model is a state-of-the-art model representing words/tokens as...
  • CroSloEngual BERT 1.1

    Trilingual BERT (Bidirectional Encoder Representations from Transformers) model, trained on Croatian, Slovenian, and English data. State-of-the-art tool representing...
You can also access this registry using the API (see API Docs).
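Below is a minimal sketch of how such a query might look programmatically, assuming the registry exposes a CKAN-style search API (the page's "API Docs" note suggests one); the base URL and the funding-reference field name are placeholders, not details confirmed by this listing.

```python
# Sketch of querying the registry's search API, assuming a CKAN-style
# package_search endpoint. BASE_URL and the "funding_reference" facet
# field are hypothetical; consult the registry's API Docs for the real ones.
import requests

BASE_URL = "https://example-registry.eu"  # placeholder registry host


def search_datasets(keyword, grant_agreement):
    """Search datasets by keyword and funding reference."""
    params = {
        "q": keyword,
        # Filter query on an assumed funding-reference field in the schema.
        "fq": f'funding_reference:"{grant_agreement}"',
        "rows": 10,
    }
    resp = requests.get(f"{BASE_URL}/api/3/action/package_search", params=params)
    resp.raise_for_status()
    result = resp.json()["result"]
    print(f"{result['count']} datasets found")
    for pkg in result["results"]:
        print("-", pkg.get("title", pkg["name"]))


if __name__ == "__main__":
    search_datasets("BERT", "info:eu-repo/grantAgreement/EC/H2020/825153")
```

With the filters shown above (keyword "BERT", grant agreement 825153), such a call would return the same four model entries listed on this page.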