LLMs4Implicit-Knowledge-Generation

Code for equipping pretrained language models (BART, GPT-2, XLNet) with commonsense knowledge for generating implicit knowledge statements between two sentences, (i) by fine-tuning the models on corpora enriched with implicit information, and (ii) by constraining the models with key concepts and with commonsense knowledge paths connecting them.

The source code is also available on GitHub: https://github.com/Heidelberg-NLP/LMs4Implicit-Knowledge-Generation.
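As a rough illustration of the constrained setup, the two sentences, the key concepts, and a commonsense path connecting them might be serialized into a single source string for a seq2seq model such as BART, with the implicit statement as the target. The marker tokens and the helper below are hypothetical placeholders, not taken from the repository:

```python
def build_training_example(sent1, sent2, concepts, path, implicit_statement):
    """Serialize a sentence pair plus constraints into a seq2seq training pair.

    `concepts` are key concepts extracted from the sentences; `path` is a
    commonsense knowledge path (e.g. ConceptNet-style triples) connecting
    them. The <s1>/<s2>/<concepts>/<path> markers are illustrative only.
    """
    source = (
        f"<s1> {sent1} <s2> {sent2} "
        f"<concepts> {' ; '.join(concepts)} "
        f"<path> {' -> '.join(path)}"
    )
    # The target is the implicit knowledge statement the model should generate.
    target = implicit_statement
    return source, target


src, tgt = build_training_example(
    "She grabbed an umbrella.",
    "She stayed dry on her way to work.",
    ["umbrella", "rain"],
    ["umbrella", "UsedFor", "protection from rain"],
    "An umbrella protects a person from getting wet in the rain.",
)
```

Pairs of this form could then be fed to a standard sequence-to-sequence fine-tuning loop; at generation time, the same serialization supplies the concept and path constraints to the model.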

Identifier
DOI https://doi.org/10.11588/data/5VTJ26
Metadata Access https://heidata.uni-heidelberg.de/oai?verb=GetRecord&metadataPrefix=oai_datacite&identifier=doi:10.11588/data/5VTJ26
Provenance
Creator Becker, Maria
Publisher heiDATA
Contributor Becker, Maria
Publication Year 2024
Funding Reference DFG SPP-1999; DFG FR 1707/4-1; Leibniz-Gesellschaft und Ministerium für Wissenschaft, Forschung und Kunst Baden-Württemberg SAS-2015-IDS-LWC
Rights info:eu-repo/semantics/openAccess
OpenAccess true
Contact Becker, Maria (Heidelberg, University, Department of Computational Linguistics)
Representation
Resource Type Dataset
Format text/x-python; application/pdf; text/markdown
Size 3900; 17070; 10317; 10312; 13456; 112924; 6962; 10417; 10215
Version 1.0
Discipline Humanities