Research Output
enunlg: a Python library for reproducible neural data-to-text experimentation
  Over the past decade, a variety of neural architectures for data-to-text generation (NLG) have been proposed. However, each system typically has its own approach to pre- and post-processing and other implementation details. Diversity in implementations is desirable, but it also confounds attempts to compare model performance: are the differences due to the proposed architectures, or are they a byproduct of the libraries used or a result of pre- and post-processing decisions made? To improve reproducibility, we re-implement several pre-Transformer neural models for data-to-text NLG within a single framework to facilitate direct comparisons of the models themselves and better understand the contributions of other design choices. We release our library at https://github.com/NapierNLP/enunlg to serve as a baseline for ongoing work in this area, including research on NLG for low-resource languages where transformers might not be optimal.

  • Date:

    11 September 2023

  • Publication Status:

    Published

  • Funders:

    EPSRC (Engineering and Physical Sciences Research Council)

Citation

Howcroft, D. M., & Gkatzia, D. (2023). enunlg: a Python library for reproducible neural data-to-text experimentation. In Proceedings of the 16th International Natural Language Generation Conference: System Demonstrations (pp. 4-5).
