
Open Access · ePrints

Generating Textual Explanations for Machine Learning Models Performance: A Table-to-Text Task

Lookup NU author(s): Dr Amir EnshaeiORCiD, Dr Noura Al Moubayed

Licence

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).


Abstract

Numerical tables are widely employed to communicate or report the classification performance of machine learning (ML) models with respect to a set of evaluation metrics. For non-experts, domain knowledge is required to fully understand and interpret the information presented in numerical tables. This paper proposes a new natural language generation (NLG) task in which neural models are trained to generate textual explanations that analytically describe the classification performance of ML models based on the metric scores reported in the tables. Presenting the generated texts alongside the numerical tables allows for a better understanding of the classification performance of ML models. To facilitate this NLG task, we constructed a dataset comprising numerical tables paired with corresponding textual explanations written by experts. Experiments on the dataset are conducted by fine-tuning pre-trained language models (T5 and BART) to generate analytical textual explanations conditioned on the information in the tables. Furthermore, we propose a neural module, the Metrics Processing Unit (MPU), to improve the performance of the baselines in terms of correctly verbalising the information in the corresponding table. The evaluation and analysis conducted indicate that exploiting pre-trained models for data-to-text generation leads to better generalisation performance and can produce high-quality textual explanations.
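As a hedged illustration (not taken from the paper, whose exact input encoding may differ), table-to-text models such as T5 or BART are typically conditioned on a linearised version of the table: each row's cells are flattened into a delimited string before fine-tuning. A minimal sketch of such a linearisation step, with a hypothetical `linearise_table` helper and made-up metric values:

```python
# Hypothetical sketch: flatten a metrics table into a single input string,
# a common preprocessing step before fine-tuning a seq2seq model
# (e.g. T5 or BART) on a table-to-text task. Delimiters are assumptions.

def linearise_table(rows):
    """Turn a list of {column: value} rows into one flat string.

    Cells within a row are joined with " | ", rows with " || ".
    """
    return " || ".join(
        " | ".join(f"{col}: {val}" for col, val in row.items())
        for row in rows
    )

# Example table of (invented) classification scores for two models.
table = [
    {"model": "SVM", "accuracy": 0.91, "f1": 0.89},
    {"model": "MLP", "accuracy": 0.94, "f1": 0.93},
]

print(linearise_table(table))
# → model: SVM | accuracy: 0.91 | f1: 0.89 || model: MLP | accuracy: 0.94 | f1: 0.93
```

The resulting string would then be tokenised and passed to the encoder, with the expert-written explanation serving as the decoder target during fine-tuning.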


Publication metadata

Author(s): Ampomah I, Burton J, Enshaei A, Al Moubayed N

Publication type: Conference Proceedings (inc. Abstract)

Publication status: Published

Conference Name: Proceedings of the 13th Conference on Language Resources and Evaluation (LREC 2022)

Year of Conference: 2022

Pages: 3542–3551

Online publication date: 20/06/2022

Acceptance date: 07/04/2022

Date deposited: 17/06/2022

Publisher: European Language Resources Association (ELRA)

URL: http://www.lrec-conf.org/proceedings/lrec2022/pdf/2022.lrec-1.379.pdf

ePrints DOI: 10.57711/n57q-xr41
