Lookup NU author(s): Dr Ehsan Toreini, Dr Mhairi Aitken, Dr Kovila Coopamootoo, Dr Karen Elliott, Professor Aad van Moorsel
This is the authors' accepted manuscript of a conference proceedings (inc. abstract) that has been published in its final definitive form by ACM, 2020.
For re-use rights please refer to the publisher's terms and conditions.
To design and develop AI-based systems that users and the wider public can justifiably trust, one needs to understand how machine learning technologies impact trust. To guide the design and implementation of trusted AI-based systems, this paper provides a systematic approach to relating considerations about trust from the social sciences to trustworthiness technologies proposed for AI-based services and products. We start from the ABI+ (Ability, Benevolence, Integrity, Predictability) framework, augmented with a recently proposed mapping of ABI+ onto qualities of technologies that support trust. We consider four categories of trustworthiness technologies for machine learning, namely those for Fairness, Explainability, Auditability and Safety (FEAS), and discuss whether and how they support the required qualities. Moreover, trust can be impacted throughout the life cycle of AI-based systems, and we therefore introduce the concept of a Chain of Trust to discuss trustworthiness technologies at all stages of the life cycle. In so doing we establish the ways in which machine learning technologies support trusted AI-based systems. Finally, since FEAS has obvious relations with known frameworks, we relate FEAS to a variety of international 'principled AI' policy and technology frameworks that have emerged in recent years.
Author(s): Toreini E, Aitken M, Coopamootoo K, Elliott K, Zelaya CG, van Moorsel A
Publication type: Conference Proceedings (inc. Abstract)
Publication status: Published
Conference Name: FAT* '20: 2020 Conference on Fairness, Accountability, and Transparency
Year of Conference: 2020
Online publication date: 27/01/2020
Acceptance date: 01/12/2019
Date deposited: 02/03/2020