Creating a dynamic archive of responsible ecosystems in the context of creative AI

  • Led by Professor Lydia Farina, University of Nottingham

This project seeks to develop insight into what might constitute responsible AI (RAI) in the context of creative AI. It involves examining the ethical and moral tensions that arise between the concepts of creativity, authenticity and responsibility.

Identifying stakeholders and delineating the boundaries of ecosystems is especially challenging in the context of Creative AI applications, as the users of these applications, including artists, the public, local authorities and national institutions, do not typically participate in their development. Defining boundaries and stakeholders nevertheless matters because different RAI considerations apply depending on whether the boundaries are drawn narrowly or widely: drawing them narrowly risks excluding relevant stakeholders from these ecosystems, while drawing them widely risks making any RAI considerations too complex to identify or apply. A key objective of this project is to develop the structure of a dynamic archive that can be used to identify the stakeholders in these ecosystems in the context of Creative AI. A parallel objective is to use the archived data to highlight the RAI considerations that need to be addressed in each project and to compare the boundaries of these AI ecosystems. This will generate deeper insight into how responsibility within AI ecosystems can be understood.
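As a purely illustrative sketch, not the project's design: the record structure, field names and the compare_boundaries helper below are hypothetical assumptions about what a minimal entry in such an archive might capture, namely a project's AI application, the ecosystem boundary drawn, its stakeholders and the RAI considerations flagged for it.

```python
# Hypothetical sketch only: the structure and field names are illustrative
# assumptions, not the archive design that this project will produce.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Stakeholder:
    name: str             # e.g. "disabled musicians", "local authority"
    role: str             # e.g. "artist", "public", "data curator"
    in_development: bool  # whether they participate in developing the AI application


@dataclass
class ArchiveEntry:
    project: str          # e.g. "Digiscore", "CAT Royale"
    ai_application: str   # what the AI does within the project
    boundary: str         # how the ecosystem boundary was drawn, e.g. "narrow" or "wide"
    stakeholders: List[Stakeholder] = field(default_factory=list)
    rai_considerations: List[str] = field(default_factory=list)


def compare_boundaries(entries: List[ArchiveEntry]) -> Dict[str, List[str]]:
    """Group archived projects by how their ecosystem boundaries were drawn."""
    grouped: Dict[str, List[str]] = {}
    for entry in entries:
        grouped.setdefault(entry.boundary, []).append(entry.project)
    return grouped
```

A structure along these lines would let the archived data be queried to compare which stakeholders and RAI considerations fall inside differently drawn boundaries, which is the kind of cross-project comparison the paragraph above describes.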
This project will lay the groundwork for mapping RAI ecosystems in the context of Creative AI by using bottom-up evidence already collected in specific research projects. We interpret AI ecosystems as interlinked networks of individual actors and groups interacting in complex ways with one another and with AI applications. These ecosystems are Responsible AI Ecosystems when they pay close attention to responsibility challenges as identified by RAI UK research. The project will have two stages:
  1. a rigorous analysis of Digiscore (https://digiscore.github.io/pages/aboutus/Creative), which uses AI to create music for disabled musicians, and CAT Royale (https://www.blasttheory.co.uk/projects/cat-royale), which uses AI to interact with pet animals; and
  2. a less intensive analysis of at least five additional projects.
From this will emerge new insight into what might constitute responsible creative AI: its characteristics and features, its limitations and risks. For example, our analysis will examine the ethical and moral tensions that arise between the concepts of creativity, authenticity and responsibility, and will explore the different types of responsibility (e.g. moral, legal, role or virtue) attaching to this context.
Creative AI offers a way of extending and enriching human creativity, with AI becoming a benign collaborator that enables humans to break the constraints of established practice. From this perspective, the data curator for the AI becomes a key creative role within the AI ecosystem. This project will provide a scoping study of what might constitute responsible AI practice in this highly contested and emergent area. The context matters because the creative industries are economically and socially significant, while creativity itself is underpinned by a range of ethical and epistemological considerations that bear directly on any notion of responsible AI: we cannot simply import approaches to responsibility created for other sectors such as banking or financial services. Perhaps equally importantly, time is running out to get this right: a recent Stanford University study found that none of the leading AI models, across sectors, came close to compliance with the draft European Union AI Act.