Centering Creativity and Responsibility for AI Tools for Artists, Creatives and Makers

  • Led by Caroline Sinders, University of the Arts London
  • Partnered with Mozilla Foundation

This fellowship explores what artists want with respect to AI tools, AI software and art. The goal is to survey artists, creatives and makers and develop a prototype that fits their needs.

My academic challenge project addresses the theme of AI for Inspired Innovation by focusing on how to create low-code and no-code AI tools for creatives that reflect how creatives work and what they want from AI, while remaining compliant with current UK regulation and the forthcoming European AI Act. For this project, we are pleased to confirm our non-academic stakeholder as the Mozilla Foundation’s Becca Ricks. Preliminary research from previous work with the University of Edinburgh’s New Real magazine, and from designing tools for my own Feminist Data Set project, has revealed how the current state of low-code/no-code AI tooling, including generative AI, hinders some artists. Some artists want tools that allow for more nuance, richer tooling and more in-depth ways to shape the latent space of algorithms, which can also help make legible how the underlying models and machine learning systems work.

Relatedly, this research seeks to answer: what do creatives and artists want from AI tools, and how can these tools be built to better serve the creative class? An Oxford Internet Institute report on AI and the arts highlighted how developers who make AI models and AI tooling misunderstand the creative process and its outputs, which can limit the tools they build. Luba Elliot, a curator interviewed for the report, described: “[The developers of AI models] focus on the aesthetics of an image. They’re much more interested in trying to replicate the styles of paintings of the past than in participating in current developments in contemporary art.” It is this disconnect our project seeks to address, exploring how equitable and innovative AI can respect copyright and current and forthcoming AI regulation, and better centre how creatives work and what they need from tools. We aim to do this by proposing and testing prototypes for creating with AI.

Questions this project aims to explore include: how can we future-proof tools to be ‘ethical’, centring transparency, legibility and explainability, while also offering more fine-tuned and nuanced ways for creatives to make? Moving away from the limiting text prompt, what does more robust tooling look like (such as Adobe’s Firefly, Runway ML, etc.), and how can these current tools be improved upon? We seek to understand and document what creatives want and how they want to interact with AI, including generative AI. We will explore the last 20 years of AI art, including work by creatives and artists who make or assemble their own data sets, train their own classifiers, and generate their own models (Anna Ridler, Sofia Crespo and others); artists and creatives who build their own algorithms and machine learning systems (Adam Harvey, Rebecca Fiebrink and others); artists and creatives using generative AI (Eryk Salvaggio and others); and the emergent spaces ‘in between’ generative AI and artists creating their own algorithms.
