Hitloop
An AI-art installation that explores the Human-In-The-Loop principle through collaborative noise-making
Project
Hitloop is an AI-art installation that explores how much control humans have over AI decisions. The installation uses music as a metaphor for AI decision-making, to show and let visitors experience how human oversight works.
Research: who is in charge, humans or AI?
Artificial Intelligence is ugly and it makes mistakes. To account for errors, humans can train AI systems and take control if things go wrong. This is called Human-In-The-Loop, or human oversight: the supervision of an AI system by at least one natural person in order to influence its effects.
Why is this research important?
Human oversight is crucial for reducing risks in high-risk AI applications in areas like healthcare, social services, and criminal justice. It is suggested as a safeguard against the negative effects of these AI applications. Guidelines and laws, such as the EU AI Act, stress the need for effective human oversight, but they do not provide clear instructions on how to implement it.
However, human oversight comes with significant challenges and can be unreliable. People who train AI make mistakes, are inconsistent, and can be biased. Humans find it hard to judge and correct AI outputs, especially in complex situations, which leads to badly trained AI. This highlights the need for better strategies to support human overseers. Our research aims to develop new, more human-centered approaches to oversight.
Testing at Eurosonic/Noorderslag
By testing a collaborative art installation at Eurosonic/Noorderslag (ES/NS), where AI, the audience, and a DJ complement each other, we try to visualize different levels of human control over AI decision-making and to ensure that the AI's output meets the audience's minimal standards of aesthetics and ethics.
How does it work?
This AI-art installation uses WAIVE, an open-source application from Thunderboom Records. WAIVE generates musical "instruments" from street sounds and noises in the public domain. The AI then generates musical tracks built up from these AI instruments. The human DJ selects AI-generated tunes, and people in the audience can grab one of the eight handheld controllers. Through a controller, an audience member selects an instrument in the song and changes the way it sounds. The human DJ then corrects the final sound output. A conceptual sketch of this control loop follows below.
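To make the points of human control concrete, here is a minimal conceptual sketch in Python of the loop described above. It does not use WAIVE's actual code or API; every name in it (Instrument, Track, ai_generate_tracks, dj_selects, audience_adjusts, dj_corrects) is hypothetical and only illustrates where the AI proposes and where the humans select, adjust, and correct.

```python
# Conceptual sketch of the Hitloop control loop, NOT WAIVE's real API.
# All classes, functions, and parameter values below are hypothetical.

import random
from dataclasses import dataclass

@dataclass
class Instrument:
    name: str            # e.g. a street sound turned into an "instrument"
    volume: float = 0.8  # parameters the audience can change via a controller
    pitch: float = 0.0

@dataclass
class Track:
    instruments: list    # the AI-generated arrangement of instruments

def ai_generate_tracks(n: int) -> list:
    """Stand-in for the AI step: build candidate tracks from sampled sounds."""
    sounds = ["tram bell", "rain on awning", "market chatter", "bicycle spokes"]
    return [Track([Instrument(random.choice(sounds)) for _ in range(4)])
            for _ in range(n)]

def dj_selects(tracks: list) -> Track:
    """First point of human oversight: the DJ picks which AI output is used."""
    return tracks[0]  # placeholder for a real curatorial choice

def audience_adjusts(track: Track, controller_inputs: dict) -> Track:
    """Second point of oversight: each handheld controller steers one instrument."""
    for idx, (volume, pitch) in controller_inputs.items():
        track.instruments[idx].volume = volume
        track.instruments[idx].pitch = pitch
    return track

def dj_corrects(track: Track) -> Track:
    """Final point of oversight: the DJ tames or vetoes anything unacceptable."""
    for inst in track.instruments:
        inst.volume = min(inst.volume, 1.0)  # e.g. clamp extreme settings
    return track

# Human-in-the-loop pipeline: the AI proposes, humans select, adjust, and correct.
candidates = ai_generate_tracks(n=3)
chosen = dj_selects(candidates)
shaped = audience_adjusts(chosen, {0: (0.5, 2.0), 1: (0.9, -1.0)})
final = dj_corrects(shaped)
print([(i.name, i.volume, i.pitch) for i in final.instruments])
```

In the installation itself these steps run live, with eight physical controllers and audio output instead of printed parameter values; the sketch only marks where the human decisions sit in the chain between AI generation and the final sound.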
This experiment is made possible with support from Regieorgaan SIA, Innofest, ES/NS and Thunderboom Records.