Neural Beatbox (2020) is a website on which an AI analyzes sounds recorded by the user through a webcam and generates new rhythms from them. The project enables online collaboration between multiple users and the AI system: while the AI guides the creative process and makes decisions, the content itself comes entirely from humans. Playing is simple: record sounds however you like, such as vocalizations, clapping, playing instruments, or tapping pens and cups on a desk. The AI then analyzes the sounds and generates beats from them. Users can add sounds as many times as they like, and can play with friends online by sharing a link to the session. The site also processes video, so users can enjoy watching their own expressions while recording. Whether or not you are familiar with human beatboxing, anyone with a computer can experience it.
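The underlying flow (record a clip, have the AI classify what kind of drum sound it resembles, then generate a rhythm from the classified sounds) can be sketched roughly as follows. This is a minimal illustrative sketch only: the feature names, the rule-based "classifier", and the density-based pattern generator are hypothetical stand-ins for the neural networks the actual system uses.

```python
import random

def classify_clip(features):
    """Hypothetical stand-in for the neural sound classifier:
    map a clip's (spectral centroid, energy) features to a drum role."""
    centroid, energy = features
    if centroid < 0.3 and energy > 0.5:
        return "kick"
    if centroid < 0.6:
        return "snare" if energy > 0.3 else "percussion"
    return "hihat"

def generate_pattern(roles, steps=16, seed=None):
    """Hypothetical stand-in for the rhythm-generation model:
    produce a step-sequencer grid, one row of 0/1 triggers per drum role."""
    rng = random.Random(seed)
    density = {"kick": 0.25, "snare": 0.2, "hihat": 0.6, "percussion": 0.15}
    return {role: [int(rng.random() < density.get(role, 0.2)) for _ in range(steps)]
            for role in set(roles)}

# Three recorded clips, summarized as (centroid, energy) pairs (made-up values).
clips = [(0.1, 0.9), (0.8, 0.4), (0.5, 0.2)]
roles = [classify_clip(c) for c in clips]       # e.g. kick, hihat, percussion
pattern = generate_pattern(roles, seed=42)      # one 16-step row per role
```

In the real system, new recordings can be added at any time and the rhythm regenerated, which is what keeps a shared session evolving.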
Development (front-end) – Robin Jungers
Development (back-end) – Bogdan Teleaga
Design – Naoki Ise
Machine learning & direction – Nao Tokui
Machine learning – Christopher Mitcheltree
Project management – Yumi Takahashi
The interactive website is here: https://neuralbeatbox.net/
Since this is an interactive piece of art, the more people using a session simultaneously, the more interesting the results become.
Two examples of pre-populated sessions, which anyone viewing them can modify or remix: