Researchers at the University of Waterloo have developed a system that enables swarms of mobile robots to translate music into coordinated light-based artworks. Led by Dr. Gennaro Notomista from the Department of Electrical and Computer Engineering, the project explores how distributed robotic control and real-time signal processing can be combined to create visual interpretations of sound.
The system uses multiple small wheeled robots, each roughly the size of a soccer ball, operating within a defined floor area that acts as a canvas. As music plays, the robots move across the surface while projecting colored light trails. A camera positioned above records their motion, capturing the evolving patterns as a composite image. The resulting output resembles a long-exposure photograph, effectively turning coordinated movement into a light painting.
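The long-exposure effect described above can be approximated by blending the overhead camera's frames so that each pixel keeps the brightest value it has ever shown. The sketch below is illustrative only, assuming frames arrive as NumPy image arrays; it is not the team's actual imaging pipeline.

```python
import numpy as np

def composite_light_painting(frames):
    """Blend a sequence of video frames into one long-exposure-style image
    by keeping, for each pixel, the brightest value seen across all frames.

    frames: iterable of HxWx3 uint8 arrays (e.g. from an overhead camera).
    """
    canvas = None
    for frame in frames:
        if canvas is None:
            canvas = frame.copy()
        else:
            canvas = np.maximum(canvas, frame)  # light trails accumulate
    return canvas

# Synthetic demo: a red light source moving across a dark 64x64 canvas
frames = []
for x in range(10, 60, 10):
    f = np.zeros((64, 64, 3), dtype=np.uint8)
    f[32, x] = [255, 0, 0]  # the light sits at column x in this frame
    frames.append(f)

painting = composite_light_painting(frames)
print(int((painting[32] > 0).any(axis=-1).sum()))  # prints 5: all five positions remain lit
```

Because the blend is a per-pixel maximum, a robot's light leaves a persistent trail even though each individual frame shows only its current position.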
Dr. Gennaro Notomista of the University of Waterloo stated:
“We included the human control input to allow people and robots to work together. The human painter should complement and be complemented by what the robots do.”
At its core, the project addresses two engineering challenges. The first involves multi-agent coordination. Controlling a swarm requires algorithms that ensure robots avoid collisions, maintain spatial coverage, and respond cohesively to shared input signals. The team tested the system with up to 12 robots, though the underlying framework is designed to scale to larger groups. The control strategy allows each robot to respond individually while still contributing to a unified visual outcome.
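The paper's controller is not reproduced here, but the basic idea of goal attraction combined with collision avoidance can be sketched with a generic potential-field step, in which each robot is pulled toward a target and pushed away from neighbours that come too close. All gains and distances below are hypothetical.

```python
import numpy as np

def swarm_step(positions, targets, dt=0.1, safe_dist=0.5, k_goal=1.0, k_avoid=2.0):
    """One control step for a simple planar swarm: each robot is attracted
    to its target and repelled from neighbours closer than safe_dist.

    positions, targets: (N, 2) arrays of robot and goal coordinates.
    Returns the updated positions. Illustrative only; not the published
    Waterloo controller.
    """
    positions = np.asarray(positions, dtype=float)
    vel = k_goal * (np.asarray(targets, dtype=float) - positions)  # goal attraction
    n = len(positions)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            diff = positions[i] - positions[j]
            dist = np.linalg.norm(diff)
            if 0 < dist < safe_dist:
                # Push robot i away from robot j, more strongly when closer
                vel[i] += k_avoid * (diff / dist) * (safe_dist - dist)
    return positions + dt * vel

pos = np.array([[0.0, 0.0], [0.3, 0.0]])  # two robots closer than safe_dist
new = swarm_step(pos, targets=pos)        # targets already reached: only avoidance acts
print(np.linalg.norm(new[0] - new[1]))    # separation grows from 0.3 to 0.38
```

Scaling such a scheme is straightforward in principle because each robot only reacts to nearby neighbours, which matches the article's point that the framework is designed to grow beyond the 12 robots tested.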
The second challenge centers on music analysis. To generate meaningful visual responses, the system extracts features from the audio stream in real time. Parameters such as tempo, chord progression, and dynamic variation are processed and mapped to motion characteristics. Speed, direction, position, color, and light intensity are adjusted based on these inputs. For example, changes in tempo can alter movement velocity, while harmonic shifts influence color selection or spatial distribution across the canvas.
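A feature-to-motion mapping of the kind described could look like the sketch below, where tempo scales speed, the chord root picks a hue on the colour wheel, and loudness sets light intensity. The specific mapping, normalisation constants, and parameter names are invented for illustration; the paper's actual pipeline may differ.

```python
import colorsys

def map_music_to_motion(tempo_bpm, chord_root, loudness):
    """Map extracted audio features to hypothetical robot motion and light
    parameters. chord_root is a pitch class 0-11 (C = 0); loudness is a
    normalised level in [0, 1].
    """
    speed = min(1.0, tempo_bpm / 180.0)       # faster music -> faster robots
    hue = (chord_root % 12) / 12.0            # harmonic shift -> hue on the wheel
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    intensity = max(0.0, min(1.0, loudness))  # clamp to the valid range
    return {
        "speed": speed,
        "rgb": (round(r * 255), round(g * 255), round(b * 255)),
        "intensity": intensity,
    }

params = map_music_to_motion(tempo_bpm=120, chord_root=0, loudness=0.8)
print(params["rgb"])  # chord root C maps to the red end of the wheel: (255, 0, 0)
```

In a live system, a function like this would run every analysis frame, so the swarm's speed and palette drift continuously with the music rather than following a fixed script.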
Unlike earlier robotic art systems that follow predefined patterns, this approach emphasizes adaptability. The robots do not replay fixed trajectories. Instead, they respond continuously to live musical input, producing patterns that vary from one performance to another. According to project descriptions presented at the 2025 IEEE International Conference on Advanced Robotics and its Social Impacts, the objective was to design a cohesive framework in which distributed control and creative output operate together rather than independently.
Human interaction is built into the system. Participants can modify certain parameters in real time, such as the width of light trails or their placement within the canvas boundary. This creates a collaborative setting where people influence the swarm without directly controlling each robot. The intent is not to replace human creativity, but to explore shared authorship between algorithmic systems and users.
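One plausible way to support this kind of live tweaking is a thread-safe parameter store that a human-facing interface writes to while each robot reads a consistent snapshot every control cycle. The class and parameter names below are hypothetical, not taken from the project.

```python
import threading

class SharedParams:
    """Thread-safe store for parameters a human operator can adjust while
    the swarm is running. Parameter names here are illustrative only."""

    def __init__(self):
        self._lock = threading.Lock()
        self._params = {"trail_width": 1.0, "canvas_center": (0.0, 0.0)}

    def set(self, name, value):
        """Called from the human-input side (e.g. a slider callback)."""
        with self._lock:
            self._params[name] = value

    def snapshot(self):
        """Called by each robot's control loop to read a consistent copy."""
        with self._lock:
            return dict(self._params)

store = SharedParams()
store.set("trail_width", 2.5)  # the human widens the light trails mid-performance
print(store.snapshot()["trail_width"])  # prints 2.5
```

Returning a copy from `snapshot` means a robot never sees a half-updated parameter set, which matters when operator input and control loops run concurrently.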
The research has been documented in a paper titled Music-driven Robot Swarm Painting by Notomista and Jingde Cheng, a former graduate student at Waterloo. The publication outlines both the control architecture and the signal processing pipeline used to interpret musical structure. While the artistic output draws attention, the technical contributions relate more broadly to coordinated autonomy.
Lessons from this work extend beyond interactive installations. Swarm robotics is an active area of research for applications such as environmental monitoring, precision agriculture, warehouse logistics, and search and rescue operations. Systems that require decentralized coordination under dynamic input conditions share similarities with the challenges addressed here. Real-time adaptation, distributed decision-making, and scalable control are central themes in both creative and industrial contexts.
The Waterloo team has indicated plans to collaborate with professional musicians and visual artists to evaluate the platform in live settings. Public exhibitions and user studies are expected to assess how audiences interpret the robot-generated visuals and how effectively the system captures musical structure.
Although the project sits at the intersection of art and engineering, its technical foundations lie in control theory, signal processing, and multi-agent systems. By treating music as structured data rather than abstract inspiration, the researchers have demonstrated how robotic swarms can operate as responsive, coordinated systems under continuous input. The result is a framework that highlights both the expressive and practical potential of distributed robotics.

Adrian graduated with a Master's Degree (First Class Honours) in Chemical Engineering from Chester University along with Harris. His master's research aimed to develop a standardised clean-water oxygen transfer test procedure for the bubble diffusers currently used in the commercial wastewater industry. He has also undertaken placements in both the US and China, primarily focused within R&D departments, and is an associate member of the Institution of Chemical Engineers (IChemE).