In this project, you will create a new digital musical instrument and try it out in musical performance. You might use new sensors, embedded electronics, smartphones or tablet computers, or some combination of these. Your work will involve sensing musical actions from a human, processing these into sound, and exploring the kinds of music your new interface can create.
New interfaces for musical expression are a whole research field, with an exciting annual conference (NIME, https://nime.org) where you could present a successful project.
Some ideas could be:
- Using the Bela embedded sensor platform to translate new sensor data into sound.
- Using TensorFlow Lite and Arduino to create a machine-learning-enabled musical instrument.
- Using a VR system to create a virtual musical world.
- Creating a smartphone app that lets friends jam together wherever they are.
Whatever direction you choose, the project will involve:
- Understanding fundamental concepts of musical interface design and development.
- Creating a new interface for musical expression.
- Evaluating the new interface through performance, improvisation, or an HCI study.
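At the heart of most of these ideas is a mapping from sensor data to sound. As a minimal, platform-neutral sketch (the function names and the logarithmic mapping here are illustrative assumptions, not tied to Bela or any other system), a normalised sensor reading can be mapped to a pitch and rendered as a sine tone:

```python
import math

def sensor_to_frequency(reading, low_hz=110.0, high_hz=880.0):
    """Map a normalised sensor reading (0.0-1.0) to a frequency in Hz,
    spaced exponentially so equal sensor steps sound like equal
    musical intervals."""
    reading = min(max(reading, 0.0), 1.0)  # clamp to the valid range
    return low_hz * (high_hz / low_hz) ** reading

def render_tone(freq_hz, duration_s=0.5, sample_rate=44100):
    """Generate a sine tone at the given frequency as a list of samples."""
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

# A reading of 0.0 gives the lowest pitch, 1.0 the highest;
# 0.5 lands at the geometric midpoint between them.
samples = render_tone(sensor_to_frequency(0.5))
```

On an embedded platform like Bela, the same mapping would run inside a real-time audio callback rather than generating a buffer up front, but the design question is identical: how do human actions, read through sensors, become musically meaningful parameters?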
Applicants for this project should have experience with, or the motivation to learn, microcontroller programming (e.g., from COMP2300), and a strong motivation to work with music and sound.
Experience with creative computing courses such as COMP1720 (Art and Interaction in New Media), and particularly COMP2710 (Laptop Ensemble), is recommended.
This project requires personal motivation and a desire to create a new kind of musical interface.
Keywords: music, sensors, interaction, hardware, embedded