New Interfaces for Musical Expression

Description

In this project, you will create a new digital musical instrument and try it out in musical performance. You might use new sensors, embedded electronics, smartphones or tablets, or some combination of these. Your work will involve sensing musical actions from a performer, mapping them to sound, and exploring the kinds of music your new interface can create.

This is actually a whole research field, with an annual conference (https://nime.org) where you could show off a successful project.

Some project ideas include:

- Using the Bela embedded sensor platform to translate new sensor data into sound (a minimal sketch of this idea follows the list).

- Using TensorFlow Lite and an Arduino to create a machine-learning-enabled musical instrument.

- Using a VR system to create a virtual musical world.

- Creating a smartphone app that lets friends jam together wherever they are.
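
As a rough illustration of the sensing-to-sound pipeline, the sketch below maps one analog sensor on a Bela board to the pitch of a sine oscillator. It is only a starting point and makes several assumptions: a sensor (for example a potentiometer or force-sensitive resistor) is wired to analog input 0, the analog inputs run at half the audio sample rate (Bela's default with eight analog channels), and the 220-880 Hz pitch range is an arbitrary choice.

    // render.cpp: minimal Bela sketch mapping one analog sensor to oscillator pitch.
    // Assumes a sensor on analog input 0 and analog inputs at half the audio rate.
    #include <Bela.h>
    #include <cmath>

    float gPhase = 0.0f;

    bool setup(BelaContext *context, void *userData)
    {
        return true;
    }

    void render(BelaContext *context, void *userData)
    {
        for (unsigned int n = 0; n < context->audioFrames; ++n) {
            // Read the sensor (Bela normalises analog inputs to roughly 0.0-1.0).
            float sensor = analogRead(context, n / 2, 0);

            // Map the sensor value to a pitch range (220-880 Hz is an arbitrary choice).
            float frequency = 220.0f + sensor * 660.0f;

            // Simple sine oscillator at the mapped frequency.
            float out = 0.2f * sinf(gPhase);
            gPhase += 2.0f * (float)M_PI * frequency / context->audioSampleRate;
            if (gPhase > 2.0f * (float)M_PI)
                gPhase -= 2.0f * (float)M_PI;

            // Write the same sample to every audio output channel.
            for (unsigned int ch = 0; ch < context->audioOutChannels; ++ch)
                audioWrite(context, n, ch, out);
        }
    }

    void cleanup(BelaContext *context, void *userData)
    {
    }

A real instrument would replace the direct pitch mapping with a more interesting mapping layer, for example smoothing the sensor signal or driving several synthesis parameters at once.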

Goals

- Understand fundamental concepts of musical interface design and development

- Create a new interface for musical expression

- Evaluate the new interface through performance, improvisation, or an HCI study.

Requirements

Applicants for this project should have experience with microcontroller programming (e.g., COMP2300) or the motivation to learn it, and a strong interest in working with music and sound.

Experience with creative computing courses such as COMP1720 (Art and Interaction in New Media), and particularly COMP2710 (Laptop Ensemble), is recommended.

This project requires a genuine personal desire to create a new kind of musical interface.

Keywords

music, sensors, interaction, hardware, embedded
