Dr Charles Martin

Charles Martin is a specialist in percussion, music technology, and musical AI from Australia. He links percussion with electroacoustic music and other media through new technologies. He is the author of the musical iPad app PhaseRings, and founded the touchscreen ensemble Ensemble Metatone, the percussion group Ensemble Evolution, and the cross-artform group Last Man to Die. Charles’ doctoral research involved developing intelligent agents that mediate ensemble performance.

Charles was a postdoctoral fellow at the University of Oslo in the Engineering Prediction and Embodied Cognition (EPEC) project and the RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion from 2016–2019, where he developed new ways to predict musical intentions and performances in smartphone apps and embedded devices.

Charles is now a lecturer in computer science at the Australian National University.

Research interests:

  • new interfaces for musical performance
  • computational creativity
  • smartphone/tablet musical instruments
  • collaborative performance
  • co-creative interfaces
  • improvisation
  • percussive approaches to computer music

My research work is at the nexus of music technology, human-computer interaction, and machine learning. I'm convinced that we can use computing systems to unlock new ways for people to be creative and enrich our everyday lives with new kinds of music and art. To do this, we need to imagine new kinds of computers, including embedding computers in new physical settings and exploiting machine learning to allow these computer systems to understand and interact with our complex creative activities.

I'm available to supervise student projects in computational creativity, creativity support systems, music technology, and interactive systems. I'm interested in creating and studying new computing systems at the nexus of creative arts, physical embodiment, interaction, and intelligence.

Projects could involve:

  • developing predictive musical instruments
  • machine learning of musical style
  • musical AI
  • computer support for collaborative musical expression
  • new interfaces for musical expression (NIME)
  • applying ML/AI in creative practices

Get in touch if you would like to work with me in these areas! I'm particularly interested in supervising projects that involve creating new computer systems and then testing them in artistic applications (see below for ideas!).

 

Here are some project ideas that could be extended or shaped to suit you:

Discriminating real from neural-generated music

Most generative music systems simply create new music; this project would involve training discriminator neural networks to tell whether a piece of music was created by a human or by a cutting-edge ANN such as Music Transformer or PerformanceRNN.
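
As a rough, purely illustrative sketch (not part of any existing system), such a discriminator could be a small Keras classifier over note sequences extracted from MIDI. The sequence length, features, and training arrays below are placeholder assumptions:

    # Minimal sketch of a human-vs-ANN music discriminator (placeholder data).
    import numpy as np
    from tensorflow import keras

    SEQ_LEN, N_FEATURES = 64, 2  # assumed: 64 notes, (pitch, duration) per note

    model = keras.Sequential([
        keras.Input(shape=(SEQ_LEN, N_FEATURES)),
        keras.layers.LSTM(64),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),  # P(human-composed)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # Placeholder data: human examples labelled 1, ANN-generated examples labelled 0.
    x = np.random.rand(200, SEQ_LEN, N_FEATURES).astype("float32")
    y = np.random.randint(0, 2, size=(200, 1))
    model.fit(x, y, epochs=3, batch_size=16)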

Generating Harmony from Melody

This project involves creating a sequence-to-sequence ANN that can generate harmony, or counter-melodic parts, for a given melody. The focus here would be to deploy this network in a music-making app such as MicroJam.
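
A minimal sketch of the kind of encoder-decoder network meant here, assuming melody and harmony are represented as aligned integer MIDI pitch sequences (all data below is random placeholder material):

    # Minimal encoder-decoder (seq2seq) sketch: melody in, harmony line out.
    import numpy as np
    from tensorflow import keras

    N_PITCHES, SEQ_LEN = 128, 32  # assumed MIDI pitch vocabulary and phrase length

    # Encoder reads the melody and summarises it in its final LSTM state.
    enc_in = keras.Input(shape=(SEQ_LEN,))
    enc_emb = keras.layers.Embedding(N_PITCHES, 64)(enc_in)
    _, h, c = keras.layers.LSTM(128, return_state=True)(enc_emb)

    # Decoder generates the harmony, conditioned on the encoder state
    # (teacher forcing: it sees the previous harmony note at each step).
    dec_in = keras.Input(shape=(SEQ_LEN,))
    dec_emb = keras.layers.Embedding(N_PITCHES, 64)(dec_in)
    dec_seq = keras.layers.LSTM(128, return_sequences=True)(dec_emb, initial_state=[h, c])
    dec_out = keras.layers.Dense(N_PITCHES, activation="softmax")(dec_seq)

    model = keras.Model([enc_in, dec_in], dec_out)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

    # Placeholder data: the decoder input is the harmony shifted right by one step.
    melody = np.random.randint(0, N_PITCHES, size=(100, SEQ_LEN))
    harmony = np.random.randint(0, N_PITCHES, size=(100, SEQ_LEN))
    harmony_shifted = np.concatenate([np.zeros((100, 1), dtype=int), harmony[:, :-1]], axis=1)
    model.fit([melody, harmony_shifted], harmony, epochs=2)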

Generating “improved” versions of a melody/MIDI content

The idea of this project is to take an existing melody created in an interactive music system and change its style, or improve it in some way. It could involve modifying the melody to fit a particular harmony, or creating a more appealing melodic shape. Various sequence-to-sequence ANN techniques could be applied here, and such a system could have applications in musical performance and education.
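
As a point of reference only, here is a trivial rule-based baseline for the "fit a melody to a harmony" case: snap each MIDI pitch to the nearest chord tone. A learned sequence-to-sequence model would aim to do this (and much more) in a style-aware way; the function name and values are purely illustrative:

    # Naive baseline: move every melody note to the nearest pitch in the chord.
    def snap_to_chord(melody, chord_pitch_classes):
        """melody: list of MIDI pitches; chord_pitch_classes: e.g. [0, 4, 7] for C major."""
        snapped = []
        for pitch in melody:
            candidates = [pitch + offset for offset in range(-6, 7)
                          if (pitch + offset) % 12 in chord_pitch_classes]
            snapped.append(min(candidates, key=lambda p: abs(p - pitch)))
        return snapped

    print(snap_to_chord([60, 62, 65, 66], [0, 4, 7]))  # -> [60, 60, 64, 67]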

Neural Networks for Generating Digital Audio

The idea of generating digital audio directly from a neural network is exciting, and there are stunning results from examples such as WaveNet, SampleRNN, FFTNet, and WaveGAN. Projects in this area will involve finding new application areas for neural audio generation and developing focussed and fast algorithms with appropriate training data for use in these areas. In particular, the use of these networks in the creative arts is still being actively explored.
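
For orientation, below is a heavily simplified WaveNet-style sketch: a stack of dilated causal convolutions that predicts the next quantised audio sample from a window of previous samples. Real systems add gated activations, skip connections, and conditioning; the training data here is random placeholder audio:

    # Minimal dilated-causal-convolution sketch for next-sample prediction.
    import numpy as np
    from tensorflow import keras

    WINDOW, QUANT = 1024, 256  # assumed input window length and quantisation levels

    inputs = keras.Input(shape=(WINDOW, 1))
    x = inputs
    for dilation in [1, 2, 4, 8, 16, 32, 64, 128]:
        x = keras.layers.Conv1D(32, kernel_size=2, dilation_rate=dilation,
                                padding="causal", activation="relu")(x)
    x = keras.layers.Conv1D(QUANT, kernel_size=1)(x)
    x = keras.layers.Lambda(lambda t: t[:, -1, :])(x)  # keep only the last time step
    outputs = keras.layers.Softmax()(x)

    model = keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

    # Placeholder data: windows of audio and the next (quantised) sample for each.
    audio = np.random.rand(64, WINDOW, 1).astype("float32")
    next_sample = np.random.randint(0, QUANT, size=(64,))
    model.fit(audio, next_sample, epochs=1)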

Evolutionary Music Making

Making music with evolutionary algorithms has a long history, but is yet to break into mainstream music technology systems. In this project, you will develop an interactive music application (e.g., web or mobile app) for generating and assessing music created using an evolutionary algorithm. A novel approach might involve the human user evaluating some generated sounds, while others are analysed by an automatic system using MIR techniques or a discriminator neural network.
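
A toy sketch of the evolutionary loop is below. The hand-written fitness function stands in for the human ratings or automatic MIR/discriminator evaluation described above, and all musical choices (C major pitch classes, 16-note melodies) are arbitrary assumptions:

    # Toy genetic algorithm for evolving short melodies (lists of MIDI pitches).
    import random

    SCALE = {0, 2, 4, 5, 7, 9, 11}     # C major pitch classes (assumed)
    LENGTH, POP, GENERATIONS = 16, 30, 50

    def random_melody():
        return [random.randint(48, 72) for _ in range(LENGTH)]

    def fitness(melody):
        in_scale = sum(1 for p in melody if p % 12 in SCALE)
        smooth = sum(1 for a, b in zip(melody, melody[1:]) if abs(a - b) <= 4)
        return in_scale + smooth       # reward scale tones and small melodic leaps

    def mutate(melody, rate=0.1):
        return [p + random.choice([-2, -1, 1, 2]) if random.random() < rate else p
                for p in melody]

    def crossover(a, b):
        cut = random.randint(1, LENGTH - 1)
        return a[:cut] + b[cut:]

    population = [random_melody() for _ in range(POP)]
    for _ in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        parents = population[:POP // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP - len(parents))]
        population = parents + children

    print(max(population, key=fitness))  # the fittest melody found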

New Applications of the Mixture Density Recurrent Neural Network

The MDRNN is an exciting sequence model that can generate multiple continuous-valued signals from a Gaussian mixture model at each step in time. This project will involve applying and extending my Keras MDRNN models to new applications in the creative arts and beyond. We’ve tried using the MDRNN for voice synthesis, motion capture data synthesis, and musical control data synthesis, but there are lots of other potential applications waiting for you to discover, e.g., predicting future sensor values, generating robot movements, or generating world models for video games or real-life situations.
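
For the curious, a from-scratch sketch of the core idea is below (independent of my existing Keras models): an LSTM outputs the weights, means, and scales of a Gaussian mixture over the next 2D value, such as an (x, y) touch position, and training minimises the mixture's negative log-likelihood. All dimensions and data are placeholder assumptions:

    # Minimal mixture density RNN sketch: predict a 2D point as a Gaussian mixture.
    import numpy as np
    import tensorflow as tf
    from tensorflow import keras

    DIM, MIXES, SEQ_LEN = 2, 5, 30
    PARAMS = MIXES * (1 + 2 * DIM)  # one weight, DIM means and DIM scales per mixture
    LOG_2PI = float(np.log(2.0 * np.pi))

    inputs = keras.Input(shape=(SEQ_LEN, DIM))
    h = keras.layers.LSTM(64)(inputs)
    params = keras.layers.Dense(PARAMS)(h)
    model = keras.Model(inputs, params)

    def mdn_loss(y_true, y_pred):
        # Split the network output into mixture weights, means, and (diagonal) scales.
        logits = y_pred[:, :MIXES]
        mus = tf.reshape(y_pred[:, MIXES:MIXES + MIXES * DIM], (-1, MIXES, DIM))
        sigmas = tf.exp(tf.reshape(y_pred[:, MIXES + MIXES * DIM:], (-1, MIXES, DIM)))
        log_weights = tf.nn.log_softmax(logits, axis=-1)
        y = tf.expand_dims(y_true, 1)  # (batch, 1, DIM) for broadcasting
        log_prob = -0.5 * tf.reduce_sum(
            tf.square((y - mus) / sigmas) + 2.0 * tf.math.log(sigmas) + LOG_2PI,
            axis=-1)                   # per-mixture Gaussian log-density
        return -tf.reduce_mean(tf.math.reduce_logsumexp(log_weights + log_prob, axis=-1))

    model.compile(optimizer="adam", loss=mdn_loss)

    # Placeholder data: sequences of 2D points and the next point in each sequence.
    x = np.random.rand(128, SEQ_LEN, DIM).astype("float32")
    y = np.random.rand(128, DIM).astype("float32")
    model.fit(x, y, epochs=2)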

Book Chapters

Charles P. Martin and Henry Gardner. Free-Improvised Rehearsal-as-Research for Musical HCI. In Simon Holland, Tom Mudd, Katie Wilkie-McKenna, Andrew McPherson, and Marcelo M. Wanderley, editors, New Directions in Music and Human-Computer Interaction, Springer Series on Cultural Computing, pages 269--284. Springer, Cham, February 2019. [ bib | DOI | preprint | .pdf ]
 
Charles Martin. Pursuing a sonigraphical ideal at the dawn of the NIME epoch. A commentary on “Sonigraphical Instruments: From FMOL to the reacTable”. In Alexander Refsum Jensenius and Michael J. Lyons, editors, A NIME Reader: Fifteen Years of New Interfaces for Musical Expression, Current Research in Systematic Musicology, pages 103--105. Springer International Publishing, Switzerland, January 2017. [ bib | DOI | preprint | .pdf ]
 
Charles Martin and Henry Gardner. A Percussion-Focussed Approach to Preserving Touch-Screen Improvisation. In David England, Thecla Schiphorst, and Nick Bryan-Kinns, editors, Curating the Digital: Spaces for Art and Interaction, Springer Series on Cultural Computing, pages 51--72. Springer International Publishing, Switzerland, July 2016. [ bib | DOI | preprint | .pdf ]

Refereed Journal Articles

Charles P. Martin. Percussionist-Centred Design for Touchscreen Digital Musical Instruments. Contemporary Music Review, 36(1--2):64--85, September 2017. [ bib | DOI | preprint ]

Refereed Conference Proceedings

Tønnes F. Nygaard, Charles P. Martin, Jim Torresen, and Kyrre Glette. Self-Modifying Morphology Experiments with DyRET: Dynamic Robot for Embodied Testing. In Proc. of the IEEE Int. Conf. on Robotics & Automation (ICRA), May 2019. To appear at ICRA '19. [ bib | arXiv | video | preprint | http ]
 
Benedikte Wallace and Charles Patrick Martin. Comparing Models for Harmony Prediction in an Interactive Audio Looper. In International Conference on Computational Intelligence in Music, Sound, Art and Design, April 2019. To appear at EvoMUSART '19. [ bib | preprint ]
 
Tønnes Frostad Nygaard, Charles P. Martin, Jim Torresen, and Kyrre Glette. Evolving Robots on Easy Mode: Towards a Variable Complexity Controller for Quadrupeds. In International Conference on the Applications of Evolutionary Computation, April 2019. To appear at EvoApplications '19. [ bib | preprint ]
 
Charles P. Martin and Jim Torresen. Predictive Musical Interaction with MDRNNs. In NeurIPS 2018 Workshop on Machine Learning for Creativity and Design, Montréal, Canada, December 2018. [ bib | DOI | preprint | .pdf ]
 
Tønnes F. Nygaard, Charles P. Martin, Eivind Samuelsen, Jim Torresen, and Kyrre Glette. Real-World Evolution Adapts Robot Morphology and Control to Hardware Limitations. In Proceedings of the Genetic and Evolutionary Computation Conference, GECCO '18, pages 125--132, New York, NY, USA, July 2018. ACM. [ bib | DOI | video | preprint | .pdf ]
 
Victor Evaristo Gonzalez Sanchez, Agata Zelechowska, Charles P. Martin, Victoria Johnson, Kari Anne Vadstensvik Bjerkestrand, and Alexander Refsum Jensenius. Bela-Based Augmented Acoustic Guitars for Inverse Sonic Microinteraction. In Proceedings of the International Conference on New Interfaces for Musical Expression, NIME '18, June 2018. [ bib | DOI | preprint | .pdf ]
 
Charles P. Martin, Alexander Refsum Jensenius, and Jim Torresen. Composing an Ensemble Standstill Work for Myo and Bela. In Proceedings of the International Conference on New Interfaces for Musical Expression, NIME '18, pages 196--197, June 2018. [ bib | DOI | preprint | .pdf ]
 
Charles P. Martin and Jim Torresen. RoboJam: A Musical Mixture Density Network for Collaborative Touchscreen Interaction. In Antonios Liapis, Juan Jesús Romero Cardalda, and Anikó Ekárt, editors, Computational Intelligence in Music, Sound, Art and Design: International Conference, EvoMUSART, volume 10783 of Lecture Notes in Computer Science, pages 161--176, Switzerland, April 2018. Springer International Publishing. [ bib | DOI | arXiv | video | http ]
 
Charles P. Martin, Kai Olav Ellefsen, and Jim Torresen. Deep Models for Ensemble Touch-Screen Improvisation. In Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences, AM '17, August 2017. [ bib | DOI | preprint ]
 
Charles P. Martin and Jim Torresen. Exploring Social Mobile Music with Tiny Touch-Screen Performances. In Tapio Lokki, Jukka Pätynen, and Vesa Välimäki, editors, Proceedings of the 14th Sound and Music Computing Conference, SMC '17, pages 175--180, Espoo, Finland, July 2017. Aalto University. [ bib | DOI | video | preprint | http ]
 
Charles P. Martin and Jim Torresen. MicroJam: An App for Sharing Tiny Touch-Screen Performances. In Proceedings of the International Conference on New Interfaces for Musical Expression, NIME '17, pages 495--496, Denmark, May 2017. Aalborg University Copenhagen. [ bib | DOI | video | preprint | http ]
 
Charles Martin, Henry Gardner, Ben Swift, and Michael Martin. Intelligent Agents and Networked Buttons Improve Free-Improvised Ensemble Music-Making on Touch-Screens. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '16, pages 2295--2306, New York, NY, USA, May 2016. ACM. [ bib | DOI | video | preprint ]
 
Charles Martin and Henry Gardner. Free-Improvised Rehearsal-as-Research for Musical HCI. In Proceedings of the CHI Musical HCI Workshop, May 2016. [ bib | DOI | .pdf ]
 
Charles Martin and Henry Gardner. Can Machine Learning Apply to Musical Ensembles? In Proceedings of the CHI Human-Centered Machine Learning Workshop, May 2016. [ bib | DOI | .pdf ]
 
Charles Martin, Henry Gardner, Ben Swift, and Michael Martin. Music of 18 Performances: Evaluating Apps and Agents with Free Improvisation. In Jon Drummond, Donna Hewitt, Sophea Lerner, and Ian Stevenson, editors, Proceedings of the 2015 Conference of the Australasian Computer Music Association, ACMC2015 - MAKE!, pages 85--94. Australasian Computer Music Association, November 2015. [ bib | preprint | http ]
 
Charles Martin, Henry Gardner, and Ben Swift. Tracking Ensemble Performance on Touch-Screens with Gesture Classification and Transition Matrices. In Edgar Berdahl and Jesse Allison, editors, Proceedings of the International Conference on New Interfaces for Musical Expression, pages 359--364, Baton Rouge, Louisiana, USA, May 2015. Louisiana State University. [ bib | DOI | preprint | .pdf ]
 
Charles Martin and Henry Gardner. That Syncing Feeling: Networked Strategies for Enabling Ensemble Creativity in iPad Musicians. In Proceedings of CreateWorld, Brisbane, Australia, February 2015. Griffith University. [ bib | preprint | http ]
 
Charles Martin, Henry Gardner, and Ben Swift. MetaTravels and MetaLonsdale: iPad Apps for Percussive Improvisation. In CHI '14 Extended Abstracts on Human Factors in Computing Systems, CHI EA '14, pages 547--550, New York, NY, USA, April 2014. ACM. [ bib | DOI | preprint ]
 
Charles Martin, Henry Gardner, and Ben Swift. Exploring Percussive Gesture on iPads with Ensemble Metatone. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '14, pages 1025--1028, New York, NY, USA, April 2014. ACM. [ bib | DOI | video | preprint | http ]
 
Charles Martin and Henry Gardner. Preserving Musical Performance on Touch-Screens. In Proceedings of the CHI 2014 Workshop on Curating the Digital: Spaces for Art and Interaction, Toronto, Canada, April 2014. [ bib | DOI | preprint | .pdf ]
 
Charles Martin. Integrating Mobile Music with Percussion Performance Practice. In Proceedings of the International Computer Music Conference, pages 437--440, Perth, Australia, August 2013. [ bib | preprint | http ]
 
Charles Martin. Performing with a Mobile Computer System for Vibraphone. In W. Yeo, K. Lee, A. Sigman, H. Ji, and G. Wakefield, editors, Proceedings of the International Conference on New Interfaces for Musical Expression, pages 377--380, Daejeon, Republic of Korea, May 2013. Graduate School of Culture Technology, KAIST. [ bib | DOI | preprint | .pdf ]
 
Charles Martin and Chi-Hsia Lai. Strike on Stage: a Percussion and Media Performance. In Alexander R. Jensenius, Anders Tveit, Rolf I. Godøy, and Dan Overholt, editors, Proceedings of the International Conference on New Interfaces for Musical Expression, pages 142--143, Oslo, Norway, May 2011. [ bib | DOI | video | preprint | .pdf ]
 
Charles Martin, Benjamin Forster, and Hanna Cormick. Cross-Artform Performance Using Networked Interfaces: Last Man to Die's Vital LMTD. In Kirsty Beilharz, Bert Bongers, Andrew Johnston, and Sam Ferguson, editors, Proceedings of the International Conference on New Interfaces for Musical Expression, pages 204--207, Sydney, Australia, June 2010. [ bib | DOI | video | preprint | .pdf ]

Teaching

I currently teach:

I used to teach:

Service

Media
