Music Department

Electronic Music Research Group

The Electronic Music Research Group is a research and production group focused on usability and implementation. Its primary aim is to build tools for live music and theater performance by updating and extending traditional computer music and electroacoustic research methods.

Projects

Currently our main platform is Python, and we are programming a Raspberry Pi attached to sensors as a component of an art installation. Other possibilities include individual performance works that could lead to a larger multimedia music/theater hybrid work. A relatively simple example of this type of work for a solo performer, and a starting point, is Dr. Gran’s Sensing Angels, which, among other things, uses Csound to detect pitches from a clarinet so that the performer can self-harmonize using a foot pedal that controls chord choices. The chords are created by resynthesizing tones based on computer analysis of the clarinet tones being harmonized, all in real time.
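As an illustration of the installation side of this work, the following is a minimal sketch, not the project's actual code, of a Raspberry Pi sensor triggering a synthesis engine. It assumes a PIR motion sensor wired to GPIO pin 4, the gpiozero and python-osc libraries, and a synthesis patch (for example in Csound or Pure Data) listening for OSC messages; the pin number, address, host, and port are all placeholders.

    from signal import pause

    from gpiozero import MotionSensor
    from pythonosc.udp_client import SimpleUDPClient

    # All values here are placeholders: GPIO pin, OSC host, port, and address.
    sensor = MotionSensor(4)                    # PIR motion sensor on GPIO pin 4
    synth = SimpleUDPClient("127.0.0.1", 5005)  # synthesis patch listening for OSC

    def on_motion():
        # Tell the synthesis engine a visitor has entered the sensor's field.
        synth.send_message("/installation/trigger", 1)

    def on_no_motion():
        synth.send_message("/installation/trigger", 0)

    sensor.when_motion = on_motion
    sensor.when_no_motion = on_no_motion
    pause()  # keep the script alive; the callbacks do the work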

Student Researchers

Researchers in the Group focus on implementing specific usability goals and creating tools. The technological foundations of this work arise from recent and current work in a variety of fields, including signal processing and sound design for video games, television, and film. Students learn the relevant software and work on the implementations assigned to their unit. Work will be funded by scholarship or work-study funds. Students who have neither funding source but still wish to participate may pursue funding through the Office of Student Research or the Department of Music, or simply sign up for (unremunerated) coursework.

The group is organized into three units:

Unit 1: Sound Sensing, Tracking and Re-synthesis

This unit of two students works on technologies that sense and analyze live sounds and their qualities, and builds routines such as self-harmonizing instruments and musical expressions triggered by certain pitches, rhythms, and timbres. Later work could include projects such as linking musical events to the GPS location of the performer.
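The following is a minimal sketch, using only numpy, of the kind of pitch tracking this unit would work with: an autocorrelation estimate of a tone's fundamental, mapped to the nearest MIDI note. The sample rate, frequency range, and test tone are illustrative choices, not project specifications.

    import numpy as np

    SR = 44100  # assumed sample rate

    def estimate_pitch(block, sr=SR, fmin=80.0, fmax=1000.0):
        """Estimate the fundamental frequency of a mono block via autocorrelation."""
        block = block - np.mean(block)
        corr = np.correlate(block, block, mode="full")[len(block) - 1:]
        lag_min = int(sr / fmax)
        lag_max = int(sr / fmin)
        lag = lag_min + np.argmax(corr[lag_min:lag_max])
        return sr / lag

    def to_midi(freq):
        """Convert a frequency in Hz to the nearest MIDI note number."""
        return int(round(69 + 12 * np.log2(freq / 440.0)))

    # Test with a synthetic tone near 294 Hz (D4).
    t = np.arange(0, 0.05, 1 / SR)
    tone = np.sin(2 * np.pi * 294.0 * t)
    freq = estimate_pitch(tone)
    print(freq, to_midi(freq))  # roughly 294 Hz -> MIDI note 62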

Unit 2: Instrument Creation

This unit of two students creates computer instruments that combine current electronic synthesis and sampling techniques, such as frequency modulation, granular synthesis, and waveshaping, and that are controlled by both traditional and more innovative means. Later projects could include combining audio with video using software such as Gem or Processing.
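As a small example of the synthesis techniques named above, the following sketch computes a two-operator frequency-modulation tone in numpy and writes it to a WAV file with the standard library. The carrier frequency, ratio, index, and envelope are illustrative values, not a prescribed instrument design.

    import wave
    import numpy as np

    SR = 44100  # assumed sample rate

    def fm_tone(carrier_hz, ratio, index, dur, sr=SR):
        """Two-operator FM: a modulator at carrier_hz * ratio shifts the carrier's phase.
        `index` scales modulation depth; larger values give brighter, denser spectra."""
        t = np.arange(int(dur * sr)) / sr
        modulator = np.sin(2 * np.pi * carrier_hz * ratio * t)
        tone = np.sin(2 * np.pi * carrier_hz * t + index * modulator)
        envelope = np.exp(-3.0 * t)  # simple exponential decay
        return (tone * envelope).astype(np.float32)

    # A bell-like tone: an inharmonic ratio and a moderate index are typical choices.
    samples = fm_tone(carrier_hz=220.0, ratio=1.4, index=5.0, dur=2.0)

    with wave.open("fm_bell.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)  # 16-bit samples
        f.setframerate(SR)
        f.writeframes((samples * 32767).astype(np.int16).tobytes())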

Unit 3: Interface Creation

This unit of two students works on scripting and managing musical events to shape larger narratives and implement a piece's larger structures, and on finding the best tools for the project. It could model its work on tools such as QLab and the work of STEIM. Structured around MIDI, the de facto communication protocol of music (its Ethernet implementation is RTP-MIDI), this unit would create logical communication platforms for the group's hardware and software.
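A minimal sketch of the kind of MIDI cue scripting this unit might prototype is shown below, assuming the mido library (with the python-rtmidi backend); the port name, cue times, and notes are placeholders, and real port names can be listed with mido.get_output_names().

    import time
    import mido

    # A simple cue list: each cue is (seconds after start, MIDI message).
    cues = [
        (0.0, mido.Message("program_change", program=5, channel=0)),
        (0.0, mido.Message("note_on", note=60, velocity=90, channel=0)),
        (2.0, mido.Message("note_off", note=60, channel=0)),
        (2.0, mido.Message("note_on", note=67, velocity=90, channel=0)),
        (4.0, mido.Message("note_off", note=67, channel=0)),
    ]

    with mido.open_output("Stage Synth 1") as port:  # placeholder port name
        start = time.monotonic()
        for offset, message in sorted(cues, key=lambda c: c[0]):
            # Wait until the cue's scheduled time, then fire it.
            time.sleep(max(0.0, start + offset - time.monotonic()))
            port.send(message)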

About the Director

Dr. Charles Gran has a Ph.D. in Music Composition from the University of California, Los Angeles (2004). He joined the Music Department at Truman State University in 2008 and is an Associate Professor of Theory and Composition. His primary areas of research are creative work in music composition, electronic music and music production, and theater.