“FUTURE PHASES” showcases new frontiers in music technology and interactive performance | MIT News

Music technology took center stage at MIT during “FUTURE PHASES,” a night of works for string orchestra and electronics, presented by the MIT Music Technology and Computation Graduate Program as a part of the 2025 International Computer Music Conference (ICMC). 

The well-attended event was held last month in the Thomas Tull Concert Hall within the new Edward and Joyce Linde Music Building. Produced in collaboration with the MIT Media Lab’s Opera of the Future Group and Boston’s self-conducted chamber orchestra A Far Cry, “FUTURE PHASES” was the first event to be presented by the MIT Music Technology and Computation Graduate Program in MIT Music’s new space.

“FUTURE PHASES” offerings included two new works by MIT composers: the world premiere of “EV6,” by MIT Music’s Kenan Sahin Distinguished Professor Evan Ziporyn and professor of the practice Eran Egozy; and the U.S. premiere of “FLOW Symphony,” by the MIT Media Lab’s Muriel R. Cooper Professor of Music and Media Tod Machover. Three additional works were chosen by a jury from an open call for works: “The Wind Will Carry Us Away,” by Ali Balighi; “A Blank Page,” by Celeste Betancur Gutiérrez and Luna Valentin; and “Coastal Portrait: Cycles and Thresholds,” by Peter Lane. Each work was performed by Boston’s own multi-Grammy-nominated string orchestra, A Far Cry.

“The ICMC is all about presenting the most recent research, compositions, and performances in electronic music,” says Egozy, director of the new Music Technology and Computation Graduate Program at MIT. When approached to be part of this year’s conference, “it seemed the perfect opportunity to showcase MIT’s commitment to music technology, and particularly the exciting new areas being developed right now: a new master’s program in music technology and computation, the new Edward and Joyce Linde Music Building with its enhanced music technology facilities, and new faculty arriving at MIT with joint appointments between MIT Music and Theater Arts (MTA) and the Department of Electrical Engineering and Computer Science (EECS).” These recently hired professors include Anna Huang, a keynote speaker for the conference and creator of the machine learning model Coconet that powered Google’s first AI Doodle, the Bach Doodle.

Egozy emphasizes the distinctiveness of the event: “You have to understand that this is a very special situation. Having a full 18-member string orchestra [A Far Cry] perform new works that include electronics doesn’t happen very often. Usually, ICMC performances consist either entirely of electronics and computer-generated music, or perhaps a small ensemble of two-to-four musicians. So the opportunity we could present to the larger community of music technology was particularly exciting.”

To make the most of this exciting opportunity, an open call was put out internationally to select the other pieces that would accompany Ziporyn and Egozy’s “EV6” and Machover’s “FLOW Symphony.” Three pieces were chosen from a total of 46 entries to be part of the evening’s program by a panel of judges that included Egozy, Machover, and other distinguished composers and technologists.

“We received an enormous variety of works from this call,” says Egozy. “We saw all kinds of musical styles and ways in which electronics could be used. No two pieces were very similar to one another, and I think because of that, our audience got a sense of how varied and interesting a concert can be in this format. A Far Cry was really the unifying presence. They played all pieces with great passion and nuance. They have a way of really drawing audiences into the music. And, of course, with the Thomas Tull Concert Hall being in the round, the audience felt even more connected to the music.”

Egozy continues, “We took advantage of the technology built into the Thomas Tull Concert Hall, which has 24 built-in speakers for surround sound, allowing us to broadcast unique, amplified sound to every seat in the house. Chances are that everybody may have experienced the sound slightly differently, but there was always some sense of a multidimensional evolution of sound and music as the pieces unfolded.”

The five works of the evening employed a variety of technological components, including playing synthesized, prerecorded, or electronically manipulated sounds; attaching microphones to instruments for use in real-time signal processing algorithms; broadcasting custom-generated musical notation to the musicians; using generative AI to process live sound and play it back in interesting and unpredictable ways; and audience participation, where spectators use their cellphones as musical instruments to become part of the ensemble.

Ziporyn and Egozy’s piece, “EV6,” took particular advantage of this last innovation: “Evan and I had previously collaborated on a system called Tutti, which means ‘together’ in Italian. Tutti gives an audience the ability to use their smartphones as musical instruments so that we can all play together.” Egozy developed the technology, which was first used in the MIT Campaign for a Better World in 2017. The original application involved a three-minute piece for cellphones only. “But for this concert,” Egozy explains, “Evan had the idea that we could use the same technology to write a new piece, this time for audience phones and a live string orchestra as well.”

To explain the piece’s title, Ziporyn says, “I drive an EV6; it’s my first electric car, and when I first got it, it felt like I was driving an iPhone. But of course it’s still just a car: it’s got wheels and an engine, and it gets me from one place to another. It seemed like a good metaphor for this piece, in which a lot of the sound is literally played on cellphones, but it still has to work like any other piece of music. It’s also a bit of an homage to David Bowie’s song ‘TVC 15,’ which is about falling in love with a robot.”

Egozy adds, “We wanted audience members to feel what it’s like to play together in an orchestra. Through this technology, each audience member becomes part of an orchestral section (winds, brass, strings, etc.). As they play together, they can hear their whole section playing similar music while also hearing other sections in different parts of the hall play different music. This allows an audience to feel a responsibility to their section, hear how music can move between different sections of an orchestra, and experience the thrill of live performance. In ‘EV6,’ this experience was even more electrifying because everyone in the audience got to play with a live string orchestra, perhaps for the first time in recorded history.”

After the concert, guests were treated to six music technology demonstrations that showcased the research of undergraduate and graduate students from both the MIT Music program and the MIT Media Lab. These included a gamified interface for harnessing just intonation systems (Antonis Christou); insights from a human-AI co-created concert (Lancelot Blanchard and Perry Naseck); a system for analyzing piano playing data across campus (Ayyub Abdulrezak ’24, MEng ’25); capturing music features from audio using latent frequency-masked autoencoders (Mason Wang); a device that turns any surface into a drum machine (Matthew Caren ’25); and a play-along interface for learning traditional Senegalese rhythms (Mariano Salcedo ’25). This last example led to the creation of Senegroove, a drumming-based application designed specifically for an upcoming edX online course taught by ethnomusicologist and MIT associate professor in music Patricia Tang, and world-renowned Senegalese drummer and MIT lecturer in music Lamine Touré, who provided performance videos of the foundational rhythms used in the system.

Ultimately, Egozy muses, “’FUTURE PHASES’ showed how having the right space, in this case the new Edward and Joyce Linde Music Building, really can be a driving force for new ways of thinking, new projects, and new ways of collaborating. My hope is that everyone in the MIT community, the Boston area, and beyond soon discovers what a truly amazing place and space we have built, and are still building here, for music and music technology at MIT.”
