# Automatic Rendering of Augmented Events in Immersive Concerts

Mentor(s): Zhiyao Duan (Electrical and Computer Engineering); Matthew Brown (Music Theory, Eastman School of Music)

In immersive concerts, the audience's music listening experience is often augmented with texts, images, lighting and sound effects, and other materials. Manually synchronizing these materials with the live performance becomes increasingly challenging as their number grows. In this project, we will design an automatic system that follows the performance and controls pre-coded augmented events in real time, allowing immersive concert experiences to scale with the complexity of the texts, images, lighting and sound effects. We will work with TableTopOpera at the Eastman School of Music on implementing and refining this system.
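To illustrate the idea, here is a minimal sketch in Python of how pre-coded events might be triggered from a tracked score position. All names (`Cue`, `CueScheduler`, the cue actions) are hypothetical and for illustration only; in a real system, the estimated score position would come from an online audio-to-score alignment (score following) module rather than the simulated clock used here.

```python
import time
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Cue:
    """A pre-coded augmented event, anchored to a score position."""
    beat: float                  # score position (in beats) that triggers it
    action: Callable[[], None]   # e.g., show text, change lighting


class CueScheduler:
    """Fires pre-coded cues as the tracked score position advances."""

    def __init__(self, cues: List[Cue]):
        self.cues = sorted(cues, key=lambda c: c.beat)
        self.next_idx = 0

    def update(self, estimated_beat: float) -> None:
        # Fire every cue whose trigger position has been reached.
        while (self.next_idx < len(self.cues)
               and self.cues[self.next_idx].beat <= estimated_beat):
            self.cues[self.next_idx].action()
            self.next_idx += 1


if __name__ == "__main__":
    cues = [
        Cue(4.0, lambda: print("cue: display libretto line 1")),
        Cue(8.0, lambda: print("cue: dim house lights")),
    ]
    scheduler = CueScheduler(cues)

    # Stand-in for a real score follower: simulate a performance at
    # 120 BPM (0.5 s per beat), polling the scheduler twice per beat.
    for step in range(20):
        estimated_beat = step * 0.5
        scheduler.update(estimated_beat)
        time.sleep(0.25)
```

Keying cues to score positions rather than wall-clock times is what lets the events stay synchronized even when the live performance speeds up or slows down.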

Funding for this research is provided by the National Science Foundation, award no. 1659250.