Friday – March 15, 2024 – Workshops: 10:30am-12:30pm – Studios, Wood Hall (basement level)

  • Nick Hwang and Anthony T. Marasco: MoNoDec—The Mobile Node Controller Platform
  • Mir Jeffres: Accessible Brain Music in Max/MSP

Nick Hwang and Anthony T. Marasco – MoNoDec—The Mobile Node Controller Platform

MoNoDec is a multichannel audio system that uses audience members’ mobile phones and IoT-hardware-driven speakers as point sources, forming a configurable, dynamic immersive speaker array that doubles as an audience interface. Audience participants register their current location within a customizable audience space (rows, cloud, or freeform) on their mobile phones. Each phone then becomes a point source within the immersive experience (performance or installation). MoNoDec can also be used alongside ‘traditional’ loudspeaker arrays to enhance immersive audio experiences.

Built on Collab-Hub.io, MoNoDec provides additional interfaces that let audience members interact beyond simply contributing their phones as speakers. These interfaces may offer live-action or meta-performance decision-making (e.g., voting) that affects the experience in various ways, such as changing the musical form, drawing on a collective canvas, or altering a localized instrument’s timbre.
A performance/installation controller sends audio, control, and interface data to participants throughout the experience.

Recent additions to MoNoDec include (1) connections to RNBO-built instruments and (2) Raspberry Pi-driven IoT instruments (which may run RNBO or other programmed instruments).
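To illustrate the point-source idea described above (this is a hypothetical sketch, not MoNoDec’s actual code), a controller could map each phone’s registered position in the audience space to a per-device gain, so that a virtual sound source is loudest on the phones nearest to it:

```python
import math

def device_gain(device_pos, source_pos, rolloff=1.0):
    """Distance-based gain for one audience phone acting as a point source.

    Positions are (x, y) coordinates in the registered audience space.
    Uses a simple inverse-distance rolloff, reaching unity at the source.
    """
    dx = device_pos[0] - source_pos[0]
    dy = device_pos[1] - source_pos[1]
    dist = math.hypot(dx, dy)
    return 1.0 / (1.0 + rolloff * dist)

# Example: nine phones registered in a 3x3 grid ("rows" layout),
# with a virtual sound source placed at the center of the grid.
devices = {f"phone{i}": (i % 3, i // 3) for i in range(9)}
source = (1.0, 1.0)
gains = {name: device_gain(pos, source) for name, pos in devices.items()}
```

In practice the controller would broadcast such values over the network as control data, updating them as the virtual source moves through the space.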

Nick Hwang is an interdisciplinary artist who explores connections in art, technology, and interaction, focusing on collaborative multi-user interactive experiences. His practice incorporates programming, audio and music, and design (graphic, interaction, interface, experiential). His works are audiovisual, gestural, explorative, immersive, and collaborative. He has published, exhibited, and performed his research and creative work at international conferences such as the International Computer Music Conference, International Society for Electronic Art (ISEA), ACM SIGGRAPH, Web Audio Conference, Society for ElectroAcoustic Music in the United States, Ars Electronica, NowNet Arts, and the Edinburgh Fringe Festival. He is an Associate Professor at the University of Wisconsin-Whitewater. Nick holds a PhD in Music Composition with a focus on Experimental Music and Digital Music, a Master’s degree in Music Composition from Louisiana State University, and a B.A. in Theory and Composition from the University of Florida.

Mir Jeffres – Accessible Brain Music in Max/MSP

I created a Max external library to make it easier to create music with biosignals such as EEG and EKG, by interfacing LSL (Lab Streaming Layer) and the Max SDK and creating basic real-time processing and upsampling objects that are idiomatic to Max and therefore accessible to this generation’s electronic music performers. For example, you could convolve audio samples with brain waves to create a new brain-controlled timbre for a meditative ambient piece of music, or you could set a sample trigger every time the 10 Hz component of the brain signal increases to create a brain-controlled drum machine.
In this workshop, I will be teaching attendees the basics of biosignal interfaces and how to use them with this library to create a brain-controlled Max patch of their choosing.
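The 10 Hz trigger idea above can be sketched outside of Max as well. The following is a minimal Python illustration of the underlying DSP (an assumption of mine for clarity, not the library’s Max objects): the Goertzel algorithm measures the power of a single frequency component, such as the ~10 Hz alpha band of an EEG signal, and fires a trigger when that power crosses a threshold.

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Power of one frequency component in a block of samples,
    computed with the Goertzel algorithm (cheaper than a full FFT
    when only one bin, e.g. ~10 Hz, is needed)."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)  # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def alpha_trigger(samples, sample_rate, threshold):
    """Fire a trigger (e.g. a drum-machine hit) when the 10 Hz
    component of the biosignal block exceeds the threshold."""
    return goertzel_power(samples, sample_rate, 10.0) > threshold

# Synthetic one-second block: a 10 Hz sinusoid standing in for alpha activity.
rate = 250  # a common EEG sampling rate
alpha_block = [math.sin(2 * math.pi * 10 * t / rate) for t in range(rate)]
quiet_block = [math.sin(2 * math.pi * 3 * t / rate) for t in range(rate)]
```

In a real patch, blocks like these would arrive from an LSL stream, and the trigger output would bang a sample player; the threshold and block size would be tuned to the headset and performer.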

Mir Jeffres is a saxophonist, experimental electronic musician, and a master’s student in Music Technology at Georgia Tech. A leader in the Atlanta music community, Mir teaches live sound in various student groups and is involved in several local bands including Two Factor Authentication, Ugly Joy, and Lash. This is Mir’s fifth and final year in the GT Music Technology program after being involved with research in interactive performance systems as part of the Brain Music Lab, Robotic Musicianship Lab, and now the IMPACT Lab with Dr. Alexandria Smith. Mir’s current research focus is on new interfaces for live music performance, specifically making it easier to use biosignals like EEG to support acoustic/electronic performance.
