Milk Jason
Joe Basile, Eurorack
Alex Smith, Percussion and Tech
Milk Jason is a guided improvisation for drums, drum triggers, contact microphones, and Eurorack. Drum triggers and contact microphones mounted on the drums drive specific parameters in Ableton and on the Eurorack; adjustments are made in real time.
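The program note leaves the exact routing open; purely as an illustration of the kind of mapping involved, the Python sketch below (using the mido library, with the port names, trigger note numbers, and CC assignments all invented for the example) turns incoming drum-trigger hits into control-change messages that Ableton or a MIDI-to-CV interface for the Eurorack could receive.

```python
# Hypothetical sketch: route drum-trigger hits to control-change messages.
# Port names, note numbers, and CC assignments are invented for illustration;
# the piece's actual routing is not specified in the program note.
import mido

TRIGGER_TO_CC = {
    36: 20,  # kick trigger     -> CC 20 (e.g. a filter cutoff in Ableton)
    38: 21,  # snare trigger    -> CC 21 (e.g. an envelope amount on the Eurorack)
    45: 22,  # tom contact mic  -> CC 22
}

with mido.open_input('Drum Triggers') as inport, \
     mido.open_output('To Ableton + CV Interface') as outport:
    for msg in inport:
        if msg.type == 'note_on' and msg.note in TRIGGER_TO_CC:
            # Scale the hit velocity (0-127) directly to the controller value.
            cc = mido.Message('control_change',
                              control=TRIGGER_TO_CC[msg.note],
                              value=msg.velocity)
            outport.send(cc)
```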
Joe Basile is the owner of The Chicken, a music and sound design company. Since 2009 he has worked as an in-house composer, field audio engineer, and freelance music composer, creating countless soundtracks for the world’s biggest brands. Joe’s approach merges the acoustic and the electronic, literal sounds and metaphorical sounds. He has a studio full of modern and vintage synthesizers, guitars, wind instruments, and percussion.
Alex Smith is a musician, scholar, and technologist. He is Associate Professor of Percussion at the University of Central Missouri. Smith released his first album, Determined Volumes, with pfMENTUM in October 2020. His second album, TrnslcntTkn, a collaboration with Joe Basile, was released by Mother Brain Records in May 2022. His Western classical percussion quartet Without Borders will release their first album in March 2023.
//run with caution
for live electronics
Victor Zheng, composer/performer
Sometimes you create something horrific in SuperCollider that you can’t resist making use of.
Victor Zheng was born in Beijing, China and raised in Portland, Oregon. He previously studied at the Oberlin Conservatory (BM ’16) and University of Massachusetts Amherst (MM ’18), and is currently pursuing a DMA in music composition at the University of Illinois Urbana-Champaign.
Victor’s notable performances have included collaborations with the Opus One Chamber Orchestra, TaiHei Ensemble, Composers of Oregon Chamber Orchestra, New Music Mosaic, and Illinois Modern Ensemble. He has had his music and research featured at events such as the highSCORE festival, Valencia International Performance Academy, Electronic Music Midwest, SEAMUS, and the SCI National Conference. His work has also been featured in publications including Oregon Arts Watch, Willamette Week, and Art on My Sleeve.
Victor taught music theory and aural skills as a graduate teaching assistant while at UMass, and subsequently taught at the Ethos Music Center in Portland, Oregon, and at the Shedd Institute for the Arts in Eugene, Oregon. He is currently a teaching assistant in music theory and musicianship at the University of Illinois.
SHP of THSEUS
RE_SHFT_ER
SHP of THSEUS is a collaborative audiovisual composition that draws inspiration from the Greek legend of the “Ship of Theseus.” As with the ship in the original scenario, various components and settings of our instruments are altered or replaced throughout the duration of the performance, changing instrument characteristics and the roles of the individual performers. We navigate this question of identity through a graphical score and custom instruments that expose parameters to the entire ensemble via a shared network control structure (Collab-Hub).
SHP of THSEUS is a piece for live performance using custom digital sound and video instruments that expose one or more parameters to a shared network control structure: each performer exerts some influence over other ensemble members’ instruments while also acting as the warden of their own. Many of the instruments feature hybrid setups combining physical or analog components (e.g. guitar, synths, lighting) with digital processing and manipulation.
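Collab-Hub’s own API is not reproduced here; as a generic sketch of the underlying idea — one performer publishes a parameter, and another performer’s instrument subscribes to it over the network — the Python example below uses plain OSC via the python-osc library, with the addresses, ports, and parameter names invented for illustration.

```python
# Illustrative sketch of a shared network control structure (not Collab-Hub's
# actual API): one performer broadcasts a parameter of their instrument, and
# another performer's instrument listens and applies it. Addresses, ports,
# and parameter names are invented for this example.
from pythonosc import udp_client, dispatcher, osc_server

# Performer A: publish a parameter of their own instrument to the shared hub.
hub = udp_client.SimpleUDPClient('127.0.0.1', 9000)
hub.send_message('/ensemble/guitar/delay_feedback', 0.65)

# Performer B (on another machine): subscribe to that parameter and apply it.
def apply_remote_param(address, value):
    # In a real instrument this would set a synth or video parameter.
    print(f'remote control {address} -> {value}')

disp = dispatcher.Dispatcher()
disp.map('/ensemble/guitar/delay_feedback', apply_remote_param)
server = osc_server.ThreadingOSCUDPServer(('0.0.0.0', 9000), disp)
server.serve_forever()
```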
Jeff Herriott is a composer whose music focuses on sounds that gently shift and bend at the edges of perception, featuring interaction between live performers and electronic sounds. He is a Professor of Music at the University of Wisconsin at Whitewater, where he serves as Interim Chair of the Music Department and Coordinator of the Media Arts and Game Development Program and teaches courses in music composition, audio, multimedia, and music technology.
Nick Hwang is a composer and sonic artist whose work explores connections in art, technology and interaction. He is currently an Associate Professor at the University of Wisconsin at Whitewater in the Media Arts and Game Development program. His research projects include novel musical controllers, and networked musical communication.
Anthony T. Marasco is an Assistant Professor of Music Technology and Composition at the University of Texas Rio Grande Valley. He is the director of the UTRGV New Music Ensemble. His research focuses on web audio, hardware hacking, and creating hardware and software tools for networked music performance practices.
Eric Sheffield is a Visiting Assistant Professor in the Music and Emerging Technology in Business & Design departments at Miami University. He is the director of the Miami University Laptop Ensemble, aka MULE. Eric’s research and creative interests include physics-based modeling, networked performance, and popular music.
Anna Weisling is a practice-based researcher whose interests include music technology, interaction design, and real-time visual performance. She is currently an Assistant Professor in the Emerging Technology in Business & Design department at Miami University Oxford, where she teaches courses in digital art and design, critical making, and interactive devices.
Hadrosaur Variations
Courtney Brown, soprano, hadrosaur skull instrument, computer
Hadrosaur Variations is a work for hadrosaur skull instrument (Rawr!), soprano, and laptop. A Corythosaurus is a duck-billed dinosaur, a lambeosaurine hadrosaur that scientists hypothesize used its large head crest for sound resonance. Rawr! is a musical instrument created from a replica of a subadult Corythosaurus skull and nasal passages. Musicians give voice to this instrument by blowing into a mouthpiece, exciting a larynx mechanism and resonating the sound through the dinosaur’s nasal cavities and skull. CT scans of the subadult skull fossil were provided by Witmer Labs at Ohio University, and scientific research guided the creation of the larynx. Rawr! was created by myself, Courtney Brown, and Sharif Razzaque. We also acknowledge Garth Paine, Carlo Sammarco, Sallye Coyle, Brent Brimhall, and Gordon Bergfors for their contributions, and funding from the Arizona State University GPSA.
In Hadrosaur Variations II, I mimic the dinosaur with the soprano voice and vice versa. I became interested in coaxing melodies from the hadrosaur skull instrument because it is a challenging exercise. To begin, I create a hadrosaur call within the hypothesized Corythosaurus vocal range. Then I explore the instrument as a sound and respond with voice. Hadrosaur and human interplay and build atop one another.
Courtney Brown is a musician, software developer, and Argentine tango dancer. She invents new musical instruments in which the act of creating sound is transformative in some way. Her work has been featured across the globe, including at Ars Electronica (Austria), on National Public Radio (NPR), and at the Diapason Gallery (Brooklyn), the CICA Museum (South Korea), New Interfaces for Musical Expression (London), the International Computer Music Conference (Santiago), and the Telfair Museum (United States). She is the recipient of two Fulbright Awards: a 2014 Student Award for interactive Argentine tango in Buenos Aires, and the 2022-23 Fulbright Canada Research Chair in Arts and Humanities, through which she is continuing her research on dinosaur vocalization in Alberta, Canada. She is an Assistant Professor at the Center of Creative Computation, Southern Methodist University. She received her D.M.A. from Arizona State University and her M.A. in Electroacoustic Music from Dartmouth College.
Adlez
for guitar and fixed media
Wenbin Lyu, composer
Noah Ward, guitar
Adlez was composed in the fall of 2020, during the lockdown. I got the idea for this piece from a video game poster for “The Legend of Zelda” that hung behind my work desk. Whenever I had a Zoom meeting and observed myself on my laptop, the mirrored camera image showed “Adlez” instead of “Zelda,” and that’s how I randomly titled it.
While writing this piece, I often imagined myself adventuring in this open-world game. As a result, the piece is a perpetual motion with various colorful timbres. This work is also my first exploration of the music programming language RTcmix, which I used to generate and process sound materials.
Wenbin Lyu is a Chinese composer and guitarist based in the United States, whose compositions blend contemporary Western techniques with ancient oriental culture. His work draws inspiration from nature, science, and video games.
Lyu has received fellowships from institutions and festivals such as the Tanglewood Music Center, the Cabrillo Festival Composers Workshop, Blackbird Creative Lab, and the Atlantic Center for the Arts. His works have been featured at numerous events, including SCI, RED NOTE, TUTTI, Alba, Cabrillo, NMG, Tanglewood, ICMC, NYCEMF, EMM, IRCAM, SEAMUS, and SPLICE. He is the recipient of an ASCAP Young Composer Award and three awards from The American Prize.
Lyu holds degrees from the China Conservatory, New England Conservatory, and Cincinnati College-Conservatory.
Live Fieldwork: Astromusicology with Neville, the Maxinean
Ritwik Banerji, saxophone
Neville, metal percussion
This is an astromusicological encounter with Neville, the Maxinean. What you see and hear is a live interaction taking place between a terrestrial control station and a remote exploration vehicle in M9, a distant region of the universe ruled by Maxine, who has rewritten the laws of physics such that all sound was always already motion.
Today, we observe the astromusicologist attempting to understand the sonic spiritual practices of Neville, a Maxinean percussionist. Like many Maxineans, Neville lets the sound’s ghost move his body first. He strikes only if the ghost moves him to strike, but each strike brings a new set of spirits to guide his body.
We also observe the astromusicologist as the inheritor of the idiocy of his field of research. While astromusicologists have made blunders of their field research into the music cultures of planets such as India, Japan, Africa, and Iran, they have yet to fully broadcast this idiocy through astromusicological participant-observation deeper in the universe. As the child of naturalized aliens from India, our astromusicologist is uniquely capable of continuing the tradition of astromusicological bungles.
Ritwik Banerji is an interactive media artist, experimental ethnographer, and saxophonist. He is currently Assistant Professor of Anthropology at Iowa State University, where he also teaches human computer interaction.
Neville is a Maxinean, a “musician” and “dancer” according to earthling astromusicologists. He has never seen a human being, but he has had to hear Banerji many times, an experience that leaves him greatly ambivalent about any further encounters with other humans in the future.
32 Sensors or more
Karl F. Gerber, Sensor Array/Performer
Touchless control of synthesis: Semi-Conductor. The new sensor array allows numerous parameters to be controlled simultaneously in real time (polyphonically), which seems to me essential for improvisation. Hands, legs, and the upper body can be used. IR distance sensors based on the triangulation principle generate analogue voltages that are converted to MIDI continuous controllers. The array provides 32 to 64 controllers, most of which feed algorithms. For the player’s orientation, LED bars with 10 to 20 display levels are placed close to the sensors.
In the first phase, resynthesis in NI Reaktor and physical modeling in Pianoteq were used for the sonification.
So we are looking at a kind of half-a-conductor (with the hardware made of semiconductors). Gesture recognition and AI are not planned; I prefer direct cause-and-effect access and want to do the learning myself. Of course, a great deal of configuration work has to be done for each composition, along with rehearsal effort on the part of the performer.
I do not start from the paradigm of universal gestures. Rather, the playing (ad hoc composing) of a complex instrument is imitated: operating an organ with hands and feet produces gestures as a side effect, but they always depend on the purpose of the sound production and the construction of the console. In this respect, it is a traditional approach that builds on highly developed performance technique. The benefit of my system is that it makes numerous parameters simultaneously available to computer algorithms of all kinds. Gestures also arise, of course, and form a composed visualization of the sound.
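Gerber’s actual signal path runs in hardware and in his own software; purely as a minimal sketch of the mapping described above — an analogue distance voltage scaled to a MIDI continuous controller, with a quantised LED-bar readout for the player’s orientation — the Python fragment below uses the mido library, with the hardware reads, port name, CC numbers, and voltage range all assumed for illustration.

```python
# Minimal sketch of one scan cycle of the array: each IR sensor's analogue
# voltage is scaled onto a MIDI continuous controller, and an LED bar shows a
# coarse level for the player's orientation. read_voltage() and set_led_bar()
# are hypothetical stand-ins for the real hardware; the CC numbering, voltage
# range, and port name are invented for this example.
import random
import mido

NUM_SENSORS = 32          # the array provides 32 to 64 controllers
LED_LEVELS = 10           # LED bars with 10 to 20 display levels
V_MIN, V_MAX = 0.4, 2.4   # assumed useful output range of the IR sensors

def read_voltage(channel):
    """Placeholder for the ADC read of one sensor (simulated here)."""
    return random.uniform(V_MIN, V_MAX)

def set_led_bar(channel, level):
    """Placeholder for driving the LED bar next to the sensor."""
    print(f'sensor {channel}: LED level {level}/{LED_LEVELS}')

def voltage_to_cc(v):
    """Scale the sensor voltage linearly onto the 0-127 controller range."""
    v = min(max(v, V_MIN), V_MAX)
    return round((v - V_MIN) / (V_MAX - V_MIN) * 127)

with mido.open_output('Sensor Array') as outport:
    while True:
        for channel in range(NUM_SENSORS):
            cc_value = voltage_to_cc(read_voltage(channel))
            outport.send(mido.Message('control_change',
                                      control=channel, value=cc_value))
            set_led_bar(channel, cc_value * LED_LEVELS // 128)
```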
Composer Karl F. Gerber began playing the electric bass autodidactically. In 1975, he attended musicology lectures with Riethmüller in Freiburg as a guest student. After turning to jazz, he studied double bass with Adelhard Roidinger in Munich. He holds an M.Sc. in physics from LMU Munich.
As a composer he is self-taught, but attended courses with H. W. Erdmann, Cort Lippe, Robert Rowe, Carola Bauckholt, Götz Tangerding, Alex Grünwald, Joe Haider and Joe Viera.
He has given live algorithmic performances, including a co-improvisation with the University of Michigan Dancers at the 1998 ICMC in Ann Arbor, Michigan, which featured live formula editing, an anticipation of live coding.
“Beautiful Numbers” was awarded the electronic “Music for Dance” award at Bourges.
Since “Loops” for solo piano, he has also created works in traditional notation without electronics, such as “VC3e” for harpsichord four hands.
After an invitation to the 2017 Kontakte Festival at the AdK Berlin, his “computer music without loudspeakers” has also attracted international interest, for example at Berklee in Boston and in Seoul, South Korea, in 2019.
His installation “Violinautomat” was selected by the ISCM for the World Music Days in Tallinn, Estonia; the critic of Dagens Nyheter called it “fascinating both technically and sonically.” He received the “Award of Distinction” at Matera Intermedia 2020 in Italy and the Best Music Award at CMMR Tokyo.
His current projects are an automaton for alto recorder, a bowed psaltery with 16 bows, an extended snare drum and a hammer zither.