|Using Machine Learning to Create Musical Instruments and Expressive Interactions|
|Rebecca Fiebrink, Goldsmiths College, University of London|
|Sponsored by CCT REU Program|
|Digital Media Center Theatre|
|July 26, 2016 - 04:30 pm|
Machine learning is now widely recognized as a powerful tool for analyzing “big data,” and for supporting decision making in fields from finance to medicine. But in this talk, I’ll describe how machine learning can be used creatively—by musicians, artists, gamers, and makers who want to design new forms of interacting with computers and expressing themselves through digital technology. I’ll describe—and demo—how machine learning can be used to build new musical instruments, to build new controllers for people with disabilities, and to make it possible for kids and non-programmers to build complex software systems quickly.
Dr. Rebecca Fiebrink's research focuses on designing new ways for humans to interact with computers in creative practice. She is the developer of the Wekinator software for interactive machine learning, and she recently taught the world's first MOOC (massive open online course) on Machine Learning for Musicians and Artists to over 3000 students. She has worked with companies including Microsoft Research, Sun Microsystems Research Labs, Imagine Research, and Smule, where she helped to build the #1 iTunes app "I am T-Pain." She holds a PhD in Computer Science from Princeton University. Prior to moving to Goldsmiths, she was an Assistant Professor at Princeton University, where she co-directed the Princeton Laptop Orchestra.
|Refreshments will be served.|