Artist-choreographer Kate Ladenheim falls repeatedly to various pop songs while you, the audience member, rate her level of commitment to falling.
Haunting and humorous at once, the performance has Ladenheim take your feedback into account as she attempts to improve her falling, making herself into a playable avatar. All the while, each fall is monitored by motion capture and projected as digital avatars that accumulate in piles behind her. By ceding some control of Ladenheim’s body to you, mediating that control through various machines and programs, and punting an interpretive task back out to you in a reinforcing feedback loop, this performance asks: who is more vulnerable, the judge or the judged? Who is in charge of the body in a technologically mediated world?
MAXMachina Artist Profile
Kate Ladenheim is a choreographer, media designer, and creative technologist. Her work spans interactive installations, media design, performance, and robotics. She researches bodies in motion and how they impact, and are impacted by, systems of social and technological pressure. Ladenheim holds an M.F.A. in Media Design Practices from ArtCenter College of Design. She has conducted research in motion interfaces for robotics design at UCLA, and was the 2019–2020 Artist in Residence at the Robotics, Automation, & Dance Lab at the University of Illinois at Urbana-Champaign. Her artistic projects have been presented internationally, including at The Invisible Dog, National Sawdust, GrizzlyGrizzly, Brown University, Joe’s Pub at The Public Theater, The Edinburgh Fringe Festival, and The Performance Arcade (New Zealand). Her work has been celebrated in Dance Magazine as one of their “25 to Watch” and “Best of 2018.”
Concept, Choreography, Media Design, Sensor Fabrication, & Performance: Kate Ladenheim
Creative Technologists: Mollye Bendell (Media Design & Programming) and Timothy Kelly (Projection Mapping & Programming)
Tech & Materials: Computer running Windows; projection; Rokoko motion capture (SmartSuit Pro and Studio software); Unity Engine running Oculus LipSync, Rokoko Studio, and custom C# programming; console C# application using the Google Sheets API and Google Cloud Text-to-Speech; Isadora; Arduino IDE & Arduino Uno microcontroller powering a custom pressure-sensitive mat; Processing with Spout; custom Node.js app for audience interaction; Reallusion Character Creator; & Autodesk Maya