Ideas That Matter: Use AI to Make Your Own Beat Saber Levels
About this Event
Are you curious about how AI models and their underlying machine learning techniques work? Do you have a favorite song you'd like to "play" in a virtual reality rhythm game like Beat Saber?
Inspired by Papers We Love, this second of two talks will explain in plain language the 2020 academic paper "DeepSaber: Deep Learning for high dimensional choreography," in which researchers expanded on the foundation of 2017's "Dance Dance Convolution" paper to build a new model that generates Beat Saber levels from audio input alone.
We'll also discuss the ethical and economic implications of building an AI model able to reproduce (at slightly lower quality but near-zero cost) the labor of the people whose game levels were used to train the paper's model. After the talk, attendees will be invited to submit their own songs to a popular hosted generative model, then inspect and play the resulting Beat Saber levels on the speaker's gaming desktop and PlayStation VR2 headset.
Presented by David Warden, Senior Systems Analyst & Managing Director of Research Technology in CIT.