I am a (relatively new) assistant professor of Computer Science at Cornell University. I specialize in computer graphics, computer vision, human-computer interaction (HCI), as well as a random sampling of other topics.
The Before Times:
I completed my PhD at MIT in 2016, where I was advised by Frédo Durand. I then worked as a postdoctoral researcher at Stanford University with Maneesh Agrawala until 2019, followed by a 1-year postdoc at Cornell Tech in NYC. I am now an assistant professor of computer science at Cornell University in Ithaca, NY.
Current office hours are Tuesdays at 5:05pm-6:05pm in Gates 307, or by appointment.
Interested in joining the group?
More info can be found on my Joining The Group page.
Update SIGGRAPH 2021: A Mathematical Foundation for Foundation Paper Pieceable Quilts
Our work on the geometry of paper-pieced quilt designs, with first author Mackenzie Leake, was published at SIGGRAPH 2021!
Update CVPR 2020 Oral: Visual Chirality nominated for Best Paper at CVPR 2020
Update ECCV 2020 Oral: Crowdsampling the Plenoptic Function
Select Research Highlight Videos
I work on a range of topics in graphics, vision, and HCI, with most of my research focusing on how to apply work in these fields to new problems and application spaces. Below you will find a sampling of videos that describe different projects I've worked on. For a longer list of my publications, see my publications page or CV. My TED 2015 talk is also a good introduction to my work on visual vibration analysis.
Click on the thumbnails below to see videos summarizing different research projects. More can be found on my publications page.
Speaking at TED 2015
- Check out the Wired article discussing some of our work on computational video editing.
- I will be serving on the papers committee for SIGGRAPH Asia 2019.
- Check out our online demo of results from our SIGGRAPH 2018 paper on Visual Rhythm and Beat.
- Code for the project can be found on my Github page.
Visual Rhythm & Beat Results
Abstract: We present a visual analogue for musical rhythm derived from an analysis of motion in video, and show that alignment of visual rhythm with its musical counterpart results in the appearance of dance. Central to our work is the concept of visual beats — patterns of motion that can be shifted in time to control visual rhythm. By warping visual beats into alignment with musical beats, we can create or manipulate the appearance of dance in video. Using this approach we demonstrate a variety of retargeting applications that control musical synchronization of audio and video: we can change what song performers are dancing to, warp irregular motion into alignment with music so that it appears to be dancing, or search collections of video for moments of accidentally dance-like motion that can be used to synthesize musical performances.
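To give a rough sense of the warping idea in the abstract, here is a minimal sketch, not the paper's actual method: it assumes we already have lists of visual and musical beat times, pairs each visual beat with its nearest musical beat, and then piecewise-linearly warps frame timestamps so visual beats land on musical beats. All function names and beat times here are hypothetical, for illustration only.

```python
import numpy as np

def align_beats(visual_beats, music_beats):
    # Pair each visual beat with its nearest musical beat
    # (illustrative simplification of beat matching).
    visual_beats = np.asarray(visual_beats, dtype=float)
    music_beats = np.asarray(music_beats, dtype=float)
    idx = np.abs(music_beats[None, :] - visual_beats[:, None]).argmin(axis=1)
    return music_beats[idx]

def warp_times(frame_times, visual_beats, target_beats):
    # Piecewise-linear time warp: frames at visual beats land exactly
    # on their target beats; times in between are interpolated.
    return np.interp(frame_times, visual_beats, target_beats)

# Hypothetical beat times, in seconds.
visual = [0.4, 1.1, 1.9, 2.6]
music = [0.5, 1.0, 1.5, 2.0, 2.5]
targets = align_beats(visual, music)
frames = np.linspace(0.4, 2.6, 5)
warped = warp_times(frames, visual, targets)
```

Retiming the video by sampling frames at the warped timestamps would then make the motion appear synchronized to the music; the real system handles beat detection, regular beat structure, and artifact-free warping, which this sketch omits.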
This example was a lot of fun to create. I found this YouTube channel for Dancing Nathan — a dog that does a strange flailing trick on a chair, which kind of looks like off-beat dancing.