In the late 1800s a revolution began. Photography made it possible to capture and study human and animal motion, and at the same time electrical signals were recorded for the first time from the surfaces of living brains. Today, modern computer vision and neuroscience are coming together to reveal clues about how the brain controls the complex movements of our bodies.
We seek insight into how the brain controls natural behavior in natural environments. To that end we study:
- capturing natural behavior with motion capture
- modeling motor cortical activity during natural behavior
- developing new neural decoding algorithms
- applying our models to brain-machine interfaces with implanted electrocortical arrays
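To make the decoding item above concrete, here is a minimal, purely illustrative sketch of a linear neural decoder: ridge regression mapping binned spike counts to 2D hand velocity. All data are synthetic stand-ins; real decoders for natural behavior add temporal models (e.g., Kalman filtering) and are fit to recorded neural activity.

```python
import numpy as np

# Illustrative sketch only: a linear decoder from binned spike counts to
# hand velocity. The tuning model and all data here are synthetic.
rng = np.random.default_rng(0)
n_bins, n_neurons = 500, 30

# Synthetic ground truth: each neuron's rate is a linear function of
# 2D hand velocity, plus noise.
velocity = rng.standard_normal((n_bins, 2))          # (vx, vy) per time bin
tuning = rng.standard_normal((2, n_neurons))
rates = velocity @ tuning + 0.1 * rng.standard_normal((n_bins, n_neurons))

# Ridge regression, closed form: W = (X^T X + lam I)^{-1} X^T Y
lam = 1e-2
X, Y = rates, velocity
W = np.linalg.solve(X.T @ X + lam * np.eye(n_neurons), X.T @ Y)

decoded = X @ W
r2 = 1 - np.sum((Y - decoded) ** 2) / np.sum((Y - Y.mean(axis=0)) ** 2)
print(f"decoded velocity R^2 = {r2:.3f}")
```

Because the synthetic rates are nearly linear in velocity, the decoder recovers the kinematics almost exactly; real neural data are far noisier and nonstationary.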
We are just at the beginning of our ability to capture animal motion. Our current work focuses on learning models of animal shape and motion that match the quality of existing models of humans; this is a challenging task.
Unlike with humans, it is impossible to capture 3D scans of thousands of animals of varying shape in a wide variety of poses. To build such models we use heterogeneous sources of data of varying quality, and we are developing new models and algorithms to learn high-quality 3D models from this data. While this presents many challenges, the ability to track animals (and groups of animals) in natural settings would revolutionize many fields of biology.
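The idea of learning a shape model from scans of varying quality can be sketched in miniature: below, a low-dimensional linear shape space is recovered by PCA from synthetic noisy "scans" (mean shape plus blend shapes). This is a hedged toy, not the actual pipeline, which must also handle pose, registration, and missing data.

```python
import numpy as np

# Hedged sketch: recover a linear shape space from noisy synthetic scans
# of heterogeneous quality. All sizes and data are toy stand-ins.
rng = np.random.default_rng(1)
n_scans, n_vertices = 200, 300
n_components = 5                         # true shape-space dimensionality

# Synthetic ground truth: mean shape + linear blend shapes.
mean_shape = rng.standard_normal(3 * n_vertices)
basis = rng.standard_normal((n_components, 3 * n_vertices))
coeffs = rng.standard_normal((n_scans, n_components))

# Scans of varying quality: each scan has its own noise level.
noise_scale = rng.uniform(0.05, 0.3, size=(n_scans, 1))
scans = (mean_shape + coeffs @ basis
         + noise_scale * rng.standard_normal((n_scans, 3 * n_vertices)))

# Fit a shape space by PCA (SVD of mean-centered scans).
centered = scans - scans.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / np.sum(s ** 2)

print(f"variance explained by top {n_components} PCs: "
      f"{explained[:n_components].sum():.3f}")
```

Since the signal here dominates the per-scan noise, the top few principal components capture nearly all of the variance; with real scans the gap is far less clean, which is why new models and algorithms are needed.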
We also study how optical flow could be coded in, and computed by, brains.
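As a reference point for what "computing optical flow" means, the sketch below estimates the flow of a translating synthetic image with the classic Lucas-Kanade least-squares method. This is a standard computer-vision baseline, not a model of neural computation; the image and motion are synthetic.

```python
import numpy as np

# Hedged sketch: Lucas-Kanade optical flow for a single translating patch.
def make_image(shift_x, shift_y, size=64):
    """Smooth synthetic image translated by (shift_x, shift_y) pixels."""
    y, x = np.mgrid[0:size, 0:size].astype(float)
    return np.sin(0.3 * (x - shift_x)) + np.cos(0.2 * (y - shift_y))

true_u, true_v = 0.5, -0.3                # sub-pixel ground-truth motion
im0 = make_image(0.0, 0.0)
im1 = make_image(true_u, true_v)

# Spatial and temporal derivatives (central differences).
Ix = (np.roll(im0, -1, axis=1) - np.roll(im0, 1, axis=1)) / 2.0
Iy = (np.roll(im0, -1, axis=0) - np.roll(im0, 1, axis=0)) / 2.0
It = im1 - im0

# Lucas-Kanade: solve [Ix Iy] [u v]^T = -It in least squares over the
# patch, trimming borders to avoid np.roll wraparound artifacts.
s = np.s_[2:-2, 2:-2]
A = np.stack([Ix[s].ravel(), Iy[s].ravel()], axis=1)
b = -It[s].ravel()
(u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"estimated flow: u={u:.2f}, v={v:.2f} (true: {true_u}, {true_v})")
```

Because the image is smooth and the displacement is sub-pixel, a single least-squares solve recovers the motion well; questions about how populations of neurons might encode and compute such flow signals are exactly what this line of work investigates.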