An Automated System to Measure Animal Body Part Dynamics

Web Published:
3/3/2020
Description:

SLEAP (Social LEAP Estimates Animal Poses):

An Automated System to Measure Multi-Animal Body Part Dynamics

 

 

Princeton Docket # 19-3510

 

        Analysis of laboratory animal movements is a key element in a variety of behavioral studies. The current standard for analyzing animal movement relies on manual video annotation, which is time-consuming, mentally taxing, and error-prone. Faced with these challenges, researchers at Princeton University have developed SLEAP (Social LEAP Estimates Animal Poses), an AI-driven multi-animal pose estimation software package for tracking individual body part kinematics of single or multiple interacting animals.

 

        SLEAP works with any animal or video by learning from as few as 10 user-provided examples of the locations of body parts of interest within the user’s own video frames. Providing examples to SLEAP is facilitated by a turnkey graphical user interface designed for practitioners without prior technical expertise. The system learns from user feedback through interactive correction of its predictions, greatly reducing the time and cost of generating training data for a production-quality machine learning model. SLEAP can be configured to locate the animal(s) within each video frame, detect their individual body parts, associate those parts with the correct individual, and track each animal across frames. The output of SLEAP is a set of complete body part trajectories, akin to the “motion capture” systems commonly used in film production, but without the need for specialized hardware, motion capture suits, or body markers.
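To illustrate what such trajectory output enables, the sketch below (not part of SLEAP itself; the array layout and names are hypothetical) computes per-frame speed of a tracked body part from a trajectory array of shape (frames, body parts, xy), the kind of kinematic measurement these "motion capture" style trajectories make possible:

```python
import numpy as np

# Hypothetical pose-tracking output: one animal, 3 body parts
# (e.g., head, thorax, tail) tracked over 5 video frames.
# Shape: (frames, body_parts, xy-coordinates)
trajectory = np.array([
    [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]],
    [[0.0, 1.0], [1.0, 1.0], [2.0, 1.0]],
    [[0.0, 2.0], [1.0, 2.0], [2.0, 2.0]],
    [[0.0, 4.0], [1.0, 4.0], [2.0, 4.0]],
    [[0.0, 7.0], [1.0, 7.0], [2.0, 7.0]],
])

# Frame-to-frame displacement of each body part...
displacement = np.diff(trajectory, axis=0)

# ...and its magnitude: per-frame speed, in pixels per frame.
speed = np.linalg.norm(displacement, axis=-1)

print(speed[:, 0])  # speed of the first body part across frames
```

Downstream analyses (e.g., behavioral classification or drug-screen readouts) typically build on simple kinematic features like these, computed over the full set of tracked individuals.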

 

        This is a major improvement over other available software packages, which provide generalized behavioral information but lack full body position information. SLEAP generates accurate, detailed descriptions of animal movements while remaining fast and easy to operate. In addition, SLEAP’s flexible AI architecture accommodates variation in imaging settings by learning from diverse examples. The program has been shown to accurately track body part movements of a diverse set of animals including fruit flies, bees, mice, tigers, giraffes, and more. SLEAP’s inventors are seeking a partner to further develop the software and license it to industry laboratories or other organizations.

 

 

Applications

-Tracking of individual body part kinematics of multiple animals

-High throughput analysis of animal behaviors (e.g., drug screens)

-Development of disease diagnostics for animal models

-Wildlife conservation and monitoring of animal well-being (e.g., agrotech)

 

Advantages       

-Learns from few user examples

-Fast training of deep neural networks on consumer hardware

-Highly accurate and robust to variability in imaging conditions

-Single or multiple animal tracking

-No programming skills required

 

 

Intellectual Property & Development Status

 

Patent protection is pending.

 

Princeton is currently seeking commercial partners for the further development and commercialization of this opportunity.

 

 

Publications

 

Pereira, T. D., Ravindranath, S., Tabris, N., Li, J., Kislin, M., Wang, S. S. H., Murthy, M., & Shaevitz, J. W. (2020). SLEAP: Multi-animal pose tracking. In prep.

 

Pereira, T. D., Aldarondo, D. E., Willmore, L., Kislin, M., Wang, S. S. H., Murthy, M., & Shaevitz, J. W. (2019). Fast animal pose estimation using deep neural networks. Nature Methods, 16(1), 117.

 

 

The Inventors

 

Joshua Shaevitz is a professor of Physics and Biophysics at the Lewis-Sigler Institute for Integrative Genomics, the Department of Molecular Biology, and the Princeton Neuroscience Institute at Princeton University. His research interests focus on the determinants of cell shape, the formation of complex cellular patterns, and how behaviors are organized by the brain. He received a Ph.D. and M.S. in Physics from Stanford University and a B.A. in Physics from Columbia University. He is an HHMI Janelia Farm Visiting Scientist and has received numerous awards, including the Presidential Early Career Award for Scientists and Engineers, the NSF CAREER Award, and the Human Frontier Science Program Young Investigator Award.

 

Mala Murthy is a professor of Molecular Biology and the Princeton Neuroscience Institute at Princeton University. Her research interests focus on how the brain converts sensory stimuli into meaningful representations and how these representations drive behavioral responses. Murthy received a Ph.D. in Neuroscience from Stanford University and an S.B. in Biology from the Massachusetts Institute of Technology. She is an HHMI Faculty Scholar and has received numerous awards, including the NSF BRAIN Initiative Award, the Princeton Dean’s Innovation Fund for New Ideas, and the NIH Innovator Award.

 

Talmo Pereira is a Ph.D. candidate in Neuroscience co-advised by Joshua Shaevitz and Mala Murthy at Princeton University. His research interests focus on the application of computational methods, including deep learning and computer vision, to better understand the brain through quantification of animal behavior. He has received the NSF Graduate Research Fellowship and is a Porter Ogden Jacobus Fellow, Princeton University’s highest honor awarded to Ph.D. students. Recently, he was a Student Researcher at Google AI working on advancing deep learning methods for action recognition.

 

       

Contact:

 

Laurie Tzodikov

Princeton University Office of Technology Licensing

(609) 258-7256 • tzodikov@princeton.edu

 

Sean King

Princeton University Office of Technology Licensing

sbking@princeton.edu

 


Patent Information:

For Information, Contact:
Cortney Cavanaugh
New Ventures and Licensing Associate
Princeton University
609-258-7256
ccavanaugh@princeton.edu

Inventors:
Joshua Shaevitz
Mala Murthy
Talmo Pereira
Diego Aldarondo