S118924 Tracking Animal Behavior – Block Seminar
Description
New advances in computer vision and machine learning have changed the way we measure the spatiotemporal dynamics of animal movement from video data. But do we really understand animal behavior? This hands-on seminar will introduce new advances in the field of computational neuroethology and teach you how to use animal tracking software such as DeepLabCut (Mathis et al., 2018) and BORIS (Friard et al., 2016) to analyze animal behavior. We will learn basic Python skills for data analysis as well as state-of-the-art computer vision techniques for analyzing video data. You will work on individual projects to gain practical experience, and we will end by discussing the boundaries of what constitutes behavior: from mere location in space, to body pose, movement, and goal-orientedness.
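As a taste of the kind of data analysis covered in the seminar: DeepLabCut exports pose estimates as tables with one row per video frame and, for each tracked body part, `x` and `y` pixel coordinates plus a `likelihood` (confidence) score. The sketch below uses made-up toy values in a simplified version of that layout (real DeepLabCut files also carry an extra "scorer" column level) to show a typical first analysis step: masking low-confidence detections and computing per-frame movement speed.

```python
import numpy as np
import pandas as pd

# Toy pose data in a simplified DeepLabCut-like layout: one row per frame,
# a (bodypart, coordinate) column MultiIndex with x, y, and likelihood.
# All values are illustrative, not real tracking output.
columns = pd.MultiIndex.from_product(
    [["nose"], ["x", "y", "likelihood"]], names=["bodyparts", "coords"]
)
poses = pd.DataFrame(
    [[10.0, 5.0, 0.99],
     [13.0, 9.0, 0.98],
     [13.5, 9.2, 0.40],   # low-confidence detection
     [16.0, 13.0, 0.97]],
    columns=columns,
)

# Mask unreliable detections before computing kinematics.
confident = poses[("nose", "likelihood")] >= 0.9
x = poses[("nose", "x")].where(confident)
y = poses[("nose", "y")].where(confident)

# Per-frame displacement (pixels/frame); NaN wherever a frame was masked.
speed = np.sqrt(x.diff() ** 2 + y.diff() ** 2)
print(speed.tolist())  # frame 1 moved 3 px in x and 4 px in y -> 5.0 px/frame
```

In a real project you would load the tracking table with `pandas`, choose a likelihood threshold suited to your video quality, and convert pixels per frame into physical units using the camera calibration and frame rate.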
Expectations
Programming skills are not necessary, but technical affinity and basic computer skills will be advantageous. The course language may be either English or German, depending on students' background. The block seminar will consist of one introductory session and six seminar days, grouped into three consecutive blocks. The final schedule will be discussed on Friday, October 22nd, 2021. The course will consist of lectures, discussions, group improvs, and hands-on exercises. Group improvs are an active learning strategy consisting of short (unprepared) presentations of ongoing group projects. The seminar will not be graded, but active participation and classroom interaction are expected.
Course Handbook
Access the course handbook here:
References and Recommended Literature
Friard, O., & Gamba, M. (2016). BORIS: a free, versatile open-source event-logging software for video/audio coding and live observations. Methods in Ecology and Evolution, 7(11), 1325–1330. https://doi.org/10.1111/2041-210X.12584
Lauer, J., Zhou, M., Ye, S., Menegas, W., Nath, T., Rahman, M. M., Di Santo, V., Soberanes, D., Feng, G., Murthy, V. N., Lauder, G., Dulac, C., Mathis, M. W., & Mathis, A. (2021). Multi-animal pose estimation and tracking with DeepLabCut [Preprint]. Animal Behavior and Cognition. https://doi.org/10.1101/2021.04.30.442096
Mathis, A., Mamidanna, P., Cury, K. M., Abe, T., Murthy, V. N., Mathis, M. W., & Bethge, M. (2018). DeepLabCut: Markerless pose estimation of user-defined body parts with deep learning. Nature Neuroscience, 21(9), 1281–1289. https://doi.org/10.1038/s41593-018-0209-y
Mathis, A., Schneider, S., Lauer, J., & Mathis, M. W. (2020). A Primer on Motion Capture with Deep Learning: Principles, Pitfalls, and Perspectives. Neuron, 108(1), 44–65. https://doi.org/10.1016/j.neuron.2020.09.017
Mathis, M. W., & Mathis, A. (2020). Deep learning tools for the measurement of animal behavior in neuroscience. Current Opinion in Neurobiology, 60, 1–11. https://doi.org/10.1016/j.conb.2019.10.008
Nath, T., Mathis, A., Chen, A. C., Patel, A., Bethge, M., & Mathis, M. W. (2019). Using DeepLabCut for 3D markerless pose estimation across species and behaviors. Nature Protocols, 14(7), 2152–2176. https://doi.org/10.1038/s41596-019-0176-0