S112726 Computational Ethology - Project Seminar in Biopsychology
What is behavior, and how can we measure it? Experimental psychologists operationalize behavior as task-relevant interactions with stimuli and rewards, such as the frequency of key pecking, reaction times, and the number of errors in specific tasks, while biologists classify behavior into qualitative clusters and analyze time spent grooming or the frequency and intensity of aggressive interactions. New advances in computer vision and machine learning have changed the way we measure the spatiotemporal dynamics of animal movement, but do we really understand animal behavior? In this seminar we will work with video data of human and non-human animals in different settings and apply cutting-edge machine learning techniques to extract spatial and temporal data describing the continuous stream of behavior.
Programming skills are not necessary, but technical affinity and basic computer skills will be advantageous. The course language may be either English or German, depending on students’ background. The seminar will not be graded, but active participation and classroom interaction are expected. Students will have to read some of the provided literature prior to the respective paper discussions. Wherever possible, the content of the seminar will be tailored to prospective bachelor’s thesis projects. After this seminar, students will have learned to use Python for data analysis, as well as state-of-the-art machine learning techniques for computational ethology such as DeepLabCut (Mathis et al., 2018), Anipose (Karashchuk et al., 2020), and VAME (Luxem et al., 2020).
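As a taste of the kind of Python data analysis practiced in the seminar, the sketch below computes per-frame movement speed from a 2D keypoint trajectory, the type of output produced by pose-estimation tools like DeepLabCut. The data here are synthetic and the function name `keypoint_speed` is illustrative, not part of any of the toolkits listed above.

```python
import numpy as np

def keypoint_speed(xy, fps):
    """Per-frame speed (units/s) from an (n_frames, 2) array of x, y coordinates."""
    displacement = np.diff(xy, axis=0)               # frame-to-frame (dx, dy)
    distance = np.linalg.norm(displacement, axis=1)  # Euclidean step length per frame
    return distance * fps                            # per-frame distance -> per-second speed

# Synthetic trajectory: a point moving 1 unit per frame along x, filmed at 30 fps
xy = np.column_stack([np.arange(100, dtype=float), np.zeros(100)])
speed = keypoint_speed(xy, fps=30)
print(speed.mean())  # → 30.0
```

Summary statistics like this (mean speed, time spent below a movement threshold) are simple examples of the quantitative behavioral descriptions the seminar builds toward.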
Access the course handbook here:
References and Recommended Literature
Anderson, D. J., & Perona, P. (2014). Toward a Science of Computational Ethology. Neuron, 84(1), 18–31. https://doi.org/10.1016/j.neuron.2014.09.005
Calhoun, A., & Hady, A. E. (2021). What is behavior? No seriously, what is it? [Preprint]. Animal Behavior and Cognition. https://doi.org/10.1101/2021.07.04.451053
Dunn, T. W. (2021). Geometric deep learning enables 3D kinematic profiling across species and environments. Nature Methods, 18, 17.
Gomez-Marin, A., Paton, J. J., Kampff, A. R., Costa, R. M., & Mainen, Z. F. (2014). Big behavioral data: Psychology, ethology and the foundations of neuroscience. Nature Neuroscience, 17(11), 1455–1462. https://doi.org/10.1038/nn.3812
Karashchuk, P., Rupp, K. L., Dickinson, E. S., Sanders, E., Azim, E., Brunton, B. W., & Tuthill, J. C. (2020). Anipose: A toolkit for robust markerless 3D pose estimation [Preprint]. Neuroscience. https://doi.org/10.1101/2020.05.26.117325
Lauer, J., Zhou, M., Ye, S., Menegas, W., Nath, T., Rahman, M. M., Di Santo, V., Soberanes, D., Feng, G., Murthy, V. N., Lauder, G., Dulac, C., Mathis, M. W., & Mathis, A. (2021). Multi-animal pose estimation and tracking with DeepLabCut [Preprint]. Animal Behavior and Cognition. https://doi.org/10.1101/2021.04.30.442096
Luxem, K., Fuhrmann, F., Kürsch, J., Remy, S., & Bauer, P. (2020). Identifying Behavioral Structure from Deep Variational Embeddings of Animal Motion [Preprint]. https://doi.org/10.1101/2020.05.14.095430
Mathis, A., Mamidanna, P., Cury, K. M., Abe, T., Murthy, V. N., Mathis, M. W., & Bethge, M. (2018). DeepLabCut: Markerless pose estimation of user-defined body parts with deep learning. Nature Neuroscience, 21(9), 1281–1289. https://doi.org/10.1038/s41593-018-0209-y
Nath, T., Mathis, A., Chen, A. C., Patel, A., Bethge, M., & Mathis, M. W. (2019). Using DeepLabCut for 3D markerless pose estimation across species and behaviors. Nature Protocols, 14(7), 2152–2176. https://doi.org/10.1038/s41596-019-0176-0
Pereira, T. D., Shaevitz, J. W., & Murthy, M. (2020). Quantifying behavior to understand the brain. Nature Neuroscience, 23(12), 1537–1549. https://doi.org/10.1038/s41593-020-00734-z