Different hardware, same result

Martha Havenith and Marieke Schölvinck, research group leaders at the ESI, compare the natural behavior of rodents and non-human primates. They seek to understand how these two species achieve the same results despite having different brains. In this interview, the two explain their work.


Your research group compares the behavior of mice and monkeys. Why did you choose these species?

Martha Havenith: We work with mice and non-human primates because they are the two animal species most commonly studied in systems neuroscience. We are interested in how a group of neurons performs multiple tasks simultaneously and how the activities of the individual neurons reflect that. In other words, we look at cognitive processes, like learning, memory, or attention, and study them not separately, as many labs do, but simultaneously, to understand how these processes interact and how they overlap. And we do this under naturalistic conditions.

Please describe in more detail how you compare the behavior of the animals.

Marieke Schölvinck: We observe the vision-based decisions and reactions of these animals in a natural context. Whether monkey or mouse, the challenges they face in the wild are complex - and very similar for both animals. For example: Where is the best hiding place for my food? Will I find food in the same place as yesterday? Do I have to search again? Their brains, however, especially their visual cortices, are quite different. So how do these two species act similarly despite different hardware? How, for example, do they weigh the effort and risk of reaching a food source against its nutritional value?

Is this the kind of challenge you present to monkeys and mice in your experiments?

Martha Havenith: Exactly. We send the animals foraging through a virtual reality (VR) environment that mimics a forest and meadow landscape. This allows us to closely observe their visually driven decisions, deliberations, and reactions in a natural context. The mice navigate the VR world by walking on a large trackball, while the monkeys move through it by controlling the trackball with their hands. The task is the same for both species: as they move through the virtual landscape, they are repeatedly presented with two objects, for example a stone and a nut. They move towards one of the two objects, and when they have selected the correct one, they receive a reward - juice for the monkeys, soy milk for the mice.

What are you looking for as the animals move through the virtual world?

Marieke Schölvinck: During this simple search task, we follow the animal’s behavior in great detail. We pay particular attention to three specific parameters: 1. the paths taken in VR and how purposeful they are; 2. eye movements and pupil size; 3. videos of the monkeys’ faces and hands and of the mice’s faces and bodies, respectively. Paths and eye movements are easy to capture as numbers that change over time, but the videos also need to be quantified. To do this, we use a toolbox called DeepLabCut. With DeepLabCut, we track the position of specific body parts, such as wrists, eyebrows, cheeks, or ear tips, at any point in time.
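To illustrate what this quantification looks like in practice, here is a minimal sketch of a typical DeepLabCut workflow in Python. The project name, video files, and body-part labels are hypothetical placeholders; the interview only states that DeepLabCut is used to track body parts such as wrists, eyebrows, cheeks, and ear tips.

```python
# Minimal DeepLabCut workflow sketch. Project name, video files, and the
# body-part labels are hypothetical placeholders, not the group's actual setup.
import deeplabcut

# Create a project from one or more behavior videos. The returned path points
# to the project's config.yaml, where the body parts to be tracked
# (e.g. "wrist", "eyebrow", "cheek", "ear_tip") are listed.
config_path = deeplabcut.create_new_project(
    "foraging-pose", "esi-lab",
    ["videos/monkey_session01.mp4"],
    copy_videos=True,
)

# Extract and hand-label a subset of frames, then train and evaluate the network.
deeplabcut.extract_frames(config_path, mode="automatic")
deeplabcut.label_frames(config_path)            # opens the labeling GUI
deeplabcut.create_training_dataset(config_path)
deeplabcut.train_network(config_path)
deeplabcut.evaluate_network(config_path)

# Apply the trained network to new videos. The output is a table of
# (x, y, likelihood) per body part and frame, i.e. a time series per body part
# that can be analyzed alongside the VR paths and eye movements.
deeplabcut.analyze_videos(config_path, ["videos/monkey_session02.mp4"])
```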

What conclusions can you draw from how the animals accomplish this task?

Martha Havenith: We use the behavioral parameters we gain from these observations to estimate the animals’ different cognitive states. For example, if a mouse is highly attentive to the stimuli it is searching for, it might decide on an object more quickly, walk to it more directly, and start licking the drinking tube earlier because it expects a reward. All of these characteristics combined could then indicate a highly attentive state. Or the animal may still be at the beginning of the learning process and confuse different objects despite attentive perception, which a different set of behavioral parameters will reflect. The more parameters we have to describe the animal’s behavior, the better we can capture and recognize these cognitive states.
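As an illustration of this idea, the sketch below combines a handful of such per-trial parameters into a single feature matrix that downstream models can map onto cognitive states. The parameter names and values are invented for the example and do not represent the group’s actual pipeline.

```python
# Illustrative sketch (not the authors' actual pipeline): combine several
# behavioral read-outs into one feature vector per trial. All numbers are made up.
import numpy as np
import pandas as pd

trials = pd.DataFrame({
    "decision_time_s": [1.2, 0.8, 2.5, 0.9],    # time until the animal commits to an object
    "path_tortuosity": [1.1, 1.0, 1.8, 1.05],   # path length / straight-line distance in VR
    "lick_latency_s":  [0.3, 0.2, 1.4, 0.25],   # anticipatory licking before reward delivery
    "pupil_size_z":    [0.9, 1.1, -0.7, 1.0],   # z-scored pupil diameter around stimulus onset
    "correct":         [1, 1, 0, 1],            # whether the rewarded object was chosen
})

# z-score each behavioral parameter so they can be combined on a common scale;
# keep the choice itself separate as the outcome to be modeled.
behavior = trials.drop(columns="correct")
features = (behavior - behavior.mean()) / behavior.std(ddof=0)

X = features.to_numpy()            # trials x parameters design matrix
y = trials["correct"].to_numpy()   # observed choices, one per trial
print(X.shape, y.shape)
```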

What happens next?

Martha Havenith: Once we have all the parameters that track behavior over time, nearly 20 in total, we extract the underlying cognitive states using a combination of mathematical models, a so-called GLM-HMM, which combines a generalized linear model with a hidden Markov model. At the moment, we compute this separately for monkeys and mice, because it allows us to compare the animals and see, for example, that different behaviors lead to the same cognitive states in both species. This gives us a first clue about how they solve similar complex challenges, like foraging in a diverse sensory environment.
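The interview does not say which implementation is used, but one commonly used open-source GLM-HMM implementation is the "ssm" package from the Linderman lab. Below is a hedged sketch of how such a model could be fit to per-trial choices and roughly 20 behavioral parameters; the package choice, the number of states, and the random placeholder data are all assumptions made for illustration.

```python
# Hedged sketch of fitting a GLM-HMM with the open-source "ssm" package
# (https://github.com/lindermanlab/ssm). The package choice, state count, and
# placeholder data are assumptions; the interview only states that a GLM-HMM
# is used to infer cognitive states from ~20 behavioral parameters.
import numpy as np
import ssm

num_states = 3       # e.g. "engaged", "still learning", "disengaged" (assumed)
obs_dim = 1          # one choice per trial
num_categories = 2   # binary outcome: correct vs. incorrect
input_dim = 20       # ~20 behavioral parameters per trial, as in the interview

glmhmm = ssm.HMM(
    num_states, obs_dim, input_dim,
    observations="input_driven_obs",
    observation_kwargs=dict(C=num_categories),
    transitions="standard",
)

# Placeholder data: choices (num_trials, 1) and behavioral inputs (num_trials, input_dim)
choices = np.random.randint(0, 2, size=(500, 1))
inputs = np.random.randn(500, input_dim)

# Fit with expectation-maximization
glmhmm.fit(choices, inputs=inputs, method="em", num_iters=200, tolerance=1e-4)

# Most likely hidden (cognitive) state on each trial
states = glmhmm.most_likely_states(choices, input=inputs)
```

Fitting the same model architecture separately to the monkey and mouse data, as described in the interview, then allows the inferred state sequences to be compared across the two species.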

Sounds fascinating! You talk about “first clues” - what will be the next step?

Marieke Schölvinck: The next step for our research group is to measure the activity of large groups of neurons in the visual brain areas during foraging in the virtual environment. This will provide us with snapshots of the animals’ diverse behaviors, which we will then relate to various aspects of brain activity using state-of-the-art computational tools. Our goal here is to clarify, for the first time, how universal thought processes such as memory, learning, or attention unfold in real time in the brains of different animal species. Furthermore, we want to find out whether these processes are species-specific or whether evolution has solved them in the same way across species.

Thank you very much for the interview!


Original Publications
Shapcott KA, Weigand M, Glukhova I, Havenith MN, Schölvinck ML (2022). DomeVR: A setup for experimental control of an immersive dome virtual environment created with Unreal Engine 4. https://doi.org/10.1101/2022.04.04.486889 (preprint, accepted for publication)

Tlaie A, Shapcott KA, van der Plas T, Rowland J, Lees R, Keeling J, Packer A, Tiesinga P, Schölvinck ML, Havenith MN (2022). Does the brain care about averages? A simple test. https://doi.org/10.1101/2021.11.28.469673 (preprint)

van Heukelum S, Mars RB, Guthrie M, Buitelaar JK, Beckmann CF, Tiesinga PHE, Vogt BA, Glennon JC, Havenith MN (2020). Where is cingulate cortex? A cross-species view. Trends Neurosci 43(5), 285-299. https://doi.org/10.1016/j.tins.2020.03.007