Researchers have long studied the way mice behave, interact, and react to changes in their environment to gain insights about humans and other animals. This summer, two Randolph students added sound to that body of knowledge.
Katrin Schenk, a Randolph physics professor, has involved students in research on mouse vocalizations for several years. This year, she asked Sarah Grissom ’17 and Graham Southwick ’17 to help find out what humans can learn by actually listening to the mice and judging how they sound.
“They are trying to rate mouse vocalizations in a way they have never been rated before,” Schenk said. “It may not work at all, which is what research is all about. We’re literally just saying, can you do this?”
Although mouse calls are ultrasonic, computers can modify the sound waves and make them audible for humans. Normally, the calls are analyzed by the shape of a graph of the frequencies. But the researchers wanted to find out what they could learn by rating the emotional prosody of the mouse’s vocalizations. In other words, does the mouse sound excited, happy, sad, or lonely?
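One simple way a computer can make an ultrasonic call audible is to play the recorded samples back at a lower sample rate, which divides every frequency in the call by the same factor. The sketch below (a hypothetical illustration, not the lab's actual pipeline; the 70 kHz call and the slowdown factor of 16 are assumed values) shows this with a synthetic tone:

```python
import numpy as np

# Hypothetical example: a 70 kHz mouse call recorded at 250 kHz,
# both well above the range of human hearing for the call itself.
fs = 250_000          # recording sample rate (Hz)
f_call = 70_000       # call frequency (Hz), ultrasonic
t = np.arange(0, 0.05, 1 / fs)
call = np.sin(2 * np.pi * f_call * t)

# Play the same samples back 16x slower: the sample rate and every
# frequency in the signal are divided by 16, bringing the call into
# the audible range (70 kHz / 16 ≈ 4.4 kHz).
factor = 16
fs_playback = fs // factor            # 15,625 Hz

# Verify the shift by locating the spectral peak at the playback rate.
spectrum = np.abs(np.fft.rfft(call))
freqs = np.fft.rfftfreq(len(call), 1 / fs_playback)
peak_hz = freqs[np.argmax(spectrum)]  # ≈ 4,375 Hz, audible
```

Slowing playback also stretches the call in time, which is often acceptable for listening tasks; pitch-shifting methods that preserve duration exist but are more involved.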
“Emotional prosody is based on your tone and how you say things: your pitch, how loud you speak, elongation of words or calls,” said Grissom. “In this case, that’s one of the key markers for these calls that we’re going to look at.”
Understanding more about the way mice communicate “would help validate mouse models of diseases, such as autism, where there is a social deficit,” Grissom added.
“There are several instances when mice make ultrasonic calls,” Southwick explained. “Those instances include when a mother mouse calls to her pups, when two male mice are fighting over territory, and when a male mouse makes a call to a female mouse.” The latter setting was the focus of their study.
The team worked with recordings that a lab at the University of California, San Francisco, made of mouse interactions that included a mouse with Fragile X syndrome, a genetic cause of autism. Southwick prepared the video for study by adding a marker on some frames to help identify and keep track of the male and female mice as they moved around on the screen. This allows the analysis of the male mouse’s vocalizations to be matched to what the mouse is doing at each point in the recording.
Grissom learned the MATLAB programming language and developed a graphical user interface that allows users to listen to mouse calls and then give their judgment on the emotional content of each call.
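The interface Grissom built was in MATLAB; as a language-neutral illustration, the kind of bookkeeping such a tool performs, storing each listener's prosody judgment per call and averaging across listeners, can be sketched as follows (all names, the 1–5 scale, and the data layout are hypothetical, not the study's actual design):

```python
from statistics import mean

# Each entry records one listener's emotional-prosody judgment
# of one call, on an assumed 1-5 scale.
ratings = []

def record_rating(call_id, listener, score):
    """Store one prosody judgment (1 = flat, 5 = highly excited)."""
    if not 1 <= score <= 5:
        raise ValueError("score must be between 1 and 5")
    ratings.append({"call": call_id, "listener": listener, "score": score})

def mean_score(call_id):
    """Average rating for one call across all listeners."""
    return mean(r["score"] for r in ratings if r["call"] == call_id)

# Two listeners rate the same (hypothetical) call:
record_rating("call_001", "rater_a", 4)
record_rating("call_001", "rater_b", 5)
```

Averaging across listeners is what would let the researchers check whether human ratings agree with one another, a prerequisite for using them as a measurement tool.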
Their work laid the foundation for continued study that could, if proven effective, provide another way to analyze how mice communicate.
“If we find out that our rating does make sense, then it becomes a tool to assess differences between a Fragile X mouse and another mouse,” Schenk said.