A team of researchers led by DMCBH member Dr. Tim Murphy has created a synthetic animated mouse that can be used to train machine learning algorithms to analyze animal behaviour. The details were published this week in Nature Methods.
The project got its start during the COVID-19 pandemic when research at UBC was curtailed. Dr. Murphy teamed up with computer scientist Dr. Helge Rhodin, who had recently joined the UBC community, to explore the possibility of creating a synthetic animated mouse that could aid research efforts, specifically in the realm of automated behavioural analysis, well beyond the pandemic.
Synthetic data offers a unique opportunity to advance automated behavioural analysis by generating body part labels that are difficult or near impossible to annotate by hand, and by making it easier to control variation in datasets. This kind of automation has long been sought after in the neuroscience community: scoring animal movements by hand is tedious, time-consuming and can introduce biases.
“The synthetic animated mouse is very powerful when used in combination with DeepLabCut by removing the tedious hand labelling of videos,” says co-author Jeff LeDue. “We will see the biggest impact when this strategy is adopted in large team science projects and the benefits are multiplied across labs around the world.”
The team started by taking co-author Dr. Nancy Ford’s pre-existing scans of real mice and turning them into 3D models. The models were then animated into short sequences that placed the mouse in repeated and randomized poses, and different “scenes” were created, such as a running-wheel experiment. Next, a style transfer step, borrowed from the technology that underlies popular “deepfakes,” made the synthetic animated mouse appear more realistic, nearly indistinguishable from a real animal. Much of this work was done using remotely accessed computers in the NeuroImaging and NeuroComputation Centre at the DMCBH.
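To give a concrete sense of what such a pipeline produces, the sketch below shows a minimal domain-randomized labelling loop in Python. It is illustrative only and does not reproduce the paper’s actual rendering or animation code: the keypoint names are made up, and render_frame is a stand-in for whatever posing and rendering backend a lab might use.

```python
import csv
import random
from dataclasses import dataclass

# Hypothetical body parts; the real keypoints come from the 3D mouse model.
KEYPOINTS = ["snout", "left_ear", "right_ear", "left_forepaw", "right_forepaw", "tail_base"]


@dataclass
class SceneParams:
    """Randomized scene settings for one synthetic frame (illustrative only)."""
    camera_azimuth_deg: float
    camera_elevation_deg: float
    light_intensity: float
    pose_seed: int


def sample_scene(rng: random.Random) -> SceneParams:
    """Draw a random camera angle, lighting level and pose seed, so the dataset
    covers many viewing conditions rather than one fixed setup."""
    return SceneParams(
        camera_azimuth_deg=rng.uniform(0.0, 360.0),
        camera_elevation_deg=rng.uniform(10.0, 80.0),
        light_intensity=rng.uniform(0.3, 1.0),
        pose_seed=rng.randrange(10_000),
    )


def render_frame(params: SceneParams) -> dict:
    """Placeholder for the animation/rendering step. A real pipeline would pose
    the 3D mouse, render the scene, and read projected keypoint positions out of
    the renderer; here we just return made-up pixel coordinates."""
    rng = random.Random(params.pose_seed)
    return {name: (rng.uniform(0, 640), rng.uniform(0, 480)) for name in KEYPOINTS}


def generate_dataset(n_frames: int, out_csv: str, seed: int = 0) -> None:
    """Write keypoint labels for n_frames synthetic frames; this is the step
    that replaces hand annotation of video."""
    rng = random.Random(seed)
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame"] + [f"{k}_{ax}" for k in KEYPOINTS for ax in ("x", "y")])
        for i in range(n_frames):
            labels = render_frame(sample_scene(rng))
            writer.writerow([i] + [coord for k in KEYPOINTS for coord in labels[k]])


if __name__ == "__main__":
    generate_dataset(100, "synthetic_labels.csv")
```

Because the coordinates come straight from the renderer rather than from a human annotator, every synthetic frame arrives with pixel-accurate labels, which is what makes this kind of data attractive for training pose-estimation tools such as DeepLabCut.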
A unique aspect of this research is that the first author on the paper is an undergraduate student. Luis Bolaños got his start in the Murphy lab as a grade 11 intern from Terry Fox Secondary School just outside of Vancouver. Now a third-year undergraduate student in Computer Science at UBC, Bolaños had his co-op placement cancelled because of the pandemic and spent the summer working with the team to create the mouse and, most importantly, to prove it was useful. Bolaños, along with post-doctoral fellow Dr. Dongsheng Xiao, showed that synthetic mouse videos were as effective as real videos at training machine learning models to recognize subtle mouse behaviours, but with significant time savings and potentially unlimited scale.
Of course, developing this type of model takes time and requires a certain level of expertise with animation software. The next step in this project involves creating a script so that other labs won’t need the same animation expertise. Regardless, Dr. Murphy says the true power of using a synthetic mouse model is its flexibility.
“Machine learning models necessary to classify behavior thrive off diversity and fail when they are trained on homogeneous scenes,” says Dr. Murphy. “The synthetic mouse and synthetic data in general can be used to augment performance by widening the training repertoire with relatively little effort once created. A change in lighting or camera angle could break a model trained on a subset of real data, whereas synthetic data can train for these contingencies.”
The team hosted the project on Open Science Framework, and Dr. Murphy says the ultimate goal of this undertaking was to provide a synthetic animated mouse model to the community as a resource that can depict almost any experimental scenario so that others won’t have to reinvent the wheel.
Murphy and LeDue manage UBC’s Dynamic Brain Circuits in Health and Disease Research Excellence Cluster and the NeuroImaging and NeuroComputation Centre. Murphy is a professor in Psychiatry with an associate membership in the School of Biomedical Engineering. Xiao is a scholar within the Canadian Open Neuroscience Platform. The work was supported by grants from the Canadian Institutes of Health Research, the Canadian Partnership for Stroke Recovery and the Canadian Neurophotonics Platform.