Background & Aims
The accurate quantification of pain behavior in rodents is a challenging yet critical task for elucidating the underlying biological processes and mechanistic pathways associated with pain. Recent advances in deep learning computer vision have revolutionized the way behavior can be measured in animals1,2,3. This shift towards computer models enables enhanced throughput and accuracy, surpassing the capabilities of manual and traditional computer vision scoring methods. Here, we present two deep learning (DL) applications that have dramatically improved our ability to assess preclinical pain behavior. The first system, DeepView, is a video-based DL application that quantifies rodent behavior on a frame-by-frame basis. The second application, Gait.AI, utilizes DL instance segmentation maps to identify and track rodent body parts for gait analysis on a treadmill system.
Methods
The DeepView application employs supervised learning to train a model that quantifies rodent behaviors observed in open-field chambers. Researchers scored rodent videos, providing precise examples of each behavior. The models were trained on static JPG images extracted from these videos, with an optical flow technique used to represent motion on the images; preliminary investigations indicated that this motion representation improved accuracy.
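The idea of encoding motion into a static training image can be illustrated with a minimal sketch. Note the actual DeepView pipeline uses an optical flow technique; here motion is approximated by simple frame differencing, and the function name `motion_overlay` is hypothetical, not from the source.

```python
import numpy as np

def motion_overlay(prev_frame, frame, alpha=0.5):
    """Blend a per-pixel motion map into a grayscale frame.

    Simplified stand-in for the motion representation described in the
    Methods: motion is approximated here by absolute frame differencing,
    whereas DeepView itself uses optical flow.
    """
    motion = np.abs(frame.astype(np.float32) - prev_frame.astype(np.float32))
    motion = motion / (motion.max() + 1e-6) * 255.0   # normalize motion to 0-255
    blended = (1 - alpha) * frame + alpha * motion    # overlay motion cue on frame
    return blended.clip(0, 255).astype(np.uint8)

# Static regions contribute no motion signal; a region the animal moved
# into lights up in the overlay, giving the classifier a motion cue.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[20:30, 20:30] = 200  # e.g., a paw moved into this region
out = motion_overlay(prev, curr)
```

Each such composite image can then be fed to a standard frame-level image classifier, so the model sees both appearance and movement in a single input.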
For the Gait.AI application, instance segmentation models were developed by labeling rodent images obtained from a transparent treadmill system. Masks were created for each body part and background. Considering the pixel size disparity among different body parts, an ensemble of individual models was trained by fine-tuning pre-existing models. This ensemble was then deployed to detect body parts on a frame-by-frame basis from high-speed videos, utilizing positional and pixel data to calculate various temporal, spatial, and relational gait features with high accuracy.
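The step from per-frame segmentation masks to gait features can be sketched as follows. This is an illustrative reduction, not the Gait.AI implementation: the helper names (`paw_centroid`, `stride_lengths`) and the belt-contact convention are assumptions, and the real feature set is far richer (temporal, spatial, and relational features).

```python
import numpy as np

def paw_centroid(mask):
    """(x, y) centroid of a boolean segmentation mask for one body part."""
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

def stride_lengths(x_positions, contact):
    """Stride length per step from per-frame paw x-positions.

    `contact` is a boolean array marking frames where the paw touches
    the treadmill belt; one stride is the distance between successive
    touch-down positions.
    """
    # Frames where contact flips from False to True (touch-downs).
    touchdowns = np.flatnonzero(np.diff(contact.astype(int)) == 1) + 1
    return np.diff(x_positions[touchdowns])

# Synthetic example: three touch-downs spaced 20 units apart on the belt.
x = np.array([0.0, 10, 11, 12, 30, 31, 32, 50, 51])
contact = np.array([0, 1, 1, 0, 1, 1, 0, 1, 1], dtype=bool)
strides = stride_lengths(x, contact)  # two strides of length 20
```

Temporal features such as stance and swing duration fall out of the same `contact` signal (run lengths of True and False), which is why accurate per-frame paw detection is the limiting step.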
Results
The DeepView application has demonstrated accuracy in detecting and quantifying rodent behavior at, or exceeding, the level of human scoring. Here we show that DeepView quantification of nocifensive behavior (rear paw licking) in the formalin assay is nearly identical to human scoring in both rat and mouse models.
The application of DL segmentation models to gait assays dramatically improves the detection of each paw and makes gait assays easier for users to analyze. Here we show that historical data from the CleverSys TreadScan system and our DL Gait.AI application provide similar results across rat gait impairment models.
Conclusions
Here we have demonstrated the significant enhancements in accuracy, throughput, and usability achieved through the application of deep learning techniques in two distinct contexts of behavioral pain assays. The success observed in these applications suggests promising avenues for further development and broader utilization of these models across various assays, modalities and research domains. The implications extend beyond the immediate applications discussed, offering opportunities to advance our comprehension of the biological mechanisms of pain behavior as well as improve the drug discovery process for novel therapeutics.
References
1. Mathis et al. 2018. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nature Neuroscience, 21, 1281-1289.
2. Pereira TD et al. 2020. SLEAP: multi-animal pose tracking. bioRxiv, 10.1101/2020.12.04.405159
3. Wiltschko AB et al. 2015. Mapping sub-second structure in mouse behavior. Neuron, 88, 1121-1135.
Presenting Author
Benjamin Adams
Poster Authors
Benjamin Adams
MSc
Eli Lilly & Company
Lead Author
Carlie Priddy
MS
Advanced Testing Labs, Indianapolis
Lead Author
Kyle Kelly
BS
Eli Lilly & Co.
Lead Author
Wenhong Guo
MS
Eli Lilly & Company
Lead Author
Murat Dundar
PhD
Eli Lilly & Co.
Lead Author
Shudanshu Shakhar
Eli Lilly & Co.
Lead Author
Pallav Bora
BS
Eli Lilly & Co.
Lead Author
Kelly Knopp
PhD
Eli Lilly & Co.
Lead Author
Topics
- Novel Experimental/Analytic Approaches/Tools