Comprehensive diagram of the entire mouse brain showing cells detected by DELiVR. Each color represents a different brain region and indicates the spatial distribution of neuronal activity. Credit: Luciano Höher / Helmholtz Munich
Researchers from Helmholtz Munich and LMU Munich University Hospital have introduced DELiVR, a new AI-based approach to the complex task of brain cell mapping. The findings are published in the journal Nature Methods.
Deep learning tools democratize advanced neuroscience by eliminating the need for coding expertise. DELiVR enables biologists to efficiently investigate disease-related spatial cell dynamics, facilitating the development of precision therapeutics to enhance patient care.
Democratizing 3D brain analysis
Many diseases are associated with changes in the expression of specific proteins in the brain. To study them, scientists track how protein expression shifts during disease progression in model organisms. Imaging an entire mouse brain generates a huge dataset that requires accurate quantification methods for meaningful interpretation. However, identifying labeled cells within such large-scale 3D image data is difficult.
Artificial intelligence (AI) holds promise for data analysis, but it typically requires extensive data annotation and advanced coding skills, limiting its use to specialized laboratories. Therefore, the research team aimed to overcome these barriers and democratize 3D analysis for broader scientific access.
Virtual reality empowers researchers
To accurately quantify specific cells in brain images, the research team first trained an AI algorithm to identify specific cells in 3D microscopy images. By leveraging virtual reality (VR) for label generation, the researchers immersed themselves in the images and annotated cells directly in 3D, a faster and more accurate method than the traditional 2D slice-based approach.
The team then used these VR-generated labels to train an AI algorithm to automatically identify active neurons. They integrated cell detection, matching with a brain atlas, and visualization of the results into the DELiVR (Deep Learning and Virtual Reality mesoscale annotation) pipeline.
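The three stages described above (detect cells, match detections to an atlas, summarize per region) can be sketched as a minimal toy pipeline. This is purely illustrative: the function names, the thresholding stand-in for the deep-learning segmentation step, and the toy volume are all assumptions, not the actual DELiVR code or API.

```python
# Hypothetical sketch of a DELiVR-style pipeline: segment cells in a 3D
# volume, map each detection to a brain-atlas region, and count cells per
# region. Names and data are illustrative, not the real DELiVR interface.
import numpy as np
from collections import Counter

def detect_cells(volume, threshold=0.5):
    """Stand-in for the deep-learning segmentation step:
    return voxel coordinates whose intensity exceeds a threshold."""
    return np.argwhere(volume > threshold)

def assign_regions(coords, atlas):
    """Atlas-matching step: look up the region label of each detection."""
    return [int(atlas[tuple(c)]) for c in coords]

def count_per_region(labels):
    """Quantification step: number of detected cells per brain region."""
    return Counter(labels)

# Toy 4x4x4 "brain" with two atlas regions (1 and 2) and two bright cells.
volume = np.zeros((4, 4, 4))
volume[0, 0, 0] = 0.9   # cell in region 1
volume[3, 3, 3] = 0.8   # cell in region 2
atlas = np.ones((4, 4, 4), dtype=int)
atlas[2:, :, :] = 2     # back half of the volume belongs to region 2

coords = detect_cells(volume)
counts = count_per_region(assign_regions(coords, atlas))
print(dict(counts))  # {1: 1, 2: 1}
```

In the real pipeline, the thresholding step is replaced by a trained deep-learning model, and the atlas lookup involves registering the imaged brain to a reference atlas rather than a direct voxel index.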
The system works seamlessly with Fiji, an open source software for image analysis, in an end-to-end workflow. DELiVR also has customizable features that allow researchers to train against specific cell types, such as microglia, essential immune cells in the brain, demonstrating its adaptability to diverse research projects.
“Essentially, DELiVR provides a seamless solution for identifying and analyzing cells throughout the brain, allowing scientists to understand their role and behavior in both health and disease without requiring coding expertise. DELiVR is a step toward developing new therapeutic interventions that may ultimately improve the quality of life for individuals affected by debilitating conditions,” said Professor Ali Ertürk, who led the development of the tool at Helmholtz Munich.
Use case: Cancer-related weight loss
To demonstrate the power of DELiVR, the research team showed how it can change our understanding of how cancer affects brain activity. Focusing on the critical clinical challenge of tumor-induced weight loss, they discovered specific brain activity patterns that distinguish cancers that induce weight loss in mice from those that do not.
Dr. Doris Kaltenecker, lead author of the study that introduced DELiVR, said: “Our findings using DELiVR have revealed potential therapeutic targets within specific brain regions. This may pave the way to promising strategies to combat cancer-related weight loss.”
For more information:
Doris Kaltenecker et al., Virtual reality-empowered deep learning analysis of brain cells, Nature Methods (2024). DOI: 10.1038/s41592-024-02245-2