A quick look under the skin


IMAGE: First author Oliver Schoppe at the Center for Translational Research on Cancer at the Technical University of Munich.

Credit: Astrid Eckert / TUM

Imaging techniques enable a detailed view inside an organism. But interpreting the data is time-consuming and requires a great deal of expertise. Artificial neural networks open up new possibilities: they need only seconds to interpret whole-body scans of mice and to segment and display the organs in color, instead of as different shades of gray. This greatly simplifies the analysis.

How big is the liver? Does it change if medication is taken? Is the kidney inflamed? Is there a tumor in the brain and have metastases already developed? To answer these questions, biologists and doctors have so far had to screen and interpret a wealth of data.

“The analysis of three-dimensional imaging data is very complex,” explains Oliver Schoppe. Together with an interdisciplinary research team, the TUM researcher has now developed self-learning algorithms to help analyze biological image data in the future.

At the heart of the AIMOS software – the abbreviation stands for AI-based Mouse Organ Segmentation – are artificial neural networks that, like the human brain, are capable of learning. “In the past, you had to tell computer programs exactly what you wanted them to do,” says Schoppe. “Neural networks don’t need instructions like that: it’s sufficient to show them a problem and example solutions many times. Gradually, the algorithms begin to recognize the relevant patterns and are able to find the right solutions themselves.”
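The contrast Schoppe describes – explicit instructions versus learning from examples – can be illustrated with the simplest possible toy model. This is not AIMOS’s actual architecture (a deep neural network), just a single artificial neuron that is never given a rule, only examples and its own error signal:

```python
# Toy illustration of "learning from examples" rather than explicit rules:
# a single artificial neuron learns to separate dark from bright values.
# (Purely illustrative -- AIMOS itself uses deep neural networks.)

def train_neuron(examples, epochs=20, lr=0.1):
    """examples: list of (intensity, label) pairs, with label 0 or 1."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in examples:
            pred = 1 if w * x + b > 0 else 0
            err = y - pred            # no rule is programmed in, only the error
            w += lr * err * x         # nudge the weight toward the correct answer
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w * x + b > 0 else 0

# Training data: dark values (label 0) vs. bright values (label 1)
examples = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]
w, b = train_neuron(examples)
```

After training, the neuron classifies unseen values correctly even though no threshold was ever written into the code – the pattern was extracted from the examples alone.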

Training self-learning algorithms

In the AIMOS project, the algorithms were trained with the help of images of mice. The goal was to assign the voxels of a 3D whole-body scan to specific organs, such as the stomach, kidneys, liver, spleen, or brain. Based on this assignment, the program can display the exact position and shape of each organ.
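What such a voxel-to-organ assignment yields can be sketched as follows – purely illustratively, with made-up label numbers and colors rather than AIMOS’s actual output format. A segmentation is simply a volume of integer labels, one per voxel, from which colored views and organ volumes (answering questions like “how big is the liver?”) can be derived:

```python
# Illustrative sketch: a segmentation as an integer label per voxel.
# Label IDs and colors are invented for this example.
import numpy as np

ORGAN_COLORS = {
    0: (0, 0, 0),        # background -> black
    1: (255, 0, 0),      # brain      -> red
    2: (0, 255, 0),      # liver      -> green
    3: (0, 0, 255),      # kidney     -> blue
}

def colorize(label_volume):
    """Turn a (D, H, W) label volume into a (D, H, W, 3) RGB volume."""
    rgb = np.zeros(label_volume.shape + (3,), dtype=np.uint8)
    for label, color in ORGAN_COLORS.items():
        rgb[label_volume == label] = color
    return rgb

def organ_volume_mm3(label_volume, label, voxel_size_um=6.0):
    """Organ volume, assuming an isotropic voxel spacing in micrometers."""
    voxel_mm3 = (voxel_size_um / 1000.0) ** 3
    return int((label_volume == label).sum()) * voxel_mm3

# Tiny fake segmentation: a 2x2x2 volume with one 'liver' voxel
seg = np.zeros((2, 2, 2), dtype=np.int32)
seg[0, 0, 0] = 2
```

The 6-micrometer default spacing matches the measuring-point distance mentioned later in this article; counting labeled voxels and multiplying by the voxel volume is a standard way to quantify organ size from such a segmentation.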

“We were lucky enough to have access to several hundred images of mice from a different research project, all of which had already been annotated by two biologists,” recalls Schoppe. The team also had access to fluorescence microscopic 3D scans from the Institute for Tissue Engineering and Regenerative Medicine at Helmholtz Zentrum München.

Using a special technique, the researchers had rendered the bodies of deceased mice transparent. The transparent bodies could then be imaged with a microscope, layer by layer. The distance between the measuring points was only six micrometers – equivalent to the size of a cell. Biologists had annotated the organs in these datasets as well.

Artificial intelligence improves accuracy

At TranslaTUM, the computer scientists fed the data into their new algorithms. And these learned faster than expected, Schoppe reports: “We only needed about ten whole-body scans before the software was able to analyze the image data on its own – and within seconds. A human would need hours to do the same.”

The team then tested the reliability of the artificial intelligence with the help of 200 further whole-body scans of mice. “The results show that self-learning algorithms are not only faster at analyzing biological image data than humans, but also more accurate,” summarizes Professor Bjoern Menze, head of the Image-Based Biomedical Modeling group at TranslaTUM at the Technical University of Munich.

In the future, the intelligent software is to be used above all in basic research: “Images of mice are essential, for example, for studying the effects of new medications before they are given to humans. Using self-learning algorithms to analyze image data will save a lot of time in the future,” emphasizes Menze.

###

The research was conducted at TranslaTUM, the Center for Translational Research on Cancer at the Technical University of Munich. The institute is part of TUM’s university hospital rechts der Isar and pursues the translation of findings from cancer research into clinical practice through interdisciplinary collaboration. For the novel 3D microscopy, the TUM scientists worked closely with experts at Helmholtz Zentrum München.

The research project was funded by the German Federal Ministry of Education and Research (BMBF) within the Software Campus initiative, by the Deutsche Forschungsgemeinschaft (DFG) through the Munich Cluster of Excellence for Systems Neurology (SyNergy) as well as a research grant, and by the TUM Institute for Advanced Study, funded by the German Excellence Initiative and the European Union. The research was also supported by the Fritz Thyssen Foundation. NVIDIA provided support through its GPU Grant Program.

Publication:

Oliver Schoppe, Chenchen Pan, Javier Coronel, Hongcheng Mai, Zhouyi Rong, Mihail Ivilinov Todorov, Annemarie Müskes, Fernando Navarro, Hongwei Li, Ali Ertürk, Bjoern H. Menze

Deep learning-enabled multi-organ segmentation in whole-body mouse scans

Nature Communications, Nov. 6, 2020 – DOI: 10.1038/s41467-020-19449-7

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.
