Tracking decisions in the brain

IMAGE: Stanford neuroscientists and engineers used neural implants to monitor decisions in the brain, in real time.

Credit: Gil Costa

When deciding whether or not to read this article, you may change your mind several times. While your final choice will be obvious to an observer – you keep scrolling and reading, or you click over to another article – any internal deliberations you had along the way will most likely be known to no one but you. That hidden deliberation is the focus of research, published Jan. 20 in Nature, by Stanford University researchers who examined how such mental deliberation is reflected in brain activity.

These scientists and engineers developed a system that read and decoded the activity of brain cells in monkeys while the animals were asked to identify whether an animation of moving dots was drifting slightly left or right. The system tracked the monkeys' ongoing decision-making process in real time, capturing its ebb and flow along the way.

“I was just watching the decoded activity on the screen, not knowing which way the dots were moving or what the monkey was doing, and I could tell Sania [Fong], the lab manager, ‘It’s going to make the right choice,’ seconds before the monkey began the movement to express that choice,” recalled Diogo Peixoto, a former postdoctoral scholar in neurobiology and co-lead author of the paper. “I would get it right 80 to 90 percent of the time, and that really showed that this was working.”

In subsequent experiments, the researchers were even able to influence the monkeys' final decisions through subliminal manipulations of the dot motion.

“Basically, much of our cognition is due to ongoing neural activity that is not explicitly expressed in behavior, so what is exciting about this research is that we’ve shown that we can now identify and interpret some of those hidden internal brain states,” said study senior author William Newsome, the Harman Family Provost Professor in the Department of Neurobiology at Stanford University School of Medicine.

“We are opening a window into a world of cognition that has been opaque to science until now,” said Newsome, who is also the Vincent V.C. Woo Director of the Wu Tsai Neurosciences Institute.

One decision at a time

Neuroscience studies of decision-making have generally relied on averaging the activity of large numbers of brain cells over hundreds of trials. But that approach glosses over the complexity of a single decision and the fact that every instance of a decision is slightly different: the various factors that influence whether you choose to read this article today differ from the ones that would influence the same decision tomorrow.

“Cognition is very complex and, when you average over a bunch of trials, you miss important details about how we come to our perceptions and how we make our choices,” said Jessica Verhein, an MD/PhD student in neuroscience and co-author of the paper.

For these experiments, the monkeys were fitted with a neural implant about the size of a pinky fingernail that reported the activity of 100 to 200 individual neurons every 10 milliseconds as the animals watched digital dots dance across a screen. The researchers placed this implant in the dorsal premotor cortex and the primary motor cortex because, in previous research, they had found that neural signals from these brain areas convey the animals’ upcoming decisions and their confidence in those decisions.

Each video of moving dots was unique and lasted less than two seconds, and the monkeys reported their decisions about whether the dots were moving right or left only when prompted – a correct answer given at the right time earned a juice reward. The monkeys signaled their choices clearly, with presses of a right or left button.

Inside the monkeys’ brains, however, the decision-making process was less obvious. Neurons communicate through rapid bursts of electrical signals, which occur amid a flurry of other activity in the brain. Still, Peixoto was able to readily predict the monkeys’ choices, in part because the activity measurements he saw had first been fed through signal-processing and decoding pipelines built on years of work by the laboratory of Krishna Shenoy, the Hong Seh and Vivian W. M. Lim Professor in the School of Engineering and professor, by courtesy, of neurobiology and of bioengineering, and a Howard Hughes Medical Institute investigator.

The Shenoy team had been using this real-time neural decoding approach for other purposes. “We’re always trying to help people with paralysis by reading out their intentions. For example, they can think about how they want to move their arms, and then that intention is run through the decoder to move a computer cursor on a screen to type out messages,” said Shenoy, who is a co-author of the paper. “So we constantly measure neural activity, decode it millisecond by millisecond, and then act on that information accordingly.”

In this particular study, instead of predicting an imminent arm movement, the researchers wanted to predict the intended choice that would later be reported by an arm movement – which required a new algorithm. Inspired by the work of Roozbeh Kiani, a former postdoctoral scholar in the Newsome laboratory, Peixoto and colleagues developed an algorithm that takes the spiking signals from groups of neurons in the dorsal premotor cortex and the primary motor cortex and translates them into a “decision variable.” This variable describes the activity that takes place in the brain before a movement decision.
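The article does not spell out the underlying math, but the basic idea of turning population spiking into a single decision variable can be pictured as a weighted, smoothed readout of binned spike counts. The sketch below is a minimal illustration in Python: the 10-millisecond bins echo the recording rate described above, while the function name, weights, and smoothing constant are assumptions made for illustration, not the study's actual decoder.

```python
import numpy as np

def decision_variable(spike_counts, weights, smoothing=0.9):
    """Illustrative linear readout of a population "decision variable".

    spike_counts : array of shape (n_bins, n_neurons), spike counts per 10-ms bin
    weights      : array of shape (n_neurons,); positive weights favor "right",
                   negative weights favor "left" (assumed convention)
    smoothing    : exponential smoothing factor applied across time bins (assumed)

    Returns an array of shape (n_bins,) giving the decision variable over time.
    """
    dv = np.zeros(len(spike_counts))
    value = 0.0
    for t, counts in enumerate(spike_counts):
        evidence = float(counts @ weights)            # project population activity onto one axis
        value = smoothing * value + (1.0 - smoothing) * evidence
        dv[t] = value
    return dv
```

For a two-second trial recorded from 150 neurons at 10-millisecond resolution, `spike_counts` would be a 200 × 150 array, and the sign of the final entry of the returned series would stand in for the predicted left-or-right choice.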

“With this algorithm, we can decode the monkey’s ultimate decision well before it moves a finger, let alone its arm,” Peixoto said.

Three experiments

The researchers hypothesized that more positive values of the decision variable indicated greater confidence by the monkey that the dots were moving to the right, while more negative values indicated confidence that the dots were moving to the left. To test this hypothesis, they ran two experiments: one in which they halted the trial as soon as the variable hit a certain threshold, and another in which they halted it when the variable indicated a sharp reversal of the monkey’s decision.
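In pseudocode terms, those two stopping rules amount to watching the decision variable from the earlier sketch and halting the trial either when its magnitude crosses a confidence threshold or when its sign flips after a strong earlier commitment. The numeric thresholds and the helper below are illustrative assumptions, not values reported in the paper.

```python
def should_stop(dv_history, threshold=1.0, reversal_margin=0.5):
    """Illustrative stopping rules for the two experiments described above.

    dv_history : decision-variable values observed so far in the trial

    Returns "threshold" if the variable has crossed +/- threshold (first experiment),
    "reversal" if it swung from one strong extreme back past a margin in the other
    direction (second experiment), or None if the trial should continue.
    """
    current = dv_history[-1]

    # First experiment: halt as soon as the variable reaches a confidence threshold.
    if abs(current) >= threshold:
        return "threshold"

    # Second experiment: halt when the variable reverses sharply -- it previously
    # leaned strongly one way and has now swung past a margin in the other direction.
    peak = max(dv_history, key=abs)
    committed_before = abs(peak) >= threshold
    flipped = (peak > 0) != (current > 0)
    if committed_before and flipped and abs(current) >= reversal_margin:
        return "reversal"

    return None
```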

In the first experiment, the researchers stopped the trials at five randomly chosen points and, when the decision variable was strongly positive or negative, it predicted the monkey’s final decision with about 98 percent accuracy. Predictions in the second experiment, in which the monkey appeared to have changed its mind, were almost as accurate.

Prior to the third experiment, the researchers determined how many dots they could add to the display mid-trial before the monkey noticed the change in the stimulus. Then, during the experiment, they inserted dot pulses below that threshold of awareness to see whether they could subliminally sway the monkey’s decision. And, even though the new dots were quite subtle, they would sometimes tilt the monkey’s choice toward whichever direction they were moving. The effect of the new dots was stronger if they were inserted early in the trial and at moments when the monkey’s decision variable was low – indicating weak evidence.
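One way to picture the logic of that third experiment is as a simple check before each pulse: keep the added motion below the animal's awareness threshold, and expect the largest effect when the pulse comes early and the decision variable is still near zero. The threshold values, pulse parameters, and function below are illustrative assumptions meant only to show the flow, not the study's actual parameters.

```python
def plan_subliminal_pulse(dv_now, elapsed_ms, pulse_strength,
                          awareness_threshold=0.04, weak_dv=0.3):
    """Illustrative logic for the subliminal dot pulses in the third experiment.

    dv_now              : current value of the decision variable
    elapsed_ms          : time since the trial started, in milliseconds
    pulse_strength      : motion coherence of the extra dots to be added
    awareness_threshold : largest pulse the animal fails to notice (assumed value)
    weak_dv             : |decision variable| below which evidence counts as weak (assumed)

    Returns (subliminal, likely_effective): whether the pulse stays below the
    animal's awareness threshold, and whether trial conditions -- an early pulse
    delivered while the decision variable is still weak -- favor a measurable bias.
    """
    subliminal = pulse_strength <= awareness_threshold
    early_in_trial = elapsed_ms < 500          # trials last under two seconds
    weak_evidence = abs(dv_now) < weak_dv      # the monkey has not yet committed
    likely_effective = subliminal and early_in_trial and weak_evidence
    return subliminal, likely_effective
```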

“This last experiment, led by Jessie [Verhein], actually allowed us to test some prevailing models of decision-making,” Newsome said. According to one such model, humans and animals make decisions based on the cumulative sum of evidence over the course of a trial. But if that were the case, the bias the researchers introduced with the new dots should have had the same effect regardless of when it was introduced. Instead, the results seemed to support another model, which holds that if a subject already has strong confidence in a decision, or has spent too long deliberating, they are less likely to take new evidence into account.

New questions, new opportunities

Already, Shenoy’s laboratory is repeating these experiments with human participants with neurological disorders who use the same kind of neural implants. Because of differences between human and non-human primate brains, the results could be surprising.

Possible applications of this system beyond the study of decision-making include the study of visual attention, working memory or emotion. The researchers believe that their major technical advance – observing and interpreting hidden mental states through real-time recordings of brain activity – should be valuable for cognitive neuroscience in general, and they are excited to see how other researchers build on their work.

“The hope is that this research will capture the interest of some undergraduate or new graduate student, who will engage with these questions and carry the ball forward for the next 40 years,” Shenoy said.

###

Stanford co-authors include former postdoctoral scholars Roozbeh Kiani (now at New York University), Jonathan C. Kao (now at the University of California, Los Angeles) and Chand Chandrasekaran (now at Boston University); Paul Nuyujukian, assistant professor of bioengineering and of neurosurgery; former lab manager Sania Fong and researcher Julian Brown (now at UCSF); and Stephen I. Ryu, professor of electrical engineering (also head of neurosurgery at Palo Alto Medical Foundation). Newsome, Nuyujukian and Shenoy are also members of Stanford Bio-X and the Wu Tsai Neurosciences Institute.

This research was funded by the Champalimaud Foundation, Portugal; the Howard Hughes Medical Institute; the National Institutes of Health through the Stanford Medical Scientist Training Program; the Simons Foundation Collaboration on the Global Brain; a Pew Scholarship in the Biomedical Sciences; the National Institutes of Health (including a Director’s Pioneer Award); a McKnight Scholars Award; the National Science Foundation; the National Institute on Deafness and Other Communication Disorders; the National Institute of Neurological Disorders and Stroke; the Defense Advanced Research Projects Agency – Biological Technologies Office (NeuroFAST award); and the Office of Naval Research.
