The BRAIN Initiative: The race to understand the human brain
The History of Grand Challenges
How can we inspire great changes within ten years?
Countries around the world have a long history of investing in Grand Challenges. In the 1960s, JFK inspired Americans in a legendary speech to go to the moon within 10 years. In the 1980s, Japan's Ministry of International Trade and Industry launched the Fifth Generation Computer Systems project. Other notable examples include the DARPA Grand Challenge for autonomous vehicles and the Gates Foundation's Grand Challenges in Global Health.
Now, 50 years later, many countries are turning their focus inward: to the brain. The EU launched the 10-year Human Brain Project in 2013. China launched the 15-year China Brain Project in March 2016. Japan launched Brain/MINDS (Brain Mapping by Integrated Neurotechnologies for Disease Studies) in 2014.
In 2013, inspired by the success of the Human Genome Project, the White House announced another Grand Challenge, cleverly named the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative. This blog post dives into the US’s approach for advancing our understanding of the brain.
So, how do we tackle this Grand Challenge?
We know most people don't have time to read the 146-page BRAIN 2025: Scientific Vision report, so we've synthesized the most important highlights.
As humans, we know surprisingly little about how the circuits in the brain work and how the brain and nervous system respond to stimuli. We're still learning the basic structure of the brain and nervous system, how to analyze it, and even the words we use to describe what's happening.
Understanding the brain better can help us both enhance human potential and decrease the rate of brain disorders in our society. The BRAIN 2025 report aptly noted:
Brain disorders limit personal independence and place enormous demands on family and society…. The burden of brain disorders is enormous. All of us are touched, directly or indirectly, by the ravages of degenerative diseases like Alzheimer’s and Parkinson’s, thought disorders like schizophrenia, mood and anxiety disorders like depression and post‐traumatic stress disorder, and developmental disorders like autism spectrum disorders.
To accelerate technology development, the working group laid out seven priority research areas. The first three areas focus on identifying and characterizing the cell types in the nervous system and on the tools to record and manipulate those cells. The initiative seeks to improve maps and circuit diagrams of the brain and to build a dynamic view of how neural activity changes over time (both naturally and under stimulation).
The fourth area focuses on demonstrating causality: enabling researchers to link brain activity to behavior. A new generation of tools will likely assist with this process, including optogenetics and biochemical and electromagnetic modulation. [Editor's Note: Want to learn more about neuromodulation? Check out the NeurotechX Neuromodulation 101 blog post.]
The fifth area is to identify fundamental principles to create a foundation for understanding the biological basis of mental processes. The sixth area seeks to advance human neuroscience and develop technologies for human brains, rather than animal models. The final area — and the team’s self-defined most important outcome — is a “single, integrated science of cells, circuits, brain, and behavior.” And that’s why it’s a Grand Challenge.
So, what will this mean for me?
New treatments to repair and improve the brain will become available. We'll have a firmer understanding of how the brain works, and we'll likely have biological markers for conditions that have traditionally lacked them (e.g., TBI, PTSD, and depression).
We will have better neuroimaging capabilities, helpful both for understanding the brain and for the biofeedback-based technologies in our future (e.g., brain-computer interfaces).
New jobs will emerge. Tracking, recording and analyzing the brain's data will require scientists and experimentalists from a range of data theory and analysis fields (statistics, physics, mathematics, engineering, and computer science) as well as medical researchers and clinicians.
We will face uncomfortable ethical questions as we learn more about the brain and we are able to quantify and predict more behavior. The BRAIN Initiative laid out a few, and two have stayed with me over the past months:
- Can civil litigation involving damages for pain and suffering be informed by objective measurements of central pain states in the brain?
- Under what circumstances should mechanistic understanding of addiction and other neuropsychiatric disorders be used to judge accountability in our legal system?
In future pieces, we’ll dive more into these ethical considerations and ways to think about the future we are building.
How big is the investment and who is investing?
The White House proposed initial expenditures for fiscal year 2014 of approximately $110 million from the Defense Advanced Research Projects Agency (DARPA), the National Institutes of Health (NIH), and the National Science Foundation (NSF). Intelligence Advanced Research Projects Activity (IARPA) also joined the BRAIN Initiative by launching Machine Intelligence from Cortical Networks (MICrONS), which seeks to reverse-engineer the algorithms of the brain to revolutionize machine learning.
Over the 10-year life of the initiative, the proposed spending will be around $5 billion, ramping up to $500M in funding available each year (BRAIN 2025). These funds are split across neuroscience, neurotechnology and infrastructure investments.
Many of the cost estimates are optimistic, and the BRAIN Initiative notes that developing new stimulating or recording devices for humans may cost $100–200M per device for FDA approval. The Initiative views itself as the initial catalyst in this process, but seeks co-investment with other government agencies, private foundations and industry.
Many venture capitalists are already investing in neurotechnologies, and this new funding source will likely be a boon for new technologies.
How was the initiative received by policy makers?
In December 2016, Congress enacted the 21st Century Cures Act, which allocated funding to the Precision Medicine Initiative, the Cancer Moonshot Initiative and the BRAIN Initiative, and put policies in place to encourage the use of EMRs and modernize the FDA. In a rare instance of bipartisan support, the Cures Act was well received by both Democrats and Republicans and is not expected to be at risk under the new administration.
Although the 21st Century Cures Act received wide bipartisan support, its funding must be appropriated every year, as must the main BRAIN budget, putting both at risk given the Trump Administration's proposal for substantial NIH budget cuts (~18%).
What is the BRAIN Initiative missing?
More broadly, criticism of the initiative stems from the tension of prioritizing translational research at the expense of clinical research. John Markowitz, a professor of clinical psychiatry at Columbia and a research psychiatrist at the New York State Psychiatric Institute, wrote a widely discussed Op-Ed in the New York Times titled "There's Such a Thing as Too Much Neuroscience."
In his piece, Markowitz outlines how the focus on neuroscience, translational research and the search for underlying biomarkers or "neurosignatures" can take too long to bring into clinical practice. An alternative approach would be to focus on developing therapies and interventions that we have started to see work, even if we don't yet know why they work. Markowitz argues that although we need both types of research, we should continue to invest in clinical research backed by empirical evidence.
Society will benefit if we are able to commercialize products based on promising empirical evidence, which may otherwise be lost if funding is disproportionately allocated towards neuroscience research.
This criticism is not new to the BRAIN Initiative, and in the BRAIN 2025 report, the team broadly defined two types of desired technology development to better balance research and clinical tools:
(1) Research tools that allow us to better investigate brain structure and function, and (2) clinical tools that enable us to better diagnose, prevent, treat, and cure brain diseases, including technologies that can restore lost functions.
One of the best ways to look into the future is to peek at the funding sources today. This funding incentivizes the type of research and commercialization we'll see over the next 10 years. The working team started distributing BRAIN Initiative investments last year, in 2016.
If you check out the official NIH website for the BRAIN Initiative, under the Funding tab you can find funded awards and select a category (e.g., Next Generation Human Invasive Devices) to get a sense of what NIH is funding. You can also get more information on each grant from the public NIH RePORTER site. BrainInitiative.org is a (slightly fancier) private website put together by the BRAIN Initiative Alliance that lists funding opportunities.
And the investments are already having an effect. Harvard Medical School is using the funding to develop a brain implant that could restore vision to the blind.
If the physical brain is the "hardware" and its electrical circuits the "software," David Cox's Lab is using the funds to develop software algorithms that model and predict brain behavior.
The funding is incentivizing changes in industry, too. Kernel, a start-up in California, is building advanced neural interfaces to treat disease. Kernel started by developing brain implants and has recently refocused on developing new technologies to record neurons, capitalizing on BRAIN Initiative funding.
We're only a few years in, with 9 more to go. What will we have at the end of it? Unlike earlier Grand Challenges with relatively clear endpoints (e.g., "put a man on the moon"), the BRAIN Initiative's goals are more broadly distributed. Would it have been better to be more targeted? Is it good to stimulate the research community and see what evolves?
Time will tell.