Attempts to engineer the human brain have gained momentum and frequency over the last decade, and have attracted more than $3.5 billion in funding for the coming one.
(This post is intended to introduce the reader to a recent history of large-scale bioengineering projects concerning the human brain. I haven’t had time to delve into the philosophies and techniques of each one; maybe that’ll be another post for another day.)
One of the most complex entities in existence, if not the most complex, is the human brain. The convoluted folds of tissue resting in the cranium are responsible for a plethora of actions, thoughts, and a host of not-quite-understood in-betweens.
For want of a credible explanation, hypotheses of what really goes on in it, apart from a high-speed relay of electrochemical signals, have reached even into quantum physics, a probabilistic realm that doesn’t set much store by causes and effects.
Apart from the theories needed to unfold the pink mass on paper and explain it in reassuringly logical terms, there’s also been the issue of tools. And of late, the options available have diversified: gone are the days when we’d be trapped within purely empirical or purely functional perspectives.
Now, we have enough to bring the two together. Advanced computers let us both manipulate and modify models of the brain, helping us find what we seek in ways we understand, or at least find useful.
In 2011, there was (and is?) the DARPA-coordinated initiative SyNAPSE, on which IBM Research, HRL Labs, and Hewlett-Packard were working. After two rounds of federal funding from 2008 to 2011, a total of $42 million went to IBM and $34.5 million to HRL.
SyNAPSE is based on what’s called cognitive computing. The idea is to build a machine that mimics the mammalian brain in terms of the number of neurons and inter-neuron connections, i.e., synapses. By making the machine do many of the things a brain can, scientists will study what roles the machine’s various innards played and how they had to be reconfigured.
The whole setup is founded in data-storage and statistical analysis techniques.
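To give a sense of what “mimicking neurons and synapses” means in practice, here is a minimal sketch of a leaky integrate-and-fire neuron, the standard textbook abstraction used in many such simulations. This is an illustration only, not SyNAPSE’s actual circuitry or software, and all parameter values are made up for the example:

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron.

    input_current: a list of input values, one per time step (arbitrary units).
    Returns the indices of the time steps at which the neuron spiked.
    """
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # The membrane potential leaks back toward rest while
        # integrating the incoming current.
        v += (-(v - v_rest) + i_in) * (dt / tau)
        if v >= v_threshold:
            # Threshold crossed: record a spike and reset the potential.
            spikes.append(t)
            v = v_reset
    return spikes

# A constant supra-threshold input makes the neuron fire periodically;
# a weaker input never reaches threshold and produces no spikes.
print(simulate_lif([1.5] * 200))
print(simulate_lif([0.5] * 200))
```

A brain-scale simulator is, very roughly, billions of units like this wired together by weighted connections (the synapses), which is why the projects above lean so heavily on supercomputing and data-storage techniques.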
Late last year, the Semantic Pointer Architecture Unified Network (Spaun) was announced in Science. It was one of the first models of the brain to move away from the notion that brain simulators have to boast unbridled sophistication.
Instead, Spaun demonstrated logically simple (but structurally complex) behaviours - like doing arithmetic - with far fewer neurons (2.5 million, as opposed to the 86 billion in our heads). Scientists at the time deemed it a behavioural switchboard, useful for hypothesising about the brain.
Another project, Blue Brain, was founded in 2005 at the Ecole Polytechnique Federale de Lausanne, Switzerland, with an aim to study how the brain uses different kinds of infrastructure to perform various functions. The project was unique because it was one of the first to set up a biologically realistic model of neurons.
Blue Brain plans to deliver a cellular rat brain by 2014, with a cellular human brain scheduled for 2023. Its successor, the Human Brain Project (HBP), was announced in October 2012.
Earlier this year, in late-January, HBP was announced as one of the recipients of an EU technologies contest, and designated to receive around $645 million in funding through this decade ($70 million from the EC; the rest from states and other sources).
Its lead scientist is Henry Markram, the same guy who headed the Blue Brain project. Its goal, as Nature wrote, is “to aid medical development in brain disorders.” HBP will accomplish this by putting together a network of supercomputers – a network of supercomputers! – to virtually recreate the organ and then simulate its reactions to various drugs.
2005 was also the year when Eugene Izhikevich used microcircuitry to “create” a human-sized brain with 100 billion neurons and 1,000 trillion synapses. On his page, Izhikevich reported having observed alpha and gamma rhythms in the recreated organ.
And now, on February 17, Barack Obama announced an ambitious project - quite the parallel to the HBP - to study the human brain and map its activities, one expected to receive over $3 billion in funding. Going by his ‘State of the Union’ speech to Congress, the American government could be hoping to get out of this what it got out of funding the Human Genome Project: $3.8 billion invested between 1990 and 2003; $800 billion reaped between 2003 and 2010.
The details of the project haven’t fully emerged, but like the HBP, this project is also going to be a collaboration between government institutes - specifically, the NIH - private researchers, neuroscientists, and nanotechnologists.
Even at this stage, there are those for whom the idea is still preposterous. Despite the distance we’ve come in the last few decades, our understanding of the brain is still rudimentary, they claim. The skepticism, one can imagine, is well-meaning. The processing power of the brain is far, far beyond what some of the best computers can stake claim to today.
In fact, beyond studying its reactions to medication, we know very little of the brain.
Broadly, our approach to recreating the brain on a tabletop has been either top-down - using behaviours to understand mechanisms - or bottom-up - using mechanisms to understand behaviours - and both are only exploratory. This old article at io9 explains the cons of this difference well, but also articulates an optimistic time-frame for recreating the human brain: within this century...