This afternoon, I saw a few posts on Twitter on the topic of mind uploading, particularly via a link to this blog post by Athena Andreadis. Not sure why a 2011 post appeared on Twitter today, but it got me thinking. As a neuroscientist and SFF writer, I wanted to give my expert(?) opinion.
First off, do go read Athena’s post — it’s very well-written and reasoned. But one part of it concludes that (to paraphrase) “moving the brain into another container is intrinsically impossible,” and I disagree.
Here’s the key paragraph:
> Recall that a particular mind is an emergent property (an artifact, if you prefer the term) of its specific brain – nothing more, but also nothing less. Unless the transfer of a mind retains the brain, there will be no continuity of consciousness. Regardless of what the post-transfer identity may think, the original mind with its associated brain and body will still die – and be aware of the death process. Furthermore, the newly minted person/ality will start diverging from the original the moment it gains consciousness. This is an excellent way to leave a clone-like descendant, but not to become immortal.
Absolutely true. Mind uploading is a vastly harder concept than 1970s computer scientists could have possibly imagined. Consciousness is a property of what the brain is doing. (Most likely for the evolutionary goals of error-checking and causality-determining, but that’s another blog post.) If you somehow copied a mind into a computer, it’d be just that: a copy. This has plenty of interesting implications, but it’s not really a consciousness transfer, and it’s certainly not going to make you immortal — at least, not the current “you.”
Let’s now discuss the possible: in situ replacement [of brain cells].
Here’s the fun part. Do brain cells[1] regrow during adult life? Generally no, except in a few narrow parts of the brain; outside those, there’s essentially no turnover in our brain cells. But I think that a brain could function if the cells did get replaced. After all, we lose brain cells throughout adulthood, yet that doesn’t impinge on our identity. The surviving cells form new connections, new ways of accomplishing the same things. Some people with strokes or other localized brain damage can relearn the lost skills — for instance, my aunt lost her ability to speak due to a stroke in her 40s, but over the following couple of years she relearned it. She finds herself at a loss for words more often than most people, but you’d never know she’d had the stroke. The dead cells did not grow back; the existing ones learned to pick up the slack. What if we put an artificial chunk of brain in there instead, over where the old stuff died?
We could do this at the single-cell level too. What if we could use microsurgery to replace a damaged nerve cell? Put in an artificial cell that has the same connections, responds to inputs in the same ways, and so on.[2] It’s not a natural neuron, but its functional role is the same: it does the same things. (We are nowhere near the tech level to build such artificial cells, let alone install them, but it’s certainly possible.) Any hiccups during the surgery matter less than you’d think: the new cell doesn’t have to exactly copy the current state of the old one. The networks will learn to incorporate that replacement cell, and make it a part of the ever-changing symphony of activity in the brain.[3]
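To make “its functional role is the same” concrete, here’s a toy sketch. This is entirely my own illustration, and real neurons are vastly more complicated than threshold units; the point is only that a replacement cell is specified by its observable input-output behavior, not by what it’s made of:

```python
# Toy model only: real neurons are not simple threshold units.

def biological_cell(inputs, weights, threshold=1.0):
    """Fires (returns 1) when the weighted input sum crosses a threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

def artificial_cell(inputs, weights, threshold=1.0):
    """Different internals, but the same input-output behavior."""
    total = 0.0
    for w, x in zip(weights, inputs):
        total += w * x
    return int(total >= threshold)

# The rest of the network only sees spikes in and spikes out, so the two
# implementations are indistinguishable from the outside:
weights = [0.6, 0.5, -0.3]
for inputs in [(1, 1, 0), (1, 0, 1), (0, 1, 1), (1, 1, 1)]:
    assert biological_cell(inputs, weights) == artificial_cell(inputs, weights)
```

The arithmetic is trivial on purpose: as far as the cells downstream are concerned, the replacement is defined entirely by its behavior at the connections.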
Replace a cell or a small part of the brain. Then another. And another.[4] Each replacement part is functionally equivalent to the old one, remember. Eventually, you get a whole brain made out of the new components. Wait, what are the new components made out of? It doesn’t matter. Maybe they’re culture-grown artificial neurons. Maybe they’re 24th-century biopolymers. Maybe they’re silicon.
And now that mind is running on an artificial device, without any interruption in conscious experience.[5]
As Athena notes, you certainly don’t want to cut the brain off from sensory and motor experience. The brain evolved to help us act adaptively, after all; a brain in a jar is a nonfunctional brain. But that just means you should put your computerized brain in an android body, or in a vat-grown human body, or — if you really want to call it “uploading” — in a software simulation of human sensory and motor feedback. These are all beyond modern science, but probably easier than getting the brain into silicon in the first place.
Therefore: while the “hardware”[6] of the brain is critical to our consciousness, it may be possible to replace that hardware with an electronic equivalent.
In a couple centuries, anyways!
1. I’m sticking with the term “brain cells” because both neurons and non-neuronal cells (e.g. glia) play a role in brain functioning.
2. This is the weak point in my argument. Maneuvering a thing with cell-like properties in among cells without damaging or disrupting anything, and then hooking it up? It’s not like there’s empty space in there to move through.
3. Individual neurons change their roles all the time. This has been shown by neural recordings in nonhuman primates, but it’s also logically necessary: learning a new skill doesn’t overwrite an old one. Instead, over time we multiplex new networks and connections over the same (or fewer!) cells.
4. This could be a lot of surgeries. But maybe not. Speaking on pure intuition here: I bet that if I magically and instantly replaced 1% of your brain cells with new ones (properly hooked up, but with the wrong connection strengths), you wouldn’t notice, as long as the replacements were widely distributed across the brain. The important action happens at the network level, not the single-cell level.
5. At least not beyond what you’d get from anesthesia. But if we’re going science-fictional with implanting and attaching individual brain cells, we could even skip that too.
6. The “computer = brain” metaphor is a sloppy thing that I don’t generally endorse, but it’s sure useful in this sentence.
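The intuition that the important action happens at the network level, not the single-cell level, can be illustrated with a toy associative memory. This is a classic Hopfield network, my choice of demo rather than anything from the argument above, and it is emphatically not a brain model: store a few patterns across many connections, scramble 1% of the connection strengths, and recall still works.

```python
import numpy as np

# Toy demo of network-level robustness, not a brain model.
rng = np.random.default_rng(0)
n = 400                                      # "cells" in the network
patterns = rng.choice([-1, 1], size=(3, n))  # three stored "memories"

# Hebbian weights: each memory is spread across all connections.
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)

def recall(W, cue, steps=20):
    """Iteratively settle the network from a noisy cue."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

# "Replace" 1% of connections with wrong strengths, scattered at random.
W_damaged = W.copy()
mask = rng.random(W.shape) < 0.01
W_damaged[mask] = rng.normal(0.0, 1.0 / np.sqrt(n), size=mask.sum())

# Cue both networks with a memory corrupted in 10% of its cells.
cue = patterns[0].copy()
cue[rng.choice(n, size=40, replace=False)] *= -1

print(np.array_equal(recall(W, cue), patterns[0]))          # intact network
print(np.array_equal(recall(W_damaged, cue), patterns[0]))  # 1% "replaced"
```

Because every memory is distributed across thousands of connections, no single connection (or cell) is load-bearing; the collective dynamics absorb the scattered damage.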