by Mark R. Waser (originally appeared April 24, 2013 at Transhumanity.Net)
George Dvorsky posted You’ll Probably Never Upload Your Mind Into A Computer in the “DEBUNKERY” section of io9 stating “Here are eight reasons why your brain may never be digitized.” The first problem, of course, is that mind and brain are two different things – as different as a software program and the computer it is running on. The second problem is that he then pulls in yet more confusion with “consciousness uploads” and “transfer our conscious thoughts to a computer”.
Ben Goertzel then weighed in with Goertzel Contra Dvorsky on Mind Uploading. Ben states that he prefers Randal Koene’s phrase “substrate independent minds” as less sexy but less misleading – but then goes on to state that “the point is the liberation of the mind from any particular substrate.”
Here’s *my* take.
First off, substrate is a red herring. The true issue is configuration and input/output. If you simulate my brain hardware accurately enough, the mind/consciousness program will run the same. But, as usual, the devil is in the details of exactly what “accurately enough” is—and it is certainly far, *far* more than just simulating synapses.
There is also the difference between *transferring* your mind/consciousness/program and creating a copy of your mind/consciousness/program. Some people find the distinction irrelevant. My hindbrain says that it is critical. But, in any case, the scene described in John Scalzi’s Old Man’s War – where the protagonist’s soul is basically transferred, leaving his old body an unmotivated but still-living shell – just isn’t going to happen.
Ben concludes by saying that “the arguments Dvorsky raises against mind uploading basically boil down to: Many people are confused about the philosophy of mind, and they argue about it a lot….” I argue that Ben doesn’t clear up the confusion either. So here is my (short) take on each of Dvorsky’s eight reasons.
1. Brain functions are not computable
How close is close enough? Brain functions will eventually be computable to any reasonable, arbitrarily fine level of accuracy.
2. We’ll never solve the hard problem of consciousness
There is no “hard problem of consciousness”. As I’ve explained previously, we are an experiencing system and a modeling/explaining system. The modeling/explaining system is a small, wholly contained subset of the experiencing system. Complexity/information science clearly states that it is IMPOSSIBLE for a subset to explain the whole that contains it. We will always experience more than we can model/explain. This will be true of any system that models itself.
3. We’ll never solve the binding problem
Again, how close is close enough? It’s easy enough to argue that the human brain hasn’t solved the binding problem either – it just has a good enough approximation.
4. Panpsychism is true
First, you have to define what consciousness is. My claim is that it is the experience of a system modeling itself while experiencing. This requires self-modifying feedback loops. Self-modifying feedback loops do not exist everywhere, so panpsychism is false. Any matter could be turned into, or made part of, a self-modifying feedback system – but that potential is not what panpsychism claims.
5. Mind-body dualism is true
FAIL! Descartes’ student Elisabeth of Bohemia drove a stake through dualism’s heart that Descartes could never answer. Why is this canard still bruited about?
6. It would be unethical to develop
First you have to define what ethics are. I normally use social psychologist Jonathan Haidt’s functional definition that morality is what reduces selfishness and allows us to live together cooperatively. I don’t see any fundamental conflict between that function and uploading.
7. We can never be sure it works
First, you have to define success. For the most common definition of “transfer”, you are NEVER going to transfer consciousness. You *will* be able to create copies of consciousness. And once we are describing copying, the question again becomes: how close is close enough?
8. Uploaded minds would be vulnerable to hacking and abuse
Yes . . . just as human minds are now . . . and copied minds may well have better methods of self-defense, given sufficient additional knowledge of their surroundings and themselves.
Or, in other words – Yes, eventually we could upload (copy) our minds – BUT I would expect non-human intelligence long before then, and the vast majority of our minds probably aren’t worth copying (and the issues are nowhere near as convoluted as some people would like them to be).