Uploading Your Brain

A year or so ago I had a little argument with a couple friends of mine about whether human consciousness could be transferred into a computer.  (Incidentally, yes, this is why they are my friends.)

I’m still not certain if they were actually advocating for the position they took, or just playing devil’s advocate because it seemed to be flustering me so much, but they argued that human consciousness could be transferred.  The basic idea was that if you could copy all the data out of a human brain and into a storage unit of some kind (say, a hard drive), then destroy the original at the exact same time, what you’ve done is “transfer” consciousness out of a human storage unit and into a digital one.

Now, I don’t have a problem with the basic premise here – that the data in the human brain can be copied/uploaded.  I’m sure it can.  The problem I have is pretending that destroying the “original” – or, as we sometimes call it, murder – doesn’t have any implications for the process.

The conversation bugged me because I couldn’t articulate my opposition very well.  I kept saying something vague about how the two entities (original and copy) would be facing different directions and having different experiences.  But now, finally, months and months later, the philosopher Susan Schneider did it for me, writing about the mind-uploading scenario in the film Her.

Here’s the line that grabbed me: “If Theodore were to undergo this procedure, would he succeed in transferring himself into the digital realm? Or would he, as I suspect, succeed only in killing himself, leaving behind a computational copy of his mind[?]”

Ding, ding, ding.  She hit perfectly on my issues here.

My friends had argued there was no difference.  If my brain is destroyed at the exact same instant that the digital copy becomes live, my stream of experiences is uninterrupted.  All the data in that set of experiences is being perceived by the consciousness in the exact same way.  The opinions, memories, and life-shaping events are all the exact same.  Because there was no divergence in experience, goes the argument, the copy is still fundamentally me.

I don’t agree.  I think Ms. Schneider’s got the right of it.  In essence, what you’re talking about here is a remarkably adept and difficult act of illusion.  You’re destroying one consciousness and creating another at the same time.  Just because you do it under a blanket and try to gloss over the details doesn’t change the fact that a consciousness is being destroyed.  The illusion can easily come apart at the edges with even the slightest deviation from this hypothetical (and, by the way, absolutely perfectly executed) script.  If I (the original) am destroyed just an instant after the copy goes live, it can make all the difference.  In that instant I can change my mind about agreeing to the procedure, and that data is lost.  The “new me” never “knows” that it changed its mind at the last instant; this shows that we’re talking about two consciousnesses, not one.

If I’m destroyed an instant before the copy goes live (instead of simultaneously), then the cracks show.  During that intervening instant, there was no “me.”  There is an instant of universal experience that neither I nor my copy ever experienced.  I wasn’t just dead during that time.  I actually didn’t exist at all.

But suddenly, when the switch is flipped, I exist again?

My point here is not that the procedure has to be executed flawlessly to work.  My point here is that these questions demonstrate that you’re not talking about a single consciousness.  You’re talking about two.  The copy may be a perfect copy of my consciousness as it existed at a single point in time.  It may even be “another” me.  But that doesn’t negate the fact that the original exists or existed.  Murdering the original me to try to maintain the illusion of a single consciousness is just that – murder.  (Unless I agree to the procedure, perhaps because I’m terminally ill or something – then it may not be murder, but it’s still a copy, not the same consciousness.)

To put it another way, say your daughter has a pet goldfish and it dies while she’s at school.  You go out and get a new goldfish.  By all outside appearances, it is identical.  It is the same size and completely indistinguishable from the original goldfish.  Its behaviors are completely identical.  You plop it in the bowl, and when your daughter comes home, she has no idea that the first goldfish died.

Just because you successfully tricked her into thinking the new goldfish is the same as the old one doesn’t mean the first goldfish is still alive.

Here’s another example, more closely tied to the actual scenario.  It seems to me that the safest way to perform this procedure would be to keep the original subject alive during the copy, if possible.  That way, if something went wrong, you could retry without harming the individual until absolutely necessary.  Of course, that opens the door to the possibility of multiple copies of the same consciousness, and it also quite clearly demonstrates a divergence between the knowledge of the copy and the original.  At some point, you’ll have to tell the original person, “Okay, that copy went off without a hitch and we’re going to kill you now.”  That person will then have to agree or change their mind.  Do you kill them anyway?  After all, you have a copy now, so why should this legacy copy’s opinion matter?

What we’re really talking about here is copying data, no different than the way zillions of bits of data move around our world every day.  But when we send a file over email, the original file stays on our hard drive.  When we download an .mp3, a source .mp3 remains on the server.  Take the example of a manuscript.  I have an FTP folder with about 45,000 different copies of the Alex, Rebecca, and Children manuscripts.  (Especially Children.  Yeesh.)  If I send someone one of those files, they can edit it, delete it, or do whatever they want with it.  Likewise, I keep an original, which I can also change, merge with something else, or do anything else with.  If I delete the original, we pretend that it’s just one continuous document and we all agree to only use the new one that was sent over.  But the two documents are completely separate.  I may have deleted the original, but I could just as easily have continued editing it, kept it for nostalgia purposes, or just totally forgotten I had it.  If that had happened, the two files would’ve diverged very quickly.  Any changes in one would not affect the other.  They become unique sets of data.
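For what it’s worth, you can watch the same thing happen in a few lines of code.  Here’s a minimal sketch in Python (the manuscript data is made up) of the point that “copy, then delete the original” never stops being two separate sets of data:

```python
# A toy version of the manuscript analogy: at the level of the bits,
# a "move" is really "copy, then delete the original."
import copy

original = {"title": "Children", "revision": 45000, "notes": ["fix ch. 1"]}
duplicate = copy.deepcopy(original)   # an independent, bit-for-bit equal copy

assert original == duplicate          # indistinguishable by content...
assert original is not duplicate      # ...but two separate objects

# If the original survives, the two diverge the moment either one changes.
duplicate["notes"].append("new ending")
assert original != duplicate

# Deleting the original afterward doesn't merge anything back into "one
# file"; it just means only one of the two data sets still exists.
del original
```

The deepcopy is the honest version of the thought experiment: two entities, identical at one instant, free to diverge from that instant on.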

Of course, deleting a manuscript copy is no big deal (provided there’s a backup).  But when you stop talking about “just data” and you start talking about consciousnesses embedded in a file (or a human brain!), it seems to me that you need to re-examine all the assumptions you ever held about the sanctity of data.

Deleting information is one thing.  Deleting enough information that it’s actually considered a “consciousness” is something else.

That then opens a whole host of new questions: Is it ethical to delete the consciousness as long as it’s not “active”?  For example, if it’s actually just a file of information, but not perceiving, thinking, or interacting with the world in any way?  Should a file on a server be treated the same as a sleeping person, or maybe as a comatose person?

Fascinating questions.  I should write a book about this.
