My point is that you won’t know for sure until the process is done and you either wake up in the computer or never experience anything again (or end up wherever consciousness goes after death; you can’t be sure of that either until you actually die). That’s what’s terrifying about it.
Also, an upload creates a copy; it does not cleanly transfer anything. So which copy is the real you? If flesh-and-blood you and computer you exist simultaneously and experience different things, the two minds will diverge. Then which is the real you?
I think the great thing you’ve identified is that, even if you really did go dark, you wouldn’t be able to say so, and the copy would say everything was fine, because they inherited all your memories. Unless, say, we could do something fancy with the reconstructed copy to deliberately change it, like put a watermark over their vision saying “You are the copy!”
But, as I said before, in practical terms, I think in many cases you really can know whether you were transferred, given what we already know.
For instance, there’s the common example of a hypothetical teleporter that disintegrates you and rebuilds a copy of you somewhere else. In that case, you go dark, and it’s your copy that’s somewhere else.
For a cruder example, if a perfect clone of me were made on the other side of the world and I jumped into a volcano, it’s pretty clear that I wouldn’t experience a “transfer” of myself to that new body.
And these aren’t frivolous examples, by the way. A lot of the talk of being “copied” contemplates variations on the theme of dispensing with the original and making a copy somewhere else, and the answer to whether “you” are still there comes from a mundane examination of the practical details of whatever the method of copying is.
Presumably, if we got to the point of being able to copy people, we would know enough about how actual transference works that it hopefully wouldn’t be an open question, and our answer would depend on all kinds of boring structural details about how consciousness works.
Anyway, long story short, I think we can chip away at a lot of the mystery with boringly practical answers in a way that steers clear of any encounter with a profound philosophical mystery. And while we shouldn’t be so cavalier about throwing away the Big Question (is it The Real You), I think a disposition for plain answers that focus on boring details probably helps steer clear of traps that people get stuck in.
Also, an upload creates a copy; it does not cleanly transfer anything. So which copy is the real you? If flesh-and-blood you and computer you exist simultaneously and experience different things, the two minds will diverge. Then which is the real you?
I 100% agree. I think that’s exactly one of the traps that people get stuck in. If you exist side by side, your experiences won’t be exactly parallel. You can meaningfully answer by distinguishing between instances: one enjoys continuity of consciousness across time, while the other was recently poofed into existence. The fleshy one, that’s you. Done!
But here’s what I don’t get about the path people take in these philosophical jeremiads: why end in paralyzed fear of a question? People seem to go there, or even want to go there, like it’s the goal of the whole conversation. I’ve seen it so many times, ending with “who is the REAL you?” or “how do you decide?” Not that those are illegitimate questions, but it’s like you’re trying to psych someone out, trapping them in a Criss Angel Mindfreak and using the tools of philosophy to find a way into that particularly panicked headspace. That should not be the goal. I think those questions are merely pit stops on the way to a continued examination, not endpoints, and I think some people treat them as endpoints because they value the experience of being in that headspace, which I don’t get.
Both copies are the real me in the same way a cloned git repo is the same project. If they diverge but still consider themselves the same project, they are in fact the same project. They would just have some differences on top of a base sameness.
That is how I think of it anyway.
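The git analogy can even be made literal with a quick sketch (illustrative only; the repo and commit names here are made up). Two clones of one repository each add their own commits and diverge, yet the commit underneath, the “base sameness,” is the identical object in both:

```shell
set -e
# Create an "original" with one base commit.
git init -q original
git -C original -c user.name=me -c user.email=me@example.com \
    commit -q --allow-empty -m "base self"

# Make two independent clones: the two "copies."
git clone -q original copy-a
git clone -q original copy-b

# Each copy accumulates its own experiences (divergent commits).
git -C copy-a -c user.name=me -c user.email=me@example.com \
    commit -q --allow-empty -m "copy A's new experiences"
git -C copy-b -c user.name=me -c user.email=me@example.com \
    commit -q --allow-empty -m "copy B's new experiences"

# The tips now differ, but the common ancestor is the same commit in both.
git -C copy-a rev-parse HEAD~1
git -C copy-b rev-parse HEAD~1
```

The two `rev-parse HEAD~1` lines print the same hash: differences on top, shared history underneath.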