• AgreeableLandscape@lemmy.ml

    The biggest problem is that “backing up” or “uploading” a brain to a computer is not “transferring” it. If you back up your brain and then your own brain dies, you won’t be in the computer, you’ll just be dead. A replica that thinks like you will be in the computer.

    • DrivingForce@lemmy.ml

      Sometimes I think society conceptualizes individuality and consciousness wrong. We are being recreated by quantum mechanics all the time. With that in mind, why do we consider ourselves one continuous person? What is so different about uploading to a computer? Is the you in the computer not the same you as the one in the flesh?

      • AgreeableLandscape@lemmy.ml

        Thing is, it’s impossible to know, until you actually try it, whether your consciousness will remain after such a process or whether you will die and a new consciousness that thinks like you will be created in your place. If you continue existing, then great, but if you are replaced by a replica, no outside observer has any way of knowing, because the replica will believe it existed prior to its creation.

        • DrivingForce@lemmy.ml

          If it has full memories of existing before its body was created, then is it you? I would think so. A perfect copy of me is just as much me as I am. A digital upload is just a less-than-perfect copy of the flesh substrate.

          • AgreeableLandscape@lemmy.ml

            My point is that you won’t know for sure until the process is done and you either wake up in the computer or never experience anything again (or end up wherever consciousness goes after death; you can’t be sure of that either until you actually die). That’s what’s terrifying about it.

            Also, an upload creates a copy; it does not cleanly transfer anything. So which copy is the real you? If the flesh-and-blood you and the computer you exist simultaneously and experience different things, the two minds will diverge. Then which is the real you?

            • DrivingForce@lemmy.ml

              Both copies are the real me in the same way a cloned git repo is the same project. If they diverge but still consider themselves the same project, they are in fact the same project. They would just have some differences on top of a base sameness.

              That is how I think of it anyway.
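
              To put the analogy in rough code terms, here’s a toy sketch in Python (the “memories” are made-up placeholders, of course, not a claim about how minds actually work):

              ```python
              import copy

              # A toy model of the analogy: a "mind" is just its accumulated memories.
              original = {"base_memories": ["childhood", "learned to code"], "new_memories": []}

              # The upload is an exact clone at the moment of copying, like `git clone`.
              upload = copy.deepcopy(original)

              # After the split, each instance adds its own history on top of the shared base.
              original["new_memories"].append("kept living in the flesh")
              upload["new_memories"].append("woke up on new hardware")

              # They diverge, but the base they both grew from is identical.
              assert original["base_memories"] == upload["base_memories"]
              assert original["new_memories"] != upload["new_memories"]
              ```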

            • abbenm@lemmy.ml

              I think the great thing you’ve identified is that, even if you really did go dark, you wouldn’t be able to say so, and the copy would say everything was fine, because it inherited all your memories. Unless, say, we could do something fancy with the reconstructed copy to change it deliberately, like putting a watermark over its vision that says “You are the copy!”

              But, as I said before, in practical terms I think you really can know in many cases whether you were transferred, given what we do know.

              For one common example, there’s the hypothetical teleporter that disintegrates you and rebuilds a copy of you somewhere else. In that case, you go dark, and it’s your copy that ends up on the other side.

              For a more vulgar example, if a perfect clone of me were made on the other side of the world and I jumped into a volcano, it’s pretty clear that I wouldn’t experience a “transfer” of myself to that new body.

              And these aren’t frivolous examples, by the way. A lot of the talk of being “copied” contemplates variations on the theme of dispensing with the original and making a copy somewhere else, and the answer to whether “you” are still there comes from a mundane examination of the practical details of whatever the copying method happens to be.

              Presumably, if we got to the point of being able to copy people, we would know enough about how actual transference works that it (hopefully) wouldn’t be an open question, and the answer would depend on all kinds of boring structural details about how consciousness works.

              Anyway, long story short, I think we can chip away at a lot of the mystery with boringly practical answers, in a way that never has to confront a profound philosophical puzzle. And while we shouldn’t be cavalier about throwing away the Big Question (is it The Real You?), I think a disposition for plain answers that focus on boring details probably helps steer clear of the traps people get stuck in.

              > Also, an upload creates a copy; it does not cleanly transfer anything. So which copy is the real you? If the flesh-and-blood you and the computer you exist simultaneously and experience different things, the two minds will diverge. Then which is the real you?

              I 100% agree. I think that’s exactly one of the traps people get stuck in. If you exist side by side, your experiences won’t be exactly parallel. You can at least give a meaningful answer by distinguishing between instances and noting that one enjoys continuity of consciousness across time, while the other was just recently poofed into existence. The fleshy one, that’s you. Done!

              But here’s what I don’t get about the path people take in these philosophical jeremiads. What’s with ending in paralyzed fear of a question? People seem to go there, or even want to go there, like it’s the goal of the whole conversation. I’ve seen this so many times: ending with “who is the REAL you?” or “how do you decide?” Not that those are illegitimate questions, but… it’s like you’re trying to psych someone out by entrapping them in a Criss Angel Mindfreak, using the tools of philosophy to find a way into that particularly panicked headspace. That should not be the goal. I think those questions are merely pit stops on the way to a continued examination, not endpoints, and I think some people treat them as endpoints because they value the experience of being in that headspace, which I don’t get.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml

      Whether that’s the case or not really depends on the process. For example, imagine if you could replace individual neurons with artificial ones while you were still conscious. You wouldn’t notice losing any single neuron, and if you replaced all of the neurons over time, then by the end of the process you would have transferred your consciousness to a new substrate. Obviously that isn’t a practical approach, but it shows that this is possible in principle.

      A more realistic option would be to integrate an artificial system into the corpus callosum. The brain is already split into two independent hemispheres that communicate over this channel. So you could have the new artificial hemisphere integrate with it and map out one of the original hemispheres to the point where it’s able to mirror it. Then you could swap out each part of the brain in turn for an artificial version.

      • abbenm@lemmy.ml

        You can, and this is a great point. However, (1) I think most of the time these hypotheticals get discussed, this is never contemplated or clarified at all, and generally people are just talking about copies rather than the transfer of a single continuous identity.

        And (2), if we’re really cracking this open, we may have to confront the idea that this kind of transfer is already happening all the time biologically, and accept that there isn’t a single entity preserved across time. And if we really, really, really dig, it’s not just about the physical substrate; it’s also about how much of yourself you lose to lost memories. I think we have to take the passage of time seriously and note that past instances of you are lost, in a sense. It’s like Derek Parfit’s quote about the glass tunnel, essentially.

        • ☆ Yσɠƚԋσʂ ☆@lemmy.ml

          Right, we’re constantly changing over time and we’re not the same person we were in the past. We have the notion of continuity and persistent identity over time, but it’s just a mental construct.

          Another interesting aspect is that we’re technically hive minds since each hemisphere can support a human consciousness all on its own. This is a great article on the subject.

          • abbenm@lemmy.ml

            You know, on re-reading my comment I thought I was digressing too much and was going to lose people. But you actually picked up the ball and advanced it farther down the field. Well done! I’m new to this idea that we are (or possibly are?) hive minds already, so I’ll take a look at the article.

            Also, I had a related thought: suppose, like @AgreeableLandscape@lemmy.ml, we’re concerned about whether or not we were “really” copied. In some far-off future where We Have The Technology, perhaps before fully “transferring” a person, you could wake them up in a state where they are simultaneously present in their own body and in whatever medium they are being copied to. Then, once satisfied that their stream of consciousness would be sustained without disconnection on the new medium, such a person could “approve” their complete transfer. Although it would be super trippy to experience having two bodies, or two streams of thought (or whatever), at the same time during the intermediate phase.

            • ☆ Yσɠƚԋσʂ ☆@lemmy.ml

              Right, I think the continuation of consciousness is key to knowing that you’re not a copy. If the process is done in such a way that you remain conscious throughout, then you know you’re a continuation of the same process. Learning to Be Me is a fun short story exploring this idea as well.

              It’s also worth considering that a separate physical body isn’t the only way a mind could be extended. For example, artificial parts of the brain could integrate with virtual reality or the internet in general. So it doesn’t even need to be a physical copy, just your mind expanding into new domains that were inaccessible before. The whole idea of having a single body could become obsolete. And it’s reasonable to imagine our minds could adapt to this, seeing how we experience something similar when we play video games: in many games we see our character as a third-person avatar we control, and the experience can feel quite immersive.

      • AgreeableLandscape@lemmy.ml

        I feel like brain backup technology is still in the realm of sci-fi right now, and it’s too early to say what can and can’t work, or what could silently kill you and create a replica in your place. We also have no idea how consciousness works.

        • ☆ Yσɠƚԋσʂ ☆@lemmy.ml

          We obviously don’t have the technology to do this right now, but we can still reason about it with thought experiments. While we don’t fully understand consciousness, that’s not the same as saying we have no idea at all about its nature. Fundamentally, any procedure in which you remain conscious throughout is not creating a replica, since there is no disruption in the conscious process.