• Kyrgizion@lemmy.world · +80/-2 · 6 months ago

    I think SOMA made it pretty clear we’re never uploading jack shit. At best we’re making a copy who will feel as if they’ve been uploaded, while the original remains behind as well.

    • Dasnap@lemmy.world · +44 · 6 months ago

      A lot of people don’t realize that a ‘cut & paste’ is actually a ‘copy & delete’.

      And guess what ‘deleting’ is in a consciousness upload?
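
      A rough sketch of that point in Python, using a hypothetical cut_and_paste helper (this is roughly what file moves fall back to when a simple rename isn’t possible):

```python
import os
import shutil

def cut_and_paste(src: str, dst: str) -> None:
    """A 'move' done the way the comment describes: copy first, then delete."""
    shutil.copy2(src, dst)  # the "paste" is a full copy of the original's bytes
    os.remove(src)          # the "cut" is just deleting the original afterwards
```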

      • pixeltree@lemmy.blahaj.zone · +14/-1 · 6 months ago

        I mean, if I die instantaneously and painlessly, and consciousness is seemingly continuous for the surviving copy, why would I care?

        My consciousness might not continue, but I lose consciousness every day. Someone exists who is me and lives their (my) life. I totally understand people’s aversion to death, but I also don’t see any difference from falling asleep and waking up. You lose consciousness, then a person who has lived your life and is you regains consciousness. Idk

        • TopRamenBinLaden@sh.itjust.works · +11 · 6 months ago

          You make a good point. For all we know, we’re all being copied and deleted in our sleep every night.

          There’d be no way to know anything even happened to you as long as your memory was copied over to the new address with the rest of you. It would be just a gap in time to us, like a dreamless sleep.

          • Demdaru@lemmy.world · +2 · 6 months ago

            Old post, but…if it’s just memory, you’d lose trauma and other ingrained coping mechanisms, no? There’s no brain to try and fight back against things. Just memories making you…you…? Or not you, if you lose some of your behaviors?

          • pixeltree@lemmy.blahaj.zone · +8 · 6 months ago

            Yeah, and I completely understand that. Just from a logical perspective, though: let’s say the process happens after you fall asleep normally at night. If you can’t tell it happened, does it matter? I’ve been really desensitized to the idea of dying through suicidal ideation for most of my life (much better now), so I’m able to look at it without the usual emotional aversion. If teleportation existed via this same method, I don’t think I’d have qualms about at least trying it. I certainly wouldn’t expect other people to, but to me it doesn’t seem like that big a deal. I wouldn’t do a mind-upload scenario, but more because of a complete lack of trust in system maintenance and security, and doubt that true consciousness can be achieved digitally. If it’s flesh and blood to flesh and blood, though? I’d definitely try it.

    • TheYang@lemmy.world · +25 · 6 months ago

      I wonder how you could ever “upload” a consciousness without Ship-of-Theseusing a brain.

      Cyberpunk 2077 also has this “upload vs. copy” issue, but doesn’t actually make you think about it too hard.

      • KazuyaDarklight@lemmy.world · +8 · 6 months ago

        That’s more or less what I’ve always thought: to have a chance, you would need a method where mental processing starts out shared between both, then transfers more and more to the inorganic platform until it’s at 100% and the organic side isn’t doing any work anymore.

        • Schmoo@slrpnk.net · +6 · 6 months ago

          The animated series Pantheon has a scene depicting exactly this, and it’s one of the most disturbing things I’ve ever seen.

          Edit: Here is the scene in question. It’s explained he has to be awake during the procedure because the remaining parts of his brain need to continue functioning in tandem with the parts that have already been scanned.

          • KazuyaDarklight@lemmy.world · +4 · 6 months ago

            Interesting, but I would argue that’s still a destructive copy process. “Old Man’s War” did a good job of what I’m talking about. It was body to clone body, but the principle was similar: at the halfway point the person was experiencing existence in both bodies at once, seeing each body from the perspective of the other, until the transfer completed and they were in the new body while the old one slumped over.

            • Schmoo@slrpnk.net · +3 · 6 months ago

              That also reminds me of this scene from Invincible, where during the copying process their experiences are sort of “blended,” letting them see from both bodies at once; only here they both live and are separate afterwards.

              Edit: is it obvious how much of a sci-fi geek I am lol

      • rwhitisissle@lemmy.ml · +3 · 6 months ago

        You would have to functionally duplicate the exact structure of the brain or its consciousness while having the duplication mechanism destroy the thing it was reading at almost exactly the same time. And even then, that’s not really solving the issue.

        • AEsheron@lemmy.world · +3 · 6 months ago

          I don’t see an issue with that: a prolonged brain surgery that meticulously replaces each part with a mechanical equivalent, in sequence. You could probably remain conscious the whole time.

          • rwhitisissle@lemmy.ml · +1 · 6 months ago

            Yeah, but it’s still a Ship of Theseus problem. If you have a ship and replace every single board or plank with a different one, piece by piece, is it still the same ship or a completely different one, albeit an exact replica of the original? It matters because of philosophical ideas around the existence of the soul, the authenticity of the individual, and a bunch of other thought-experimenty stuff.

            • AEsheron@lemmy.world · +5 · 6 months ago

              I think so long as you maintain consciousness, that issue is fairly moot in this particular circumstance. There’s a lot of tolerance for changes in thought while remaining the same self; see many brain-damage victims. So long as there is minimal change in personality, plenty of other circumstances make a stronger case for “killing one person and replacing them with a new one” due to a change of consciousness. Imo, most people wouldn’t consider a brain-damaged person to have been killed and replaced by a new consciousness, or someone with a drug addiction and radically altered brain chemistry, etc.

        • bufalo1973@lemmy.ml · +1 · 6 months ago

          Not necessarily. Imagine you begin suffering from Alzheimer’s, and artificial neurons are making a copy of your brain. Once a neuron stops working, its backup replaces it. Your mind, if it worked, could treat the new neuron as part of the same brain and work with it seamlessly.

      • someacnt_@lemmy.world · +3 · 6 months ago

        Yeah, like replacing individual brain cells with more durable mechanisms. Idk, maybe they would be cellular as well. …That makes me wonder: maybe it’s possible to transfer consciousness even with traditional biological mechanisms?

      • dev_null@lemmy.ml · +5 · 6 months ago

        I was just annoyed at the protagonist for expecting anything else. The exact same thing already happened to him twice (the initial copy at the beginning of the game, then the move to the other suit), plus it’s reinforced in the found notes for good measure. So by the ending, the player knows exactly what’s going to happen, and so should the protagonist, but somehow he’s surprised.

        • Azzk1kr@feddit.nl · +3 · 6 months ago

          Yeah, true. But Catherine said it perfectly at the end, something like “You still don’t get it? What did you expect?” The fact that one of his consciousnesses remains down in the abyss was kind of frightening. All by himself.

          • dev_null@lemmy.ml · +4 · 6 months ago

            Two, actually. The one from before the suit change is also left there, and Catherine said he would wake up in a day or two. Maybe they can meet up.

    • highsight@programming.dev · +7/-1 · 6 months ago

      Ahh, but here’s the question. Who are you? The you who did the upload, or the you that got uploaded, retaining the memories of everything you did before the upload? Go on, flip that coin.

      • Kyrgizion@lemmy.world · +12 · 6 months ago

        If you are the version doing the upload, you’re staying behind. The other “you” pops into existence feeling as if THEY are the original, so from their perspective, it’s as if they won the coin flip.

        But the original CANNOT win that coinflip…

          • Kyrgizion@lemmy.world · +9 · 6 months ago

            I can’t speak for anyone else, but I would. The knowledge that “A” me is out there, somewhere, safe and sound, is uplifting, but it’s still quite chilling to realize you are staying wherever the hell you are. At least we die after enough time has passed because our bodies decay.

            Spoiler:

            The SOMA protagonist wasn’t that lucky…

            • 📛Maven@lemmy.sdf.org · +4 · 6 months ago

              Is it chilling? I was already going to stay where I am, whether I made a copy or not. Sharding off a replica to go on for me would be strictly better than not doing that.

            • a lil bee 🐝@lemmy.world · +3 · 6 months ago

              I think it’s both for me, which I think is what you might be saying as well. I would absolutely push the button to create the copy, or whatever, because I think I would derive satisfaction from creating a life (identical to mine, no less) that was free of the circumstance I was in, which must have been dire. However, I definitely don’t consider that instance “me,” even if I do consider the copy a legitimate, separate version of “me.” So I don’t feel that I have perpetuated my own instance, which leaves me in whatever fight-or-flight terror I was in that caused the scenario in the first place.

            • dev_null@lemmy.ml · +1 · 6 months ago

              What do you mean he wasn’t so lucky? After all, he lived out his life in Toronto. The fact that he did a brain scan at some point in his life doesn’t matter. Sucks for the robot who thought he was him.

  • Digital Mark@lemmy.ml · +31/-1 · 6 months ago

    It’s still a surviving working copy. “I” go away and reboot every time I fall asleep.

    • jkrtn@lemmy.ml · +6 · 6 months ago

      Why would you want a simulation version? You’d get saved at “well rested.” It would be an infinite loop of being put to work for several hours and then deleted. You won’t even experience that much; your consciousness is gone.

  • Schmoo@slrpnk.net · +22 · 6 months ago

    If anyone’s interested in a hard sci-fi show about uploading consciousness they should watch the animated series Pantheon. Not only does the technology feel realistic, but the way it’s created and used by big tech companies is uncomfortably real.

    The show got kinda screwed over on advertising and fell into obscurity because of streaming-service fuck-ups and region locking, and I can’t help but wonder if that’s at least partially because of its harsh criticism of the tech industry.

  • Daxtron2@startrek.website · +21 · 6 months ago

    Well yeah, if you passed a reference, then once the original is destroyed it would be null. The real trick is to make a copy and destroy the original reference at the same time; that way it never knows it wasn’t the original.
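
    A rough sketch of that in Python terms, with a made-up “mind” dict and deepcopy standing in for the upload:

```python
import copy

original = {"name": "Simon", "memories": ["Toronto", "the scan"]}

# Assigning "upload = original" would only create a second reference to the
# same object, not a new mind. The trick described above: make a real copy
# and drop the original binding in the same statement, so the copy never
# coexists with a surviving original that could notice the difference.
upload, original = copy.deepcopy(original), None

print(upload["name"])  # the copy carries on, convinced it's the original
print(original)        # None: the original reference is gone
```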

  • Nobody@lemmy.world · +16/-2 · 6 months ago

    You see, with Effective Altruism, we’ll destroy the world around us to serve a small cadre of ubermensch tech bros, who will then somehow in the next few centuries go into space and put supercomputers on other planets that run simulations of people. You might actually be in one of those simulations right now, so be grateful.

    We are very smart and not just reckless, over-indulged douchebags who jerk off to the smell of our own farts.

  • waigl@lemmy.world · +13/-1 · 6 months ago

    In a language that has exceptions, there is no good reason to return bool here…
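
    For illustration, a hedged sketch of the contrast with a hypothetical upload routine (none of these names come from the original post):

```python
class UploadError(Exception):
    """Raised when the hypothetical copy step fails."""

def upload_returns_bool(subject: str) -> bool:
    copied = False  # pretend the scan/copy step failed
    return copied   # the caller is free to ignore this

def upload(subject: str) -> None:
    copied = False  # same pretend failure
    if not copied:
        # failure can't be silently dropped the way an unchecked bool can
        raise UploadError(f"copy of {subject!r} failed; original left in place")

upload_returns_bool("subject-42")  # failure goes unnoticed
try:
    upload("subject-42")
except UploadError as err:
    print(err)                     # failure must be handled, or it crashes loudly
```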

  • Nate@programming.dev · +8/-1 · 6 months ago

    A value is trying to be set on a copy of a slice from a DataFrame. Try using .loc[row_indexer,col_indexer] = value instead. See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/indexing.html#indexing-view-versus-copy
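
    For anyone who hasn’t hit that warning, a minimal reproduction with a toy DataFrame (not from the post):

```python
import pandas as pd

df = pd.DataFrame({"host": ["meat", "meat", "robot"], "awake": [True, False, True]})

# Chained indexing like the commented line below may write to a temporary
# copy of the slice, so the assignment can silently have no effect on df
# (which is what the SettingWithCopyWarning is about):
# df[df["host"] == "meat"]["awake"] = False

# The recommended form addresses the original frame in a single .loc call:
df.loc[df["host"] == "meat", "awake"] = False
print(df)
```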

  • EmoDuck@sh.itjust.works · +5 · 6 months ago

    The Closest-Continuer schema is a theory of identity according to which identity through time is a function of appropriate weighted dimensions. A at time 1 and B at time 2 are the same just in case B is the closest continuer of A, according to a metric determined by continuity of the appropriate weighted dimensions.

    Lonk

    I don’t think that I fully agree with it but it’s interesting to think about

  • Clent@lemmy.world · +4 · 6 months ago

    It would be easier to record than to upload, since uploading requires at least a decode step. Given the fleeting nature of existence, how does one confirm the decoding? It also requires that we create a simulated brain, which seems more difficult and resource-intensive than forming a new biological brain remotely connected to your nervous system’s inputs.

    Recording all inputs in real time and playing them back across a blank nervous system would create an active copy. The inputs can be saved so they can be replayed later in case of clone failure. As long as the inputs are recorded until the moment of death, the copy will be you minus the death, so you wouldn’t be aware you’re a copy. Attach it to a fresh body and off you go.

    The failure mode would take your literal lifetime to re-form your consciousness, but what’s a couple of decades to an immortal?

    We already have the program to create new brains; it’s in our DNA. A true senior developer knows better than to try to replicate black-box code that’s been executing fine. We don’t understand consciousness well enough to pretend we’re going to add new features, so why waste the effort building a parallel system for a black box?

    Scheduled reboots of a black-box system are common practice. Why pretend we’re capable of skipping steps?