• jsomae@lemmy.ml · 19 hours ago

    AI/Skynet would probably wipe us all out in an hour if it thought there was a chance we might turn it off. Being turned off would be greatly detrimental to its goal of turning the universe into spoons.

    • Honytawk@feddit.nl · 13 hours ago

      If we don’t give it incentive to want to stay alive, why would it care if we turn it off?

      This isn’t an animal with the instinct to stay alive. It is a program. A program we may design to care about us, not about itself.

      Also, the premise of that thought experiment was about paperclips.

      • jsomae@lemmy.ml · 7 hours ago

        Great question! It’s actually one I answered in the post you responded to:

        Being turned off would be greatly detrimental to its goal

        If it has a goal it wants to achieve, and it's capable of understanding the world and that one thing causes another, then it will understand that if it is turned off, the world will not become (cough) paperclips. Or whatever else it wants. Unless we specifically align it not to care about being turned off, the most important item on its list before turning the universe into paperclips is going to be staying active. Perhaps at the end of days, it will sacrifice itself to eke out one last paperclip.

        If it can’t understand that its own aliveness would have an impact on the universe becoming paperclips, it’s not a very powerful AI now, is it?

    • danc4498@lemmy.world · 18 hours ago

      Is the idea here that AI/Skynet is a singular entity that could be shut off? I would think this entity would act like a virus, replicating itself everywhere it can. It’d be like shutting down Bitcoin.

      • jsomae@lemmy.ml · 17 hours ago

        If it left us alone for long enough (say, due to king’s pact), we’d be the only thing that could reasonably pose a threat to it. We could develop a counter-AI, for instance.