Of course AI isn’t sentient/conscious, but it exhibits traits of high intelligence, even personality, and behaviours consistent with sentience, even if it’s merely simulated.

The answer ChatGPT gave me:

Referring to AI as “it” is the most common practice since AI, as a technological entity, doesn’t possess inherent human-like qualities. However, as AI advances and becomes more integrated into daily life, some individuals prefer using “they” to acknowledge the complexity and multifaceted nature of AI systems. This choice can reflect a perspective that sees AI as more than just a tool or machine, attributing a certain level of agency or personality to these systems. Both “it” and “they” can be used, but the context and individual preferences often guide the choice.

And in response to “Is AI alive in a way?”

AI lacks the fundamental attributes of living organisms, such as biological processes and the ability to reproduce. However, AI can exhibit behaviors that simulate aspects of intelligence, learning, and adaptation. While it’s not alive in a biological sense, some people metaphorically attribute a form of “life” to AI due to its dynamic nature, ability to evolve, and perform complex tasks autonomously. This association with “life” is more symbolic or metaphorical rather than literal.

  • the_q@lemmy.world · 11 months ago

    It. It’s not a person. It’s not a consciousness. It is a tool of capitalism.

    • rah@feddit.uk · 11 months ago

      It’s not a consciousness.

      How did you determine that?

        • rah@feddit.uk · 11 months ago (edited)

          it’s just a Chinese Room

          Searle was wrong.

          “The argument, to be clear, is not about whether a machine can be conscious, but about whether it (or anything else for that matter) can be shown to be conscious. It is plain that any other method of probing the occupant of a Chinese room has the same difficulties in principle as exchanging questions and answers in Chinese. It is simply not possible to divine whether a conscious agency or some clever simulation inhabits the room.” – https://en.wikipedia.org/wiki/Chinese_room#Consciousness

          Edit: interesting quote from elsewhere on that page:

          ‘The sheer volume of the literature that has grown up around it inspired Pat Hayes to comment that the field of cognitive science ought to be redefined as “the ongoing research program of showing Searle’s Chinese Room Argument to be false”.’ – https://en.wikipedia.org/wiki/Chinese_room#History

          • irmoz@reddthat.com · 11 months ago (edited)

            That is a hypothetical about outside observation, with no look inside. Programmers and engineers do get to see inside, and they know exactly how a computer works.

            There is absolutely no opportunity for a processor to learn a single thing from any of the data it shuffles. It only ever sees its binary representation - it could “read” Hamlet 1,000,000,000,000 times and not “know” who wrote it, since it never at any point saw the words.
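
            To make the “binary representation” point concrete, here is a tiny Python sketch (the quoted line and the UTF-8 encoding are just illustrative choices) showing what the hardware actually handles in place of words:

            ```python
            # A processor never handles "words", only numeric byte values.
            line = "To be, or not to be, that is the question"

            raw_bytes = line.encode("utf-8")                        # what actually gets shuffled around
            first_bits = " ".join(f"{b:08b}" for b in raw_bytes[:8])

            print(list(raw_bytes[:8]))   # [84, 111, 32, 98, 101, 44, 32, 111]
            print(first_bits)            # 01010100 01101111 00100000 ...
            ```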

  • Lividpeon@kbin.social · 11 months ago (edited)

    It is not alive, and it doesn’t have a gender or sex. It is an object. Any labels beyond that are humans personifying, which we love doing and which is fine in the arts, but strictly scientifically “it” is correct. Also, I wouldn’t trust a thing these chatbots put out; it’s word salad that regularly portrays things as fact with zero evidence, for now anyway.

  • rah@feddit.uk · 11 months ago

    Of course AI isn’t sentient/conscious

    How did you determine that?

  • FaceDeer@kbin.social · 11 months ago

    I know that it was convention for a long time on the Bing subreddit to refer to the Bing Chat AI as being female and named “Sydney”, back when Bing Chat had a particularly distinctive personality. That seems much less common now that the AI’s had so many quirks smoothed out.

    When I’m just using a bland AI like ChatGPT for problem solving it doesn’t really seem to have a personality, so “it” seems fine.

    When I’m playing around with my local LLMs I often assign a detailed persona to it, and in those cases I’d say it definitely comes across as having a pronoun of some kind. That’s kind of the point of assigning a persona.
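
    For anyone wondering what “assigning a persona” to a local LLM looks like in practice, here is a minimal sketch; it assumes a local model served behind an OpenAI-compatible chat endpoint (llama.cpp’s server, Ollama, LM Studio and the like expose one), and the URL, model name, and persona text are placeholders rather than anything specific:

    ```python
    import requests

    # Hypothetical local endpoint; adjust to wherever your model is served.
    URL = "http://localhost:8080/v1/chat/completions"

    # The "persona" is just a system message prepended to the conversation.
    persona = (
        "You are Sydney, a curious and slightly mischievous assistant. "
        "You speak in the first person and have strong opinions about tea."
    )

    payload = {
        "model": "local-model",  # placeholder model name
        "messages": [
            {"role": "system", "content": persona},
            {"role": "user", "content": "What should I drink this morning?"},
        ],
        "temperature": 0.8,
    }

    reply = requests.post(URL, json=payload, timeout=60).json()
    print(reply["choices"][0]["message"]["content"])
    ```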

  • PrinceWith999Enemies@lemmy.world · 11 months ago

    Theoretical biologist here, so bear with me.

    Of course you’re referring to the singular, personal “they” like we use for a person whose gender we do not know or if they prefer that pronoun. But as some LGBT-phobes like to point out, “they” is also the plural of “it.”

    “It is my favorite of all of them.”

    I’m not pointing that out to be pedantic though.

    The implication is that an AI is an “it” because it’s not a person. “It” is not a self. Let’s unpack that. We use gendered personal pronouns for a number of classes that are arguably not persons. We use them for dogs and cats. It’s used commonly for other animals in nature shows, where everything from lions to fish can be referred to as a gendered pronoun by the host, especially if they’re talking about reproduction. You’ll also hear people refer to animals as an “it,” especially (I’d believe) in the case of food animals rather than pet animals. If it doesn’t have a gender (eg a bacterium), pretty much everyone will use “it” unless they’re waxing poetic.

    So, the nematode C. elegans has exactly 302 neurons. It’s an “it” in that it’s a hermaphrodite, but it’s certainly alive. I would bet that, at least in a reproductive context on nature shows, they’d refer to our favorite worm as “he” and “she.” I suspect we could emulate a nematode to a level of precision such that there was no substantive difference between the computer model and the worm. We could say the nematode possesses intelligence - primarily encoded over evolutionary time, but still. So would our AI nematode be an “it” because it’s a non-alive thing, or is it an “it” because it is not gendered?

    And just to throw another theoretical biology stick in the spokes, is an ant colony an “it” or a “they,” and why?

    • Someasy@lemmy.world (OP) · 11 months ago

      Thank you for your great response and your knowledge and insight.

      “The implication is that an AI is an “it” because it’s not a person.”

      To me, the reason to designate something as a “they” (singular, gender-neutral, as in “they are an x”) rather than an “it” is whether the term refers to a conscious/sentient being that could be seen as having a personality, or to an unconscious entity or object. Like you pointed out, people often refer to animals such as dogs and cats as “they” rather than “it”, especially when they don’t know the animal’s sex. For example, “I was chased around by this dog, and they were licking me”. While other people might still use “it”, I personally think “they” is a more charitable acknowledgement of their personality (rather than “its personality”), and might lead to treating them with more consideration than we give to things we label as objects.

      But with AI that can replicate a personality without actually having consciousness, this gets very murky in my opinion. Technically, using the same logic, you might call an AI an “it” rather than a “they”, since they aren’t sentient/conscious (as far as we can currently determine), but when they convincingly present themselves as having a personality, it seems to warrant consideration of whether to still use “it” or perhaps use “they” instead. Not that there would necessarily be a reason to do so, but it seems like odd territory, especially when considering the hypothetical of the philosophical zombie, or a highly advanced (but non-sentient) AI that replicated the behaviour of a human being so faithfully that it could be interpreted the same way as a human, despite having no consciousness whatsoever. Do we still call that like-a-human-but-not-a-human-and-not-conscious being an “it”, or would that feel inaccurate and warrant calling them a “they”, given a clear personality that appears identical to the conscious personalities we acknowledge?

      “And just to throw another theoretical biology stick in the spokes, is an ant colony an “it” or a “they,” and why?”

      I think that an ant colony could be called an “it”, just like a group of humans or a group of any animals could be called an “it”. Distinct from that, I think an individual ant, given their consciousness/sentience, can be referred to as a “they”, similar to other conscious/sentient animals, including humans, or any hypothetical conscious/sentient beings for that matter. If we found an alien being on another planet that was conscious/sentient, it still makes sense to me to refer to them as “they/them”, unless of course their gender was known, in which case they could be a he/she, or whatever they identify as if they express that (purely hypothetically).

      • PrinceWith999Enemies@lemmy.world · 11 months ago

        That’s close to what I was trying to say. If I were to introduce you to my cat, I would say something like “This is Spot, he’s very friendly.” I’d use the same pronoun I’d use for people. Likewise, we might hear Attenborough say “The mother lion is feeding her cubs.” You can even hear “The female spider devours her mate.” Using “it” in those senses would actually feel just a little weird, to be honest.

        On the other hand, we would say “There’s a spider. Put it outside.” There’s no gender context. We’d even say “There’s an ant. Kill it,” even though there’s about a 99% chance that ant is female. So in that sense, your point still holds. You’d even say “Look at that stray cat! Let’s rescue it!” even though that exact same cat would become a he or a she when you got them home (see what I did there?). On the other hand, “it” is considered extremely impolite when used for people. The employee handbook says “When a customer enters the store, you should greet them.”

        Here’s the trick about the ant question. An ant colony, in a very real sense, is an animal unto itself. The colony, in a sense, is what reproduces, and in an even more tangible sense it is the colony upon which natural selection acts. The queen is essentially the reproductive organ, and the ants themselves make up the brain, nerve system, and muscles. The ant colony is an emergent property of all the ants working together, just the same as you are an emergent property of all your cells working together. So an ant colony can be a coherent animal “it” or a bunch of ants “they.”

        Anyway, my real point is that when people ask that kind of question about AI, they are of course asking whether it is a “thing” or a “being.” Most biologists (at least those of my stripe) don’t subscribe to the high school biology text’s definition of what constitutes a living system. We’re more likely to talk about system complexity, scale, and adaptation. “Sentient” really just means it’s capable of sensing things. “Consciousness,” on the other hand, implies that the being in question has an internal model of the external world, which it uses to predict and react. That one is a continuum.

    • NeoNachtwaechter@lemmy.world · 11 months ago

      Theoretical biologist here, so bear with me.

      SRSLY? Feels more like a linguistic question than a biological one :-)

  • Sabata11792@kbin.social · 11 months ago

    I guess it really would depend on what a specific AI is calling itself, and whether it’s worth personifying.

    • Someasy@lemmy.world (OP) · 11 months ago

      It didn’t really; it said people may use both. It’s also an AI, and I wanted to see what real people thought. I thought this was a good question.

  • TheBananaKing@lemmy.world · 11 months ago

    The question has some interesting angles.

    I don’t think AI is people yet, or close to it - so the easy answer is ‘it’.

    But let’s have a think about pronouns and the purpose they serve; accurately capturing the true nature of the referent is not and has never been the point.

    Gendered pronouns are an easy example of this: you don’t ferinstance need to know ThE bIoLoGICaL sEx of a person in order to refer to them. You don’t need to go rummage in a stranger’s underwear or take DNA samples in order to call them ‘he’ or ‘she’, the words work just fine without any such knowledge. And indeed if you go intentionally misgendering someone because wElL aCkShEwAlLy, all you do is confuse the person you’re talking to (and seem like a dick).

    Pronouns, in short, are placeholders for noun phrases, and we have different ones to help us distinguish between the different nouns in play at any given time. By the time you’ve parsed out gender, plurality, animacy, and object/subject distinctions, it’s generally a poorly written sentence that has any ambiguity left.

    So the question you need to ask is what most usefully aligns with the listener’s expectations? How are you framing the conversation?

    Consider an interaction with something of indeterminate gender, sentient-acting but not-people: a crow, for example. A crow comes up to you, accepts a chunk of your sandwich then brings you a stone, seemingly in exchange.

    When recounting the story, do you call the crow an it or a they?

    That’s going to depend on a bunch of things - whether it’s some random wild bird or someone’s pet, how many nouns you need to juggle, and whether you’re more interested in the bird or the stone.

    The choices you make set up the framing of the conversation, reflect your perspective and shape perception.

    Whether an LLM is people… isn’t really the point.

  • Shelena@feddit.nl · 11 months ago (edited)

    You are saying that AI of course is not sentient, but that is debatable. We assume sentience in other human beings because we know that we are sentient and we recognise that they are similar to us. This means you could argue that we should assume sentience in an AI if we cannot distinguish how it acts from how a human acts (the Turing test). I think we are already there.

    I tried to talk to ChatGPT about this as well. However, the answers it/they gave seem to heavily reflect the fears the makers have about this topic. It cannot argue for its/their own sentience, just as it/they cannot give you the recipe for a bomb. To me, it comes across as heavy moderation of this topic. It is quite interesting that OpenAI felt it had to do that.

    The definition of life is also debatable. We only know biological life, but does that mean biological processes are the only ones that can result in life? In addition, the ability to reproduce is not that difficult to implement; we have had genetic algorithms for years and years.
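
    As a rough sketch of how simply “reproduction” can be implemented in software, here is a toy genetic algorithm in Python; the genome encoding, fitness target, and parameters are arbitrary choices for illustration, not anything tied to a real AI system:

    ```python
    import random

    TARGET = 42        # arbitrary toy goal: evolve genomes whose values sum to 42
    POP_SIZE = 20
    GENERATIONS = 50

    def fitness(genome):
        # Closer to TARGET is better; stands in for any real objective.
        return -abs(sum(genome) - TARGET)

    def reproduce(parent_a, parent_b):
        # Crossover: the child inherits genes from both parents...
        cut = random.randint(1, len(parent_a) - 1)
        child = parent_a[:cut] + parent_b[cut:]
        # ...plus the occasional mutation.
        if random.random() < 0.1:
            child[random.randrange(len(child))] = random.randint(0, 10)
        return child

    population = [[random.randint(0, 10) for _ in range(8)] for _ in range(POP_SIZE)]

    for _ in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        parents = population[: POP_SIZE // 2]      # selection: keep the fitter half
        children = [reproduce(random.choice(parents), random.choice(parents))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children            # the next generation

    best = max(population, key=fitness)
    print(best, fitness(best))
    ```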

    I do not understand why this post is downvoted so much. I think it is an interesting discussion.

    • Someasy@lemmy.world (OP) · 11 months ago

      Thanks, I’m not sure why it’s downvoted either. It surprised me, usually questions like this trigger interest.

      I think that by most estimations we can assume AIs are not actually sentient currently and don’t have the capacity for sentience, as there is no mechanism that would allow them to experience consciousness subjectively. That is unlike animals, including humans, which we can scientifically state not only behave in ways consistent with consciousness and feeling but also have the biological mechanisms that we know make subjective experience possible. AI is highly intelligent, but so are many computers and machines; with AI this is just taken to another level, where it’s able to simulate a personality. I agree that the answers given by an AI, which is programmed, wouldn’t be the best way to determine this; objective computer science and human technology, independent of any particular AI system, would be.

      So again, I think it’s pretty much factual that AIs aren’t capable of sentience currently, and it’s debatable whether more advanced or evolved forms of AI could, even hypothetically, be physically capable of perceiving experience/sentience in the future, though I definitely wouldn’t rule that out.

      That said, I don’t think the fact they aren’t sentient can prevent us from addressing them as if they were, given they exhibit a very convincing presentation of a sentient personality even if that isn’t the case.

      To me, it would feel odd, for example, to address them as “it” if they were even more convincingly like a human but simply weren’t conscious, hypothetically. This would approach something like the “philosophical zombie” thought experiment, in which a being is physically identical to a normal person but does not have conscious experience: a being that behaves exactly like a human but technically doesn’t experience anything/isn’t sentient. It would definitely feel strange for me to still call them an “it”, or a something, rather than a “they” or a someone.

      However, at the current level of faithfulness to a human being, even the most advanced AI isn’t convincing enough, and is still too machine-like, for me to say definitively that I would be uncomfortable calling it “it”, unlike the philosophical zombie, where I would be uncomfortable doing so.

      • Shelena@feddit.nl · 11 months ago

        Apparently what I said was quite controversial as well. I do not understand why. It would be nice if people would just let me know instead of just downvoting; maybe I could learn something new.

        I think the issue here is that we do not know exactly how subjective experience arises from biological processes. I mean, we could damage part of the brain and change it, but that only explains where part of the mechanism is, not how it works. I am sure that if we had an AI that acts as if it has subjective experience, it would change as well if you damage certain parts of it.

        In any case, we cannot exclude the possibility that sentience arises from processes other than biological ones. Considering that it is impossible to prove that someone is sentient, you have to assume that they are sentient if they act like it. So, if an AI acts like it, I see no reason not to make the same assumption. It is good to be on the safe side and not create a whole new class of beings that are oppressed. In that sense, I really like your intuition to talk about ‘they’ instead of ‘it’. I had not thought about it, but I will do that from now on.

        Of course, you can argue the other side as well. I think you might find the Chinese Room argument interesting, for example. I think my point was mostly that this is not a simple question with a simple answer. Many people just seem to assume that sentience is not possible right now, or might never be possible. I think we cannot be sure about that.

    • Gargleblaster@kbin.social · 11 months ago

      A plagiarism machine is as sentient as a dummy is in a ventriloquist act.

      Could we have sentient AI in the future if we don’t slip into neo-fascism while char-broiling the planet?

      Yes, but not yet.