As the AI market continues to balloon, experts are warning that its VC-driven rise is eerily similar to that of the dot-com bubble.

  • FaceDeer@kbin.social · 1 year ago

    “Artificial intelligence” is well-established technical jargon that’s been in use by researchers for decades. There are scientific journals named “Artificial Intelligence” that are older than I am.

    If the general public is so confused, they can come up with their own new name for it. Call them HALs or Skynets or whatever; then they can rightly say “ChatGPT is not a Skynet,” and maybe that will calm them down a little. Changing the name of the whole field of study is just not in the cards at this point.

    • pexavc@lemmy.world · 1 year ago

      Never really understood the gatekeeping around the phrase “AI”. At the end of the day, the study itself is difficult for the general public to understand. So shouldn’t we actually be happy that it’s a mainstream term? That it’s educating people about concepts they would otherwise ignore?

    • Orphie Baby@lemmy.world · 1 year ago

      If you haven’t noticed, the people we’re arguing with— including the pope and James Cameron— are people who think this generative pseudo-AI and a Terminator are the same thing. But they’re not even remotely similar, or remotely similar in capability. That’s the problem. If you want to call them both “AI”, that’s just semantics. But as far as pragmatics goes, generative AI is not intelligent in any capacity; calling it “AI” is one of the most confusion-causing things we’ve done in the last few decades, and it can eff off.

      • FaceDeer@kbin.social · 1 year ago

        The researchers who called it AI were not the ones who are the source of the confusion. They’ve been using that term for this kind of thing for more than half a century.

        I think what’s happening here is that people are panicking, realizing that this new innovation is a threat to their jobs and to the things they had previously been told were supposed to be a source of unique human pride. They’ve been told their whole lives that machines can’t replace that special spark of human creativity, or empathy, or whatever else they’ve convinced themselves is what makes them indispensable. So they’re reduced to arguing that it’s just a “stochastic parrot”, it’s not “intelligent”, not really. It’s just mimicking intelligence somehow.

        Frankly, it doesn’t matter what they call it. If they want to call it a “stochastic parrot” that’s just mindlessly predicting words, that will probably make them feel even worse when, a few years from now, that mindless stochastic parrot is doing their job, or has helped put out the most popular music, or has helped create the most popular TV show. But in the meantime it’s just kind of annoying how people demand that we stop using the term “artificial intelligence” for something that has been called that for decades by the people who actually create these things.
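        For what it’s worth, a “stochastic parrot” in the most literal sense is easy to demo. Here’s a toy bigram model — a minimal sketch of mindless next-word prediction, nothing like how an actual LLM works internally — that only ever samples the next word from word pairs it has already seen:

```python
import random
from collections import defaultdict

# A toy "stochastic parrot": it learns nothing except which word
# tends to follow which, then mindlessly samples from those counts.
corpus = "the cat sat on the mat and the cat ran".split()

follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)  # record every observed continuation

def parrot(start, length=5, seed=0):
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        choices = follows.get(out[-1])
        if not choices:  # dead end: this word was never followed by anything
            break
        out.append(rng.choice(choices))
    return " ".join(out)

print(parrot("the"))
```

        Every output is a chain of word pairs that already occur in the corpus — it can never say anything it hasn’t, in fragments, seen before.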

        Rather than give in to the ignorant panic-mongers, I think I’d rather push back a bit. Skynet is a kind of artificial intelligence. Not all artificial intelligences are Skynets. It should be a simple concept to grasp.

        • Orphie Baby@lemmy.world · 1 year ago

          You almost had a good argument until you started trying to tell us that it’s not just a parrot. It absolutely is a parrot. In order to have creativity, it needs to have knowledge. Not sapience, not consciousness, not even “intelligence” as we know it— just knowledge. But it doesn’t know anything. If it did, it wouldn’t put 7 fingers on a damn character. It doesn’t know that it’s looking at and creating fingers; they’re just fucking pixels to it. It saw pixel patterns, it created pixel patterns. It doesn’t have the context to know when the patterns don’t add up. You have to understand this.

          So in the end, it turns out that if you draw something unique and purposeful, with unique context and meaning— and that is preeeetty easy— then you’ll still have a drawing job. If you’re drawing the same thing everyone else already did a million times, AI may be able to do that. If it can figure out how to not add 7 fingers and three feet.

          • FaceDeer@kbin.social · 1 year ago

            As I said, call it a parrot if you want, denigrate its capabilities, really lean in to how dumb and mindless you think it is. That will just make things worse when it’s doing a better job than the humans who previously ran that call center you’re talking to for assistance with whatever, or when it’s got whatever sort of co-writer byline equivalent the studios end up developing to label AI participation on your favourite new TV show.

            How good are you at drawing hands? Hands are hard to draw, you know. And the latest AIs are actually getting pretty good at them.

            • ZagTheRaccoon@reddthat.com · 1 year ago

              It is accurate to call it a parrot in the context of it essentially being used as an ambiguated plagiarism machine to avoid paying workers.

              Yes, it is capable of that. Yes, that word means something else in the actual field. But you need to understand that people are talking about this technology in terms of its political relationship with power, and pretending that prioritizing that form of analysis is just people being uninformed about the REAL side, and that that’s their fault, is itself missing the point. This isn’t about pride and hurt feelings that a robot is doing something humans do. It’s about the fact that it’s a tool to undermine the entire value of the creative sector.

              And these big companies aren’t calling it AI because it’s an accurate descriptor. It could also be called a generative language model. They are calling it that because the common misunderstanding of the term is valuable to hype culture and VC investment. Like it or not, the average understanding of the term carries different weight than it does inside the field. And it turns the conversation into a pretty stupid one about sentience and humanity, as well as legitimizing the practice by arguing that it is fundamentally beyond the reach of the regulations we have on plagiarism, which it really isn’t.

              People who are trying to rebrand it aren’t doing it because they misunderstand the technical usage of the word AI. They are arguing that the terminology plays into the goals of our (hopefully shared) political enemies, who are trying to bulldoze through a technology that they think should get special privileges, by implying the technology is something it isn’t. This is about optics and social power, and the term “AI” is contributing to further public misunderstanding of how it actually works, which is something we should oppose.

              • FaceDeer@kbin.social · 1 year ago

                And these big companies aren’t calling it AI because it’s an accurate descriptor. It could also be called a generative language model.

                A generative language model is a kind of artificial intelligence. Similar to how a parrot is a kind of bird. They are calling it artificial intelligence because it is artificial intelligence, you’re the one who’s insisting on redefining a word that has been in use this way for many decades.

                ambiguated plagiarism machines

                That’s not how they work. Maybe learn a bit more about the field before telling the people working in it how to name things.

    • shy@reddthat.com · 1 year ago

      We should call them LLMAIs (la-mize, like llamas) to really specify what they are.

      And to their point, I think the ‘intelligence’ in the modern wave of AI is severely lacking. There is no reasoning or learning, just a brute-force fuzzy training pass that remains fixed at a specific point in time and only approximates what an intelligent actor would respond with by referencing massive amounts of “correct response” data. I’ve heard AGI bandied about as the thing people really meant when you said AI a few years ago, but I’m kind of hoping the AI term stops being watered down with this nonsense. ML is ML; it’s wrong to say that it’s a subset of AI when AI has its own separate connotations.
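      That “fixed training pass that only approximates correct responses” can be caricatured in a few lines — a frozen lookup table with fuzzy matching. This is purely illustrative (real models interpolate in a far richer way), but the frozen-at-a-point-in-time property is the same:

```python
import difflib

# Frozen "training data": prompt -> canned response pairs. Nothing here
# reasons or learns after this point; queries are just matched against it.
training_data = {
    "what is the capital of france": "Paris.",
    "who wrote hamlet": "William Shakespeare.",
    "what is two plus two": "Four.",
}

def respond(prompt: str) -> str:
    # Find the stored prompt most similar to the query and return its
    # recorded answer -- an approximation of a "correct response",
    # with no understanding of the question itself.
    match = difflib.get_close_matches(prompt.lower(), training_data,
                                      n=1, cutoff=0.0)
    return training_data[match[0]]

print(respond("What is the capital of France?"))  # "Paris."
```

      Ask it anything outside its data and it still confidently returns the nearest stored answer — it can’t know that it doesn’t know.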

      • FaceDeer@kbin.social · 1 year ago

        LLaMA models are already a common type of large language model.

        but I’m kind of hoping the AI term stops being watered down with this nonsense.

        I’m hoping people will stop mistaking AI for AGI and quit complaining about how it’s not doing what they imagined that they were promised it would do. I also want a pony.

        • shy@reddthat.com · 1 year ago

          You appear to have strong opinions on this, so it’s probably not worth arguing further, but I disagree with you completely. If people are mistaking it, that is because the term is being used improperly, as the very language of the two words does not apply. AGI didn’t even gain traction as a term until recently, when people who were actually working on strong AI had to figure out a way to keep communicating about what they were doing, because AI had lost all of its original meaning.

          Also, LLaMA is one of the LLMAIs, not a “common type” of LLM. Pretty much confirms you don’t know what you’re talking about here…

          • FaceDeer@kbin.social · 1 year ago

            Also, LLaMA is one of the LLMAIs, not a “common type” of LLM. Pretty much confirms you don’t know what you’re talking about here…

            Take a look around Hugging Face, LLaMA models are everywhere. They’re a very popular base model because they’re small and have open licenses.

            You’re complaining about ambiguous terminology, and your proposal is to use LLMAIs (pronounced like llamas) as the general term for the thing that LLaMAs (pronounced llamas) are? That’s not particularly useful.