  • Rose@slrpnk.net · 14 points · 4 hours ago

    > Working on it

    🤣, and I do not use this emoji lightly. See, Elon: your memes are cringe, so you should focus on jokes like these instead. Veritable knee-slappers. Suggesting you’re actually doing something personally? Hah, hilarious.

    • ByteOnBikes@discuss.online (OP) · 1 point · 10 minutes ago

      It’s trained on “Legacy media”, filled with things like data and facts. NEW media is based on thoughts, feelings, anti-wokeness, and flip-flopping between “Pedo Trump and Epstein were besties” and “Daddy Trump I’m sorry I’m sorry I’m sorry”.

  • ZeroOne@lemmy.world · 3 points · 4 hours ago

    So Elon has lost control of Grok 😂

    Hey, do you guys think AIs will be benevolent towards us?

  • turdburglar@lemmy.zip · 35 points · 8 hours ago

    grok is also running off generators in south memphis, polluting the air for the humans that live near elon’s penis computer.

    elon is also using the clean drinking water of the memphis sand aquifer to cool his penis computer rather than using grey water from the plant that he promised to build but has yet to break ground on.

    please do not use grok. it is literally poisoning the people of memphis.

  • nucleative@lemmy.world · 24 points · 8 hours ago

    “Working on it” means he forwarded a screenshot to somebody who works for him, along with a bunch of “???”.

    Meanwhile, depending on office politics, that guy will unfortunately have to spend the next 3 months figuring out how to alter the facts, or just suppress the data the AI produces that the boss doesn’t like.

  • jjjalljs@ttrpg.network · 60 points · 12 hours ago

    I feel like shithead Musk always says “working on it,” but I would be surprised if he ever does any meaningful work.

    He’s just such a disgrace. Like a fractal shit. No matter what part of him and his life you look at, no matter how zoomed in or out, it’s just shit. Fuck that guy.

    • AreaKode@lemmy.world · 38 up / 13 down · 12 hours ago

      That’s what I love about LLMs. They aren’t intelligent. They’re just really good at recognizing patterns. That’s why objective facts are always presented correctly. Most of the pattern points at the truth. To avoid this, they will have to add specific prompts to lie about this exact scenario. For the next similar fact, they’ll have to manually code around that one too. LLMs are very good at finding the overwhelming truth.

      • stoy@lemmy.zip · 16 up / 1 down · 9 hours ago

        I asked ChatGPT to describe the abandoned railway line between Åkersberga and Rimbo. It responded with a list of stations and descriptions, and explained the lack of photos and limited information as being due to the stations being small and only open for a short while.

        My explanation is that there has never been a direct railway line between Åkersberga and Rimbo, and that ChatGPT was just lying.

        • lime!@feddit.nu · 6 points · edited · 7 hours ago

          it’s not lying, because it doesn’t know truth. it just knows that text like that is statistically likely to be followed by text like this. any assumptions made by the prompt (e.g. there is an old railway line) are just taken at face value.

          also, since there has indeed been a railway connection between them, just not direct, that may have been part of the assumption.
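
          a toy sketch of that idea, for the curious. the vocabulary, the probabilities, and the continue_text helper are all invented for illustration; real models condition on the whole context with a neural net, but the principle of “sample a statistically likely next word” is the same:

          ```python
          import random

          # toy "what word tends to follow this one" table. the words and
          # probabilities are made up, standing in for the patterns a real
          # model absorbs from its training text.
          NEXT = {
              "abandoned": [("railway", 0.7), ("station", 0.3)],
              "railway":   [("line", 0.5), ("station", 0.3), ("museum", 0.2)],
              "line":      [("between", 0.6), ("opened", 0.4)],
          }

          def continue_text(prompt_words, max_new=5):
              """extend the prompt by sampling likely next words.
              nothing here checks whether the prompt's premise is true."""
              out = list(prompt_words)
              for _ in range(max_new):
                  options = NEXT.get(out[-1])
                  if not options:
                      break
                  words, weights = zip(*options)
                  out.append(random.choices(words, weights=weights)[0])
              return " ".join(out)

          # a false premise is continued just as fluently as a true one:
          print(continue_text(["the", "abandoned"]))
          # e.g. -> "the abandoned railway line between"
          ```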

      • ZDL@lazysoci.al · 43 points · 12 hours ago

        > That’s why objective facts are always presented correctly.

        Here’s me looking at the hallucinated discography of a band that never existed and nodding along.

        • Honytawk@feddit.nl · 2 up / 2 down · edited · 4 hours ago

          There are no objective facts about a band that never existed; that is the point.

          Ask them about things that do have overwhelming amounts of information, and you will see the answers are much more accurate.

          • ZDL@lazysoci.al · 4 points · 4 hours ago

            But not 100%. And the things they hallucinate can be very subtle. That’s the problem.

            If they are asked about a band that does not exist, to be useful they should be saying “I’m sorry, I know nothing about this”. Instead they MAKE UP A BAND, ITS MEMBERSHIP, ITS DISCOGRAPHY, etc. etc. etc.

            But sure, let’s play your game.

            All of the information on Infected Rain is out there, including their lyrics. So is all of the information on Jim Thirlwell’s various “Foetus” projects. Including lyrics.

            Yet ChatGPT, DeepSeek, and Claude will all three hallucinate tracks, misattribute them, or invent lyrics that don’t exist to show parallels in the respective bands’ musical themes.

            So there’s your objective facts, readily available, that LLMbeciles are still completely and utterly fucking useless for.

            So they’re useless if you ask about things that don’t exist and will hallucinate them into existence on your screen.

            And they’re useless if you ask about things that do exist, hallucinating attributes that don’t exist onto them.

            They. Are. Fucking. Useless.

            That people are looking at these things and saying “wow, this is so accurate” terrifies the living fuck out of me because it means I’m surrounded not by idiots, but by zombies. Literally thoughtless mobile creatures.

          • ZDL@lazysoci.al · 7 points · 10 hours ago

            I made the band up to see if LLMbeciles could spot that this is not a real band.

            Feel free to look up the band 凤凰血, though, and tell me how “underground” it is.

            • agamemnonymous@sh.itjust.works · 1 up / 2 down · edited · 9 hours ago

              Does this count?

              Also, by nature of being underground they would be difficult to look up. Some bands have no media presence, not even a Bandcamp or a SoundCloud.

              • ZDL@lazysoci.al · 3 points · 6 hours ago

                Nope.

                You can tell because they’re not even in the same writing system. Future tip there.

              • smiletolerantly@awful.systems · 3 points · 8 hours ago

                Are you having this argument on the principle of defending the undergrounded-ness of bands, or do you actually believe LLMs always get the facts straight?

                • agamemnonymous@sh.itjust.works · 1 up / 2 down · 7 hours ago

                  Eh, more of an exercise in scientific skepticism. It’s possible that an obscure band with that name was mentioned deep in some training data that’s not going to come up in a search. LLMs certainly hallucinate, but not always.

      • shalafi@lemmy.world · 12 up / 2 down · 11 hours ago

        I use ChatGPT once a day or so. Yeah, it’s damned good at simple facts, more than Lemmy will ever admit. Yeah, it’ll easily make shit up if there’s no answer to be had.

        We should have started teaching tech literacy and objective analysis 20 years ago. FFS, by 2000 I had figured out that “if it sounds like bullshit, it likely is. Look more.”

        Also, after that post, I’m surprised this site hasn’t taken you out back and done an Ol’ Yeller on ya. :)