• Flames5123@sh.itjust.works · 1 day ago

    I asked my work’s AI to just give me a comma-separated list of strings that I gave it, and it returned a list where every string was “CREDIT_DEBIT_CARD_NUMBER”. The numbers were 12 digits, not 16. I asked three times for the raw numbers and had to say exactly “these are 12 digits long, not 16. Stop obfuscating it” before it gave me the right thing.

    I’ve even had it be wrong about simple math. It’s just awful.

    • catloaf@lemm.ee · 1 day ago

      Yeah, because it’s a text generator. You’re using the wrong tool for the job.

      • Flames5123@sh.itjust.works · 22 hours ago

        Exactly. But they tout this as “AI” instead of an LLM. I need to improve my kinda-OK regex skills. They’re already better than almost anyone else’s on my team, but I can still improve.
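
        For illustration, here’s a minimal regex sketch of the task from the original comment, pulling out 12-digit numbers and joining them with commas without involving an LLM. The function name and sample numbers are made up for the example, not taken from the thread.

            import re

            def comma_separate_numbers(raw_text: str) -> str:
                """Extract every 12-digit run and join the results with commas.

                A deterministic alternative to asking an LLM to echo the numbers
                back (and having it redact them as CREDIT_DEBIT_CARD_NUMBER).
                """
                # \b word boundaries avoid matching 12-digit slices of longer numbers.
                numbers = re.findall(r"\b\d{12}\b", raw_text)
                return ", ".join(numbers)

            if __name__ == "__main__":
                # Hypothetical sample input, just to show the shape of the output.
                sample = """
                order 123456789012 shipped
                order 987654321098 pending
                """
                print(comma_separate_numbers(sample))
                # -> 123456789012, 987654321098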

    • orca@orcas.enjoying.yachts · 1 day ago

      It’s really crappy at trying to address its own mistakes. I find that it gets into an infinite error loop, hopping between 2–4 answers, none of which are correct. Sometimes it helps to explicitly instruct it to format the data as provided and not edit it in any way, but I still get paranoid.

    • kameecoding@lemmy.world · 20 hours ago

      Either you’re bad at ChatGPT or I’m a machine whisperer, but I have a hard time believing Copilot couldn’t handle that. I regularly have it rewrite SQL code, reformat Java code, etc.