For real. Asked ChatGPT to help with a batch script. Literally the easiest thing it could possibly program. And guess what? Almost every example it gave was wrong. It seemed to know a lot about the command-line program I was trying to run, but it didn’t really understand what it was repeating to me.
"Oh I'm sorry! You're right mxzlptx.jar was deprecated in 2008. Here's the actual script that will definitely work 100%."
*INCORRECT BUZZER*
It kept fucking up double quotes and escape characters. Like, if it can’t get that right in a batch file, it’s hopeless.
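To be fair, Windows quoting is genuinely fiddly. Here's a quick toy sketch in Python of why (my own illustration with made-up arguments, not anything ChatGPT produced): naive string joining loses the quoting entirely, while the actual rules that subprocess.list2cmdline implements are non-obvious.

```python
# Toy illustration of Windows command-line quoting (hypothetical args).
import subprocess

args = ["copy", r"C:\My Files\in.txt", 'say "hi"']

# Naive joining: the space in the path splits the argument,
# and the embedded quotes leak through unescaped.
naive = " ".join(args)

# list2cmdline applies the MS C runtime quoting rules instead.
proper = subprocess.list2cmdline(args)

print(naive)   # copy C:\My Files\in.txt say "hi"
print(proper)  # copy "C:\My Files\in.txt" "say \"hi\""
```

And that's before cmd.exe piles its own layer on top (caret escapes for special characters, doubled percent signs for a literal percent), so no surprise a model pattern-matching on examples mixes the layers up.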
I like it when you ask it for help and it tells you to import a library that would solve your problem exactly, and then you google it and it doesn’t exist.
I once tried to use it to find the title of a book my wife enjoyed in her childhood based on her description of the plot. It described a possible book that could be written with that content and even suggested a title.
Cool, thanks.
What gets me is video game info. It’s pulling the info from a wiki, which is accurate, yet it still manages to rearrange that factual information into bullshit that isn’t even remotely accurate.
Because it doesn’t understand any words at all. Everything is a web of probabilities that it can attach believable words to. That’s the magic.
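You can see the trick at toy scale with a bigram sampler (my own sketch, nothing like the real architecture, just the same idea shrunk way down): it strings words together purely by which word followed which in its "training" text, so it happily swaps facts that are statistically interchangeable.

```python
# Minimal "web of probabilities": a bigram model over a tiny corpus.
import random
from collections import Counter, defaultdict

corpus = ("the game saves at every checkpoint and "
          "the game loads at every checkpoint").split()

# Count which word follows which in the corpus.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

word = "the"
out = [word]
for _ in range(8):
    counts = following[word]
    if not counts:
        break
    # Sample the next word in proportion to how often it followed.
    words, weights = zip(*counts.items())
    word = random.choices(words, weights=weights)[0]
    out.append(word)

print(" ".join(out))  # e.g. "the game loads at every checkpoint and the game"
```

Every output reads fine; whether you get "saves" or "loads" is a coin flip, which is exactly the wiki-remixing failure above.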
Facts, reason, intelligence? Nope.
They definitely have it working well enough that it can create coherent sentences that are entirely readable and understandable. Now if they could just make the text it generates actually factually accurate, they might have a useful tool. I still wouldn’t call it intelligence, though. Spicy auto-summarize still would not think and reason for itself.
It’s so scary that the top result in Google is an AI response now.
Lazy, unaware people will just assume it’s correct.
Could be, but my (limited) experience with ChatGPT and PHP was quite different. This was in 2023, so it may have deteriorated since.
That version most likely burned the energy equivalent of the sun’s output to run.