I equate it with doing those old formulas by hand in math class. If you don’t know what the formula does or how to use it, how do you expect to recall the right tool for the job?
Or in D&D speak, it’s like trying to shoehorn intelligence into a wisdom roll.
That would be fine if an LLM were a precise tool like a calculator. My calculator doesn’t pretend to know the answers to questions it doesn’t understand.
Mine just lies to me.
And tells me to kill.
Leaks weird fluids. Looks and feels like blood, but smells like lavender honey, possessed of a taste like unexpectedly cutting yourself on broken glass as you escape parental discipline to meet a lover.
Hasn’t screamed in a while, though. So that’s nice. I guess if I keep it satisfied, I have to explain a lot less to my neighbors.
The irony is that LLMs are basically just calculators: horrendously complex calculators that operate purely on statistics…
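To make that quip a bit more concrete, here’s a minimal sketch of the “purely on statistics” part, using a made-up toy vocabulary and hard-coded probabilities (nothing here reflects any real model): the whole act is assigning probabilities to possible next tokens and rolling weighted dice, no understanding required.

import random

# Toy "language model" for illustration only: maps a context word
# to probabilities over candidate next tokens (all values invented).
TOY_NEXT_TOKEN_PROBS = {
    "the": {"cat": 0.5, "calculator": 0.3, "lavender": 0.2},
    "cat": {"sat": 0.6, "screamed": 0.3, "lies": 0.1},
}

def sample_next_token(context: str) -> str:
    """Pick the next token by sampling from the probability table.

    This is the whole trick: no comprehension, just weighted dice.
    """
    probs = TOY_NEXT_TOKEN_PROBS.get(context, {"...": 1.0})
    tokens = list(probs.keys())
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

if __name__ == "__main__":
    # Generate a short continuation from a seed word.
    word = "the"
    output = [word]
    for _ in range(3):
        word = sample_next_token(word)
        output.append(word)
    print(" ".join(output))

Scale the table up by a few hundred billion parameters and the output starts to look like answers, but the mechanism is still the same: sampling from a distribution, which is why it will cheerfully produce confident nonsense when the statistics point that way.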