Just why. This was like the third time it reminded me that I shouldn’t have the phone I have.
I really don’t understand the public desire to make these AIs like this.
Can we run our own version of these LLMs?
Check out llama.cpp sometime. It's FOSS and you can run it without crazy system requirements. There are different parameter sizes you can use; I think the 13B-parameter models only use about 8 GB of RAM, and the 25B ones use 16 GB. Definitely nowhere near as good as GPT, but still fun, and hopefully it will improve in the future.
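For anyone curious, getting it going is roughly like this. A rough sketch, assuming a Linux/macOS box with git and CMake; the model filename and quantization level below are just examples, not anything specific from this thread, and you'd grab an actual GGUF model file yourself (e.g. from Hugging Face):

```shell
# Clone and build llama.cpp (the binary names here match recent versions
# of the repo; older versions used ./main instead of llama-cli).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release

# Point it at a quantized GGUF model you've downloaded.
# "llama-2-13b.Q4_K_M.gguf" is a hypothetical example filename.
./build/bin/llama-cli -m models/llama-2-13b.Q4_K_M.gguf \
    -p "Explain self-hosting in one sentence."
```

Quantized models (the Q4/Q5 variants) are what keep the RAM numbers that low; the full-precision weights would need a lot more.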
Yes, there are self-hosted LLMs.
Check out !selfhosted@lemmy.world.
I hope that link worked.
It worked. Thanks!