• FormallyKnown
    12 days ago

    A local LLM not using llama.cpp as the backend? Daring today, aren’t we?

    I wonder how its performance compares.