• farsinuce
    6 hours ago

    OLMo 2 32B is the first fully-open model to outperform GPT3.5-Turbo and GPT-4o mini on a suite of popular, multi-skill academic benchmarks.

    So its performance is roughly a year behind the proprietary state-of-the-art models. Not bad for a fully open model.

    Source: https://allenai.org/blog/olmo2-32B