• Bob Robertson IX@lemmy.world
    1 day ago

    Unfortunately I think most businesses will still prefer that their AI solution is hosted by a company like OpenAI rather than maintaining their own. There’s still going to be a need for these large data centers, but I do hope most people realize that hosting your own LLM isn’t that difficult, and it doesn’t cost you your privacy.

    • Kongar@lemmy.dbzer0.com
      22 hours ago

      The cost is insane though. I think there’s a disconnect between what they want and what they can afford. It’s something like a 10x adder per user license to go from regular Office 365 to a Copilot-enabled account. I know my company wants it hosted in the cloud, but we aren’t going to pay the going rates. It’s insane.

      Meh, we’ll see. But I do wonder what happens once these models get packaged up as an easier-to-install program.

      • Bob Robertson IX@lemmy.world
        19 hours ago

        Anyone running a newer MacBook Pro can install Ollama and run a model with just a few commands:

        brew install ollama

        brew services start ollama  # starts the background server

        ollama run deepseek-r1:14b

        Then you can use it at the terminal, but it also has API access, so with a couple more commands you can put a web front end on it, or with a bit more effort you can add it to a new or existing app/service/system.
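
        For example, Ollama’s REST API listens on localhost:11434 by default, so a one-shot completion is a single HTTP POST to its /api/generate endpoint. A minimal sketch (the prompt text here is made up, and the curl line assumes the server from the previous step is running):

        ```shell
        # Ollama's REST API listens on localhost:11434 by default.
        # Build a one-shot (non-streaming) completion request for /api/generate.
        MODEL="deepseek-r1:14b"
        PAYLOAD=$(printf '{"model": "%s", "prompt": "Why is the sky blue?", "stream": false}' "$MODEL")

        # With the server running (`brew services start ollama` or `ollama serve`), send it:
        # curl -s http://localhost:11434/api/generate -d "$PAYLOAD"

        echo "$PAYLOAD"   # printed here so the snippet runs standalone
        ```

        The same endpoint is what a web front end or an existing app would call, so wiring it into another service is mostly a matter of pointing an HTTP client at that port.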