• Kerfuffle@sh.itjust.works
      1 year ago

      If we’re talking about something like LLaMA (i.e. a model people can run locally), then it’s impossible to do that directly. A model can’t collect data, gather metrics, phone home, or do anything like that by itself. The article sounds like it’s talking about that kind of thing, not about providing a service that gives people access to a model (along the lines of ChatGPT).

    • zekiz@lemmy.ml
      1 year ago

      LLaMA is only free for personal use. Also, the actual model weights were leaked about a week after the paper was published.