• Domi@lemmy.secnd.me · 13 hours ago

    Hosting a model of that size requires ~800GB of VRAM. Even if they released their models, it wouldn't make them obsolete, since most people and many companies couldn't host them anyway.
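
    For scale, here's a rough back-of-envelope for where a figure like ~800GB can come from. The parameter count and precision below are assumptions for illustration, not published figures:

    ```python
    # Rough VRAM estimate for serving a large LLM: weights only,
    # ignoring KV cache, activations, and framework overhead.

    def weights_vram_gb(params_billions: float, bytes_per_param: float) -> float:
        """Memory needed just to hold the model weights, in GB."""
        return params_billions * 1e9 * bytes_per_param / 1e9

    # Assumed: a ~400B-parameter model served at FP16 (2 bytes/param).
    print(weights_vram_gb(400, 2))    # -> 800.0 GB of weights alone
    # The same model quantized to 4-bit needs roughly a quarter of that:
    print(weights_vram_gb(400, 0.5))  # -> 200.0 GB
    ```

    Even the quantized figure is far beyond consumer hardware, which is the point: releasing the weights doesn't mean most people could run them.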

    • rcbrk@lemmy.ml · 1 hour ago (edited)

      Anyone can now provide that service. Why pay OpenAI when you can pay a different provider that is cheaper, or better aligned with your needs, ethics, or legal requirements?