• shadow@lemmy.sdf.org
      9 hours ago

      Any relatively new gaming PC from the last, what, 4 years? has enough power to run local LLMs. Maybe not the ginormous 70GB behemoth models, but the toned-down ones are pretty damn good, and if you don’t mind waiting a few seconds while it thinks, you can run one completely locally, as much as you want and whenever you want.
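To ballpark what "enough power" means: a rough rule of thumb is parameters × bits-per-weight / 8, plus some runtime overhead. A minimal sketch (the 1.2× overhead factor and 4-bit quantization are assumptions, not exact figures):

```python
def model_size_gb(n_params_billion, bits_per_weight=4, overhead=1.2):
    """Approximate memory footprint in GB for a quantized LLM.

    bits_per_weight=4 matches common Q4-style quantization; the
    overhead factor is a rough allowance for the KV cache and
    runtime buffers (an assumption, not a measured value).
    """
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9 * overhead

# A "toned down" 8B-class model at 4-bit fits in roughly 5 GB:
print(round(model_size_gb(8), 1))   # → 4.8
# A 70B-class model needs far more, which is why it stays out of reach:
print(round(model_size_gb(70), 1))  # → 42.0
```

So an 8B-class model fits comfortably on a typical gaming PC, while the big ones need workstation-grade memory.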

    • MajorSauce@sh.itjust.works
      16 hours ago

      You would benefit from some GPU offloading; it would considerably accelerate the speed of the answers. But at the bare minimum, you only need enough RAM to load the model.
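The idea behind offloading is simple: as many transformer layers as fit go into VRAM, and the rest stay in system RAM on the CPU. A minimal sketch of that split (the per-layer size and layer count are hypothetical example numbers; the mechanism mirrors what runtimes like llama.cpp expose via their GPU-layers setting):

```python
def split_layers(n_layers, layer_gb, vram_gb):
    """Decide how many layers go to the GPU; the remainder runs on CPU/RAM.

    layer_gb is the approximate memory per layer after quantization
    (a hypothetical figure here, it varies by model and quant level).
    """
    gpu_layers = min(n_layers, int(vram_gb // layer_gb))
    cpu_layers = n_layers - gpu_layers
    return gpu_layers, cpu_layers

# A 32-layer 7B-class model at ~0.13 GB per layer, on a 4 GB card:
print(split_layers(32, 0.13, 4))  # → (30, 2)
# With an 8 GB card the whole model fits in VRAM:
print(split_layers(32, 0.13, 8))  # → (32, 0)
```

Even offloading most (not all) layers helps a lot, since the few CPU-resident layers become the only slow part of each token's forward pass.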