How to Run a Local LLM in Linux

I am using LMDE 6 and an Nvidia RTX 5000 with 16 GB of VRAM to run various local LLM models. I have been following this guide with good success: How to Run a Local LLM with Ubuntu

Other links:

https://ollama.com/library/phi4
https://www.baeldung.com/linux/genai-ollama-installation
https://community.aws/content/2eojjD2E7TBgPFJmB2FGAtrSSBh/the-rise-of-the-llm-os-from-aios-to-memgpt-and-beyond?lang=en
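For reference, the basic workflow from the guides above boils down to installing Ollama and pulling a model. This is a minimal sketch using Ollama's documented install script and CLI; the phi4 model name comes from the Ollama library link above, and any other model tag from the library can be substituted.

```shell
# Install Ollama using the official install script (runs as root to set up the service)
curl -fsSL https://ollama.com/install.sh | sh

# Pull the phi4 model from the Ollama library
ollama pull phi4

# Run the model interactively in the terminal
ollama run phi4

# List locally downloaded models to confirm the pull succeeded
ollama list
```

On an Nvidia card, Ollama will use the GPU automatically when the proprietary driver and CUDA libraries are present; a 16 GB card comfortably fits phi4's default quantization.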
