Alpaca app for artificial intelligence

I wrote about this in How-to: Installing AI to L5 and running it locally offline with ollama.ai. The gist is that the L5 has limited resources, so you want to pick a model that is small in size, and even then it will be slow. I recommend trying tinyllama first (purely from a size standpoint), or anything around 1 GB (you can probably push that to 2 or 3 GB, but I'm not sure at what point the L5 simply runs out of steam; with model sizes, memory is usually the limit).
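As a rough sketch of what this looks like in practice (assuming Ollama is already installed per the how-to above; the ~640 MB size for tinyllama is an assumption based on its quantized download):

```shell
# Memory is usually the limit on the L5, so check what is free first
free -h

# Pull and run a small model; skip gracefully if ollama is not installed
if command -v ollama >/dev/null; then
  ollama pull tinyllama   # small model, roughly 640 MB download
  ollama run tinyllama    # interactive chat in the terminal; /bye to exit
fi
```

The `free -h` check is the quickest way to see how much headroom you have before trying a larger model.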

Then, there are different models for different uses, and you can find model card databases that describe how each model was made and what it is for (as well as what license it uses and how open it is).
Ollama's list is here (see also the tab for variations of each model): library
More in-depth info on models is in Hugging Face's database (where Ollama mostly gets them): Models - Hugging Face

Btw, using the Alpaca GUI from Flathub may add some overhead on your L5, so you may need to use slightly smaller models than Ollama can handle from the command line (I haven't compared them yet). That said, Alpaca is by far the more user-friendly option.