"does Pi4 support ollama?"
Have not tried yet, this Pi5 has taken over my desk.

LLMs do take up a bunch of memory, but I run Ollama in a terminal.
So a Pi4 with Bookworm Lite and no desktop might run the smaller LLMs.
I would start testing with phi, which is 1.6GB; orca-mini is 2.0GB.
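For anyone wanting to try this on a Lite image, a minimal sketch of the install and first run might look like this (using the official install script and model names from ollama.com; whether the Pi4 has enough memory for a given model is exactly what needs testing):

```shell
# Install Ollama using the official install script from ollama.com
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a small model from the terminal
# (phi is about a 1.6GB download, orca-mini about 2.0GB)
ollama run phi
# or
ollama run orca-mini
```

The first `ollama run` downloads the model, so expect a wait; after that it starts from the local copy.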
I need to test on a Pi4 as I have a few spare; I could network them and have them on 24/7.
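If the spare Pi4s were networked like that, one way to use them is Ollama's built-in HTTP server, reached from other machines on the LAN. A rough sketch (the hostname `pi4.local` is just a placeholder for whatever the Pi is called on your network):

```shell
# On the Pi4: expose the Ollama server on all interfaces
# (by default it only listens on localhost)
OLLAMA_HOST=0.0.0.0 ollama serve

# From another machine on the LAN, query it over the REST API
# (replace pi4.local with the Pi's hostname or IP address)
curl http://pi4.local:11434/api/generate \
  -d '{"model": "phi", "prompt": "Hello", "stream": false}'
```

That way the headless Pi4s do the model work and any desktop machine can send prompts to them.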
Some YouTubers had been testing Ollama on Pi4s well before me and got it to work.

I mostly copied them with my Pi5 and expected to have trouble but it just worked.
Lots of things just work now with a Pi5.
If you are in x11

Statistics: Posted by Gavinmc42 — Thu Mar 07, 2024 5:04 am