
Advanced users • Re: PI5 and easy AI/CV/LLM

does the Pi4 support Ollama?
Have not tried yet; this Pi5 has taken over my desk :D

LLMs do take up a bunch of memory, but I run Ollama in a terminal.
So a Pi4 with Bookworm Lite and no desktop might run the smaller LLMs.
I would start testing with phi, which is 1.6GB; orca-mini is 2.0GB.
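
For anyone who wants to drive it from a script rather than the interactive prompt, here is a rough sketch of calling Ollama's local REST API from Python. It assumes Ollama is already serving on its default port 11434 and that the model has been fetched beforehand with "ollama pull phi"; the ask() helper name is just made up for the example.

    # Minimal sketch: ask a local Ollama server to run the "phi" model.
    # Assumes the Ollama service is listening on localhost:11434 (its default)
    # and that "ollama pull phi" has already been run.
    import json
    import urllib.request

    def ask(prompt, model="phi", host="http://localhost:11434"):
        req = urllib.request.Request(
            host + "/api/generate",
            data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            # With stream=False the server returns one JSON object;
            # the generated text is in its "response" field.
            return json.loads(resp.read())["response"]

    print(ask("Why is the sky blue?"))

Pointing host at another box on the LAN would let the same snippet query a headless Pi4 running Ollama.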

I need to test on a Pi4, as I have a few spares; I could network them and have them on 24/7.
Some YouTubers were testing Ollama on Pi4s way before me and got it to work ;)
I mostly copied them with my Pi5 and expected to have trouble, but it just worked.

Lots of things just work now with a Pi5.
If you are in X11 :lol:

Statistics: Posted by Gavinmc42 — Thu Mar 07, 2024 5:04 am


