SurpriZe@lemm.ee to Asklemmy@lemmy.ml · 28 days ago

What model do you use in your GPT4all?
Curious about what model is best to use on my RTX 3080 + Ryzen 5 3600 since I’ve just found out about this.
geneva_convenience@lemmy.ml · 27 days ago

Llama 3.1 8B; the other versions are too big to run on a GPU.
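The size claim above can be sanity-checked with a rough back-of-envelope VRAM estimate. This is a minimal sketch, assuming the standard 10 GB RTX 3080, 4-bit quantization (~0.5 bytes per parameter, the usual Q4 GGUF case in GPT4All), and a hypothetical flat overhead figure for KV cache and activations; the exact numbers vary by quantization and context length.

```python
def est_vram_gb(params_b, bytes_per_param=0.5, overhead_gb=1.5):
    """Rough VRAM estimate in GB: quantized weights plus a flat
    overhead allowance for KV cache/activations (assumed values)."""
    return params_b * bytes_per_param + overhead_gb

RTX_3080_VRAM_GB = 10  # standard 10 GB variant

for size_b in (8, 70):
    need = est_vram_gb(size_b)
    verdict = "fits" if need <= RTX_3080_VRAM_GB else "too big"
    print(f"Llama 3.1 {size_b}B @ ~4-bit: ~{need:.1f} GB -> {verdict}")
```

Under these assumptions the 8B model needs roughly 5.5 GB and fits comfortably, while 70B needs well over 30 GB, which matches the comment: only the 8B variant is practical on a 3080.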