Best GPUs for self-hosted AI?
tehnomad@alien.top to Self-Hosted Main@selfhosted.forum · 11 months ago
The best consumer NVIDIA card is the 3090 Ti because of its 24GB of memory, which lets you run bigger LLMs. I have a 12GB 3060 Ti, which works pretty well with 7B and 13B models.
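To see why 12GB handles 7B/13B models while bigger ones need 24GB, here's a rough back-of-the-envelope VRAM estimator. The formula and the 20% overhead factor are my own rule-of-thumb assumptions (weights dominate, plus some headroom for the KV cache and CUDA buffers), not exact numbers:

```python
# Rule-of-thumb VRAM estimate for running a quantized LLM.
# Assumption: memory ~= weight size plus ~20% overhead for
# KV cache and runtime buffers (rough heuristic, not exact).

def estimate_vram_gb(params_billions: float, bits_per_weight: int,
                     overhead: float = 1.2) -> float:
    weight_gb = params_billions * bits_per_weight / 8  # GB for weights alone
    return weight_gb * overhead

for size in (7, 13, 33):
    for bits in (16, 8, 4):
        print(f"{size}B @ {bits}-bit ~ {estimate_vram_gb(size, bits):.1f} GB")
```

By this estimate a 13B model at 4-bit quantization needs roughly 8GB, which is why it fits on a 12GB card, while a 33B model at 4-bit lands around 20GB and really wants the 24GB of a 3090 Ti.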