Couldn’t make this shit up if I tried.

  • brucethemoose@lemmy.world · 14 days ago

    Dual 3060s are an option. LLMs can be split across GPUs reasonably well.
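    As a sketch of what that split looks like in practice with llama.cpp (the `-ngl` and `--tensor-split` flags are real llama.cpp options; the model filename is just a placeholder):

    ```shell
    # Offload all layers to GPU, split tensors evenly across two cards
    # (e.g. dual 12GB 3060s). Adjust the ratio if the cards differ.
    ./llama-server -m model-Q4_K_M.gguf \
      -ngl 99 \
      --tensor-split 1,1
    ```

    Other runtimes have equivalents, e.g. Hugging Face `device_map="auto"` spreads layers across whatever GPUs it finds.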

    3090s used to be around $700 used, but ironically they’ve gone up in price. I got mine for around $800 a while ago and stuffed it into a 10L PC.

    Some people buy used P40s. There are rumors of a 24GB Arc B580. Also, AMD Strix Halo APU laptops/mini PCs can host it quite well, with the right software setup… I might buy an ITX board if anyone ever makes one.

    There are also distillations that fit in 12GB or even 6GB of VRAM, but 24GB is a huge intelligence step-up.
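    The VRAM tiers follow from a rough rule of thumb: weight memory is roughly parameter count times bits per weight, divided by 8 (this ignores KV cache and activation overhead, and the function name here is my own):

    ```python
    def approx_weight_gb(params_billions, bits_per_weight):
        # params (in billions) × bits per weight / 8 bits-per-byte ≈ GB of weights
        return params_billions * bits_per_weight / 8

    # A 32B model at 4-bit quantization needs ~16 GB of weights alone,
    # hence the 24GB-card recommendation once cache overhead is added.
    print(approx_weight_gb(32, 4))   # → 16.0
    # Smaller distills line up with the 12GB/6GB tiers:
    print(approx_weight_gb(14, 4))   # → 7.0
    print(approx_weight_gb(8, 4))    # → 4.0
    ```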