• fmstrat@lemmy.nowsci.com · 7 hours ago

    To anyone complaining about non-replaceable RAM: This machine is for AI, that is why.

    Think of it like a GPU with a CPU on the side, rather than the other way around.

    Inference requires very high RAM bandwidth, and that is (currently) only possible on soldered buses. Even this is fairly slow at 256 GB/s, but its 96 GB of RAM addressable by the GPU makes it interesting for larger models.
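
    As a rough illustration of why bandwidth is the bottleneck (a back-of-envelope sketch, not a benchmark; the model sizes and the assumption that decoding is purely memory-bandwidth-bound are mine):

        # Back-of-envelope: token generation on a dense model is roughly
        # memory-bandwidth-bound, since every weight is read once per token.
        # tokens/s ~ bandwidth / model size in bytes. Figures are illustrative.

        BANDWIDTH_GBPS = 256  # GB/s on the soldered memory bus

        def approx_tokens_per_sec(params_b: float, bytes_per_param: float) -> float:
            """Upper-bound decode speed for a dense model that fits in RAM."""
            model_gb = params_b * bytes_per_param
            return BANDWIDTH_GBPS / model_gb

        # A hypothetical 70B-parameter model at 8-bit (~70 GB) fits in 96 GB:
        print(f"{approx_tokens_per_sec(70, 1.0):.1f} tok/s")  # ~3.7 tok/s
        # The same model quantized to 4-bit (~35 GB):
        print(f"{approx_tokens_per_sec(70, 0.5):.1f} tok/s")  # ~7.3 tok/s

    The point: a 70 GB model simply cannot fit on a typical consumer GPU at all, so 96 GB of unified RAM trades speed for the ability to run it in the first place.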