doodle967@lemdro.id to Privacy@lemmy.ml · English · 1 year ago
Snowden: "They've gone full mask-off: do not ever trust OpenAI or its products" (twitter.com)
Knock_Knock_Lemmy_In@lemmy.world · 1 year ago: "You should have at least 16 GB of RAM available to run the 13B models." Is this GPU RAM or CPU RAM?
KillingTimeItself@lemmy.dbzer0.com · 1 year ago: Likely GPU RAM. There is some tech that can offload to system RAM, but generally it's all hosted in VRAM. This requirement will likely fade as NPUs start becoming a thing, though.
MalReynolds@slrpnk.net · 1 year ago: Either works, but system RAM is at least an order of magnitude slower: more play-by-mail than chat…
reddithalation@sopuli.xyz · 1 year ago: Pretty sure it can run on either, but CPUs are slow compared to GPUs, often to the point of being impractical.