doodle967@lemdro.id to Privacy@lemmy.ml · English · 11 months ago

Snowden: "They've gone full mask-off: do not ever trust OpenAI or its products"

twitter.com (x.com) · 606 points · 174 comments
cross-posted from: https://lemmy.smeargle.fans/post/182373

HN Discussion

  • classic@fedia.io · 14 points · 11 months ago

    Is there a magazine or site that breaks this down for the less tech-savvy? And is the quality of the AI on par?

    • utopiah@lemmy.ml · 21 points · 11 months ago

      Check my notes at https://fabien.benetou.fr/Content/SelfHostingArtificialIntelligence, but as others suggested, a good way to start is probably https://github.com/ollama/ollama/ and, if you need a GUI, https://gpt4all.io
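For anyone who wants to try the Ollama route, the basic workflow is only a couple of commands; a minimal sketch (the model name here is just an example, pick whatever fits your hardware):

```shell
# Install on Linux/macOS (see https://github.com/ollama/ollama for other options)
curl -fsSL https://ollama.com/install.sh | sh

# Download a model, then chat with it
ollama pull mistral
ollama run mistral "Summarize why local LLMs matter for privacy."
```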

      • irreticent@lemmy.world · 5 points · 11 months ago

        I’m not the person who asked, but still thanks for the information. I might give this a try soon.

        • classic@fedia.io · 2 points · 11 months ago

          Ditto, thanks to everyone for their suggestions

      • Knock_Knock_Lemmy_In@lemmy.world · 1 point · 11 months ago

        > You should have at least 16 GB of RAM available to run the 13B models

        Is this GPU RAM or CPU RAM?

        • KillingTimeItself@lemmy.dbzer0.com (banned) · English · 2 points · 11 months ago

          Likely GPU RAM. There is some tech that can offload to system RAM, but generally it's all hosted in VRAM. This requirement will likely fade as NPUs start becoming a thing, though.
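As a rough sanity check on that 16 GB figure: the weights dominate memory use, so total need is roughly parameter count times bytes per weight plus some runtime overhead. A back-of-the-envelope sketch (the 20% overhead for KV cache and buffers is an illustrative assumption):

```python
def model_memory_gb(n_params: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    """Approximate memory needed to hold model weights, with ~20% overhead
    assumed for the KV cache, activations, and runtime buffers."""
    bytes_total = n_params * bits_per_weight / 8 * overhead
    return bytes_total / 1e9

# A 13B-parameter model at different quantization levels:
for bits, name in [(16, "fp16"), (8, "int8"), (4, "4-bit")]:
    print(f"{name}: ~{model_memory_gb(13e9, bits):.1f} GB")
```

By this estimate, full fp16 needs over 30 GB, while 8-bit or 4-bit quantization brings a 13B model under the 16 GB guidance, which is why quantized models are the usual choice for consumer hardware.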

        • MalReynolds@slrpnk.net · English · 1 point · 11 months ago

          Either works, but system RAM is at least an order of magnitude slower; more play-by-mail than chat…

        • reddithalation@sopuli.xyz · 1 point · 11 months ago

          Pretty sure it can run on either, but CPUs are slow compared to GPUs, often to the point of being impractical.

    • JPAKx4@lemmy.blahaj.zone · 9 points · 11 months ago

      On par? No. Good enough? Definitely. Ollama baby

    • Possibly linux@lemmy.zip · English · 6 up / 2 down · 11 months ago

      Ollama with LLaVA and Mistral

    • aStonedSanta@lemm.ee · 3 up / 1 down · 11 months ago

      Your best bet is YouTubing ollama.

Privacy@lemmy.ml

You are not logged in. However, you can subscribe from another Fediverse account, for example Lemmy or Mastodon. To do this, paste the following into the search field of your instance: !privacy@lemmy.ml

A place to discuss privacy and freedom in the digital world.

Privacy has become a very important issue in modern society. With companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.

In this community everyone is welcome to post links and discuss topics related to privacy.

Some Rules

  • Posting a link to a website containing tracking isn't great; if the contents of the website are behind a paywall, consider copying them into the post
  • Don’t promote proprietary software
  • Try to keep things on topic
  • If you have a question, please try searching for previous discussions, maybe it has already been answered
  • Reposts are fine, but should have at least a couple of weeks in between so that the post can reach a new audience
  • Be nice :)

Related communities

  • Lemmy.ml libre_culture
  • Lemmy.ml privatelife
  • Lemmy.ml DeGoogle
  • Lemmy.ca privacy

much thanks to @gary_host_laptop for the logo design :)

Visibility: Public

This community can be federated to other instances and be posted/commented in by their users.

  • 321 users / day
  • 2.76K users / week
  • 7.14K users / month
  • 16.5K users / 6 months
  • 1 local subscriber
  • 38K subscribers
  • 3.33K Posts
  • 44.3K Comments
  • Modlog
  • mods:
  • k_o_t@lemmy.ml
  • tmpod@lemmy.pt
  • ranok@sopuli.xyz
  • Yayannick@lemmy.ml