• Ilovethebomb@lemm.ee · 69 points · 1 month ago

    I think creatives on Facebook overestimate their value to an extent. I use Facebook to see what my friends and family are doing, and because the sports I do are mostly arranged through Facebook pages.

    The webcomics etc I follow are cool, but if they stopped posting I wouldn’t really miss them all that much.

    • IllNess@infosec.pub · 35 points · 1 month ago

      Meta also owns Instagram. The comment in the comic would have been better directed at that platform.

    • AwkwardLookMonkeyPuppet@lemmy.world · 12 points · 1 month ago

      I actually stopped using Facebook because they started forcing all this other shit on me. Like you said, I used it to keep in contact with friends and family. The site has made that increasingly harder to do, to the point where it’s now 95% shit I never agreed to see. So I just stopped going there. It’s sad, because it was a great platform for friends and family.

      • Ilovethebomb@lemm.ee · 1 point · 1 month ago

        I think they eased up on that; I just didn’t log in for quite a while because it was all recommendations for meme pages.

    • DashboTreeFrog@discuss.online · 9 points · edited · 1 month ago

      Agree, Facebook is not the platform for creatives, especially not comic creators.

      I was a huge webcomic fan back in the days of Mega Tokyo, Real Life Comics, etc., when people could make a living hosting their own content on their own sites, with their own ad revenue and merch shops. Kinda hate that social media consolidation killed all that. I actually want to catch up with Something Positive, since the creator has been making podcast appearances and reminded me he exists. Seeing his site still looking the way it did a decade ago is like looking into a time machine, back to when the internet was better.

      Sorry for the rant, webcomics have just been on my mind and your comment helped open the floodgates 😅

    • WaxiestSteam69@lemmy.world · 3 points · 1 month ago

      I use it for the same reasons as you, and I also follow local news outlets and local government since Twitter went to crap. I follow a few special interest groups, but political posts, probably made by bots, are making most of these groups unusable. I couldn’t care less about that content; I go to other platforms for it.

  • dan@upvote.au · 38 up, 4 down · edited · 1 month ago

    Creators on Facebook do get paid though, at least if they’re big enough I guess 🤔 https://creators.facebook.com/earn-money

    Also the AI model Meta maintains (Llama) is the most powerful open-source model that anyone can use and even build their own commercial products on top of for free, so I’m not sure it’s accurate that nobody wants it?
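
    For anyone curious what “anyone can use it” looks like in practice, here’s a rough sketch with the Hugging Face transformers library. The checkpoint name and generation settings are just illustrative, and you need to accept Meta’s license on the model page first:

    ```python
    # Illustrative only: load a Llama checkpoint and generate text locally.
    # Assumes `pip install transformers torch` and an accepted Llama license
    # for the (hypothetical choice of) checkpoint below.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="meta-llama/Llama-3.1-8B-Instruct",  # placeholder checkpoint
        device_map="auto",                         # use GPU(s) if available
    )

    prompt = "Explain in one sentence what an open-weight model is."
    print(generator(prompt, max_new_tokens=64)[0]["generated_text"])
    ```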

      • Ephera@lemmy.ml · 15 points · 1 month ago

        Only the inference code of LLaMA (which runs the model) is open-source. The model itself is not, as you’re given neither the training data nor the model weights.

        • dan@upvote.au · 7 up, 1 down · 1 month ago

        I don’t know much about AI models, but that’s still more than other vendors are giving away, right? Especially "Open"AI. A lot of people just care if they can use the model for free.

          How useful would the training data be? Training of the largest Llama model was done on a cluster of over 100,000 Nvidia H100s, so I’m not sure how many people would want to repeat that.

          • baguettefish@discuss.tchncs.de · 7 points · 1 month ago

            Scientific institutions and governments could rent enough GPUs to train their own models, potentially with public funding and public accountability. It would also be nice to know whether the data Llama was trained on was literally just Facebook user data. I’m not really in the camp of “if user content is on my site, then the content belongs to me”.

          • Martineski@lemmy.dbzer0.com · 2 points · 1 month ago

            Without the same training data you wouldn’t be able to recreate the results, even with the computing power, so it’s not fully open source. The training data is part of the source needed to create the result, the LLM. It’s like having to write your own lines of code to make an open-source program work because the company doesn’t provide them.

          • brucethemoose@lemmy.world · 1 point · edited · 1 month ago

            > How useful would the training data be?

            Open datasets are getting much better (Tulu, as an instruct dataset/recipe, is a great example), but it’s clear the giants still have “secret sauce” that gives them at least a small edge over open datasets.

            There actually seems to be some vindication of using massively multilingual datasets as well, as the hybrid Chinese/English models are turning out very well.

          • brucethemoose@lemmy.world · 1 point · edited · 1 month ago

          It turns out these clusters are being used very inefficiently, seeing how Qwen 2.5 was trained with a fraction of the GPUs and is clobbering models from much larger clusters.

            One could say Facebook, OpenAI, X, and such are “hoarding” H100s but aren’t pressured to utilize them efficiently, since they’re so GPU-unconstrained.

          Google is an interesting case, as Gemini is getting better quickly, but they presumably use much more efficient/cheap TPUs to train.

        • ryedaft@sh.itjust.works · 1 point · 1 month ago

          How on earth would you distribute the model for inference without the weights? The gradients are obviously gone, so you can’t continue training the model. Maybe you could still do some kind of LoRA?
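
          Something like this is roughly what a LoRA pass on released weights looks like, by the way (sketch only; the model name, rank, and target modules are placeholder assumptions):

          ```python
          # Sketch only: attach LoRA adapters to released Llama weights so that
          # only the small adapter matrices are trained, not the base model.
          # Assumes `pip install transformers peft torch`.
          from transformers import AutoModelForCausalLM
          from peft import LoraConfig, get_peft_model

          base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.1-8B")  # released weights

          config = LoraConfig(
              r=8,                                  # low-rank adapter dimension
              lora_alpha=16,
              target_modules=["q_proj", "v_proj"],  # attention projections to adapt
              task_type="CAUSAL_LM",
          )
          model = get_peft_model(base, config)
          model.print_trainable_parameters()        # typically well under 1% of all parameters
          ```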

  • aleq@lemmy.world · 25 up, 8 down · 1 month ago

    I know there are some very loud and dedicated haters out there, but “that no one wants”? Bro needs to get out of his bubble.

    • Piatro@programming.dev · 10 up, 4 down · 1 month ago

      I haven’t heard anyone articulate anything compelling about consumer-marketed AI, so please tell me! There are loads of really good uses of AI (medical imaging seems really promising), but the ones I know about are so specialised that I can’t see why I would need “AI” in my day-to-day.

      • InFerNo@lemmy.ml · 2 up, 3 down · 1 month ago

        The parent council used it to whip up a Halloween story for our event; it fleshed the story out, which ended up saving time. It needed some shaving down, but nothing as intensive as writing out the entire story yourself for something that is essentially a one-time thing.

  • OsrsNeedsF2P@lemmy.ml · 5 points · edited · 1 month ago

    The goals are manifold; one example is having AI agents talk to your customers in your Instagram shop when they have questions.
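
    Conceptually (not our actual stack, just an illustration; the endpoint, model name, and FAQ entries are all made up), that kind of agent can be as simple as feeding the customer’s question plus the shop’s FAQ into a model:

    ```python
    # Conceptual sketch only: a shop Q&A "agent" that answers from a tiny FAQ.
    # The endpoint, API key, model name, and FAQ entries are invented for illustration.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-a-real-key")

    FAQ = {
        "shipping": "Orders ship within 2 business days.",
        "returns": "Returns are accepted within 30 days with a receipt.",
    }

    def answer(question: str) -> str:
        context = "\n".join(f"{topic}: {text}" for topic, text in FAQ.items())
        resp = client.chat.completions.create(
            model="llama-3.1-8b-instruct",  # placeholder model name
            messages=[
                {"role": "system", "content": f"Answer using only this shop FAQ:\n{context}"},
                {"role": "user", "content": question},
            ],
        )
        return resp.choices[0].message.content

    print(answer("How long does shipping take?"))
    ```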

    Another example is internal tools at Meta; obviously I can’t go into too much detail, but the AI tools help the development workflow a lot.

    AI also pairs well with the smart glasses like the Raybans or Orion. You might not like it, but having your glasses explain your health insurance at the hospital or be a personal fitness coach at the gym are actually very helpful use cases.

    Source: I work at Meta, and am very bullish on the future of AI