Apparently Mozilla wants the right to get data from Firefox users. And not just general information: they want to know what data you upload or download through Firefox.

> Without it, we couldn’t use information typed into Firefox, for example.

What the fuck? I use Firefox because I want privacy! Not to share my information with a company.

> We need a license to allow us to make some of the basic functionality of Firefox possible.

WHY DO YOU NEED MY DATA TO MAKE FIREFOX WORK???

  • Vivendi@lemmy.zip · 22 hours ago

    Because they’re going to offer optional AI integration in the browser

    Look, we are in the minority. Lemmy has, what, a few thousand users at best. Most people want features like AI, and Firefox has to offer them to gain users instead of bleeding like a stuck pig as it is now

    And the sentiment against AI on lemmy absolutely sucks dick. I’m tempted to go back to Reddit because this place is luddite.ml at this point. People who’ve never spun up a local model on their GPU talk like they have trifecta PhD creds in comp sci

    • kipo@lemm.ee · 21 hours ago

      Do you think sucking dick is a bad thing?

    • pogmommy@lemmy.blahaj.zone · 21 hours ago

      Why the hell does my HTML renderer need a baked-in frontend for a privacy nightmare that requires me to sign over rights to every keystroke even if I don’t use it? If Mozilla really wants to roll it out themselves, make it an extension available for download and tie your abysmal terms of use to that; don’t upend your entire reputation for the latest big tech trend.

      > And the sentiment against AI on lemmy absolutely sucks dick. I’m tempted to go back to Reddit because this place is luddite.ml at this point. People who’ve never spun up a local model on their GPU talk like they have trifecta PhD creds in comp sci

      Oh poor you, how dare people appropriately respond to foil-plated shit being advertised to us as platinum decor. Stay away from glue on days you use your computer

    • lime!@feddit.nu · 22 hours ago

      > Most people want features like AI

      That’s just not true.

      as someone who runs local inference all the time, i think that centralized online models have no place anywhere near consumers. partly because the things they offer are trivial and offload critical skills, partly because they require insane amounts of energy, and partly because they are privacy nightmares. all things that are against moz’s stated mission. and yet here we are.

      • Vivendi@lemmy.zip · 6 hours ago

        This is the only decent reply I’ve gotten in this entire thread

        And you seem to be right; I agree with your sentiment. If the features are unneeded, then this is the wrong step to take. The best option would be for Mozilla (who are behind Ollama, I think?) to offer a locally run open model; any potato capable of running a modern browser should be able to handle a 1B–7B model

        • lime!@feddit.nu · 4 hours ago

          moz are behind llamafiles, but ollama is a separate entity.

          also, chat models are just not that useful. i’m all for their local translation models and the like, but chat is just a toy.