• Günther Unlustig 🍄@slrpnk.net · 2 months ago

    While I definitely do not want an LLM (especially not OpenAI's or whatever) to have access to my terminal or other stuff on my PC, and in general don't have any use for that, I find it cool that something like this is available now.

    Remember, it’s totally optional and nobody forces you to download that stuff. You have the choice to ignore it, and that’s the great thing about Linux!

  • Mwa@thelemmy.club · 2 months ago

    From the title I thought the GNOME Foundation had made an AI client for a sec, until I read the article.

  • IHave69XiBucks@lemmygrad.ml · 2 months ago

    Idk why people don’t read the article before commenting.

    Newelle supports interfacing with the Google Gemini API, the OpenAI API, Groq, and also local large language models (LLMs) or ollama instances for powering this AI assistant.

    So you configure it with your preferred model, which can include a locally run one. And it seems to be its own package, not something built into GNOME itself, so you can easily uninstall it if you won't use it.
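To make the "locally run one" option concrete: Ollama exposes a small REST API that clients like this can point at. A minimal sketch, assuming Ollama's standard `/api/generate` endpoint on its default port 11434 (the model tag and prompt are illustrative):

```python
import json
import urllib.request

def build_ollama_request(prompt, model="llama3", host="http://localhost:11434"):
    """Build a POST request for a local Ollama instance's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With an Ollama instance actually running, you would send it like this:
# with urllib.request.urlopen(build_ollama_request("Hello")) as resp:
#     print(json.loads(resp.read())["response"])
```

Since everything goes to localhost, no prompt text ever leaves the machine, which is the whole appeal over the hosted APIs.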

    Seems fine to me. I probably won’t be using it, but it’s an interesting idea. Being able to run terminal commands seems risky though. What if the AI bricks my system? Hopefully they make you confirm every command before it runs any of them or something.
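The confirmation step the comment hopes for is easy to picture as a gate in front of the shell. This is a hypothetical pattern, not Newelle's actual implementation; `run_with_confirmation` and its prompt text are made up for illustration:

```python
import subprocess

def run_with_confirmation(command, ask=input):
    """Show an AI-proposed shell command and run it only after explicit approval."""
    answer = ask(f"Assistant wants to run: {command!r} : proceed? [y/N] ")
    if answer.strip().lower() != "y":
        return None  # rejected: nothing is executed
    return subprocess.run(command, shell=True, capture_output=True, text=True)
```

Defaulting to "no" on anything but an explicit "y" is the important design choice: a model that bricks your system should have to get past a human first.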

  • just_another_person@lemmy.world · 2 months ago

    Or, ORRRR…just do the stuff yourself and don't further perpetuate this dumbshit until it doesn't require an entire month's worth of energy for an efficient home to run to search “Hentai Alien Tentacle Porn” for you.

    Buncha savages.

  • nymnympseudonym@lemmy.world · 2 months ago

    I haven't tested this, but TBH, as someone who has run Linux at home for 25 years, I love the idea of an always-alert sysadmin keeping my machine maintained and configured to my specs. Keep my IDS up to date. And so on.

    Two requirements:

    1. Be an open-source local model with no telemetry

    2. Let me review proposed changes to my system and explain why they should be made
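The second requirement, reviewing proposed changes with a rationale, could look something like the sketch below. The `ProposedChange` structure and `review` loop are hypothetical, just one way a "sysadmin assistant" could be forced to show its work before touching anything:

```python
from dataclasses import dataclass

@dataclass
class ProposedChange:
    path: str       # file the assistant wants to edit
    diff: str       # unified diff of the proposed edit
    rationale: str  # why the model thinks the change is needed

def review(changes, approve):
    """Show each proposed change with its rationale; return only the approved ones."""
    approved = []
    for c in changes:
        print(f"{c.path}:\n{c.diff}\nReason: {c.rationale}")
        if approve(c):
            approved.append(c)
    return approved
```

Nothing is applied directly; the assistant only proposes, and the human decides which subset actually lands.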

    • shiroininja@lemmy.world · 2 months ago

      Like what do you need to keep configured? lol Linux is set it and forget it. I've had installs be fine from day one to year 7. It's not like Windows, where Microsoft is constantly changing things and changing your settings. Like it takes minimum effort to keep a Linux server/system going after initial configuration.

  • DonutsRMeh@lemmy.world · 2 months ago

    For some reason, these local LLMs are straight up stupid. I tried DeepSeek R1 through Ollama and it was straight up stupid and got everything wrong. Anyone got the same results? I did the 7b and 14b (if I remember those numbers correctly); the 32b straight up didn't install because I didn't have enough RAM.

    • felsiq@piefed.zip · 2 months ago

      Did you use a heavily quantized version? Those models are much smaller than the state-of-the-art ones to begin with, and if you chop their weights from 16-bit floats down to 2-bit values, it reduces their capabilities a lot more.

    • teawrecks@sopuli.xyz · 2 months ago

      The performance is relative to the user. Could it be that you’re a god damned genius? :/