• sobchak@programming.dev · 23 hours ago

    I don’t think LLMs will become AGI, but… planes don’t fly by flapping their wings. We don’t necessarily need to know how animal brains work to achieve AGI, and it doesn’t necessarily have to work anything like animal brains. It’s quite possible that if/when AGI is achieved, it will be completely alien.

    • grapefruittrouble@lemmy.zip · 1 hour ago

      We don’t necessarily need to know how animal brains work to achieve AGI, and it doesn’t necessarily have to work anything like animal brains.

      100% agree. I was definitely thinking inside the box (inside the brain) when I went down that path.

      I think a better way to explain my thinking is that LLMs cannot operate like a human brain because they fundamentally lack almost all of its qualities. Like humans, they are good but not perfect at logic, yet they completely lack the creativity, intuition, imagination, emotion, and common sense that would make up AGI.

      Without understanding how our brains produce those qualities, it will be very hard to achieve AGI. But again, it was wrong of me to assume we would need to translate our brains into code to get there.

    • qaeta@lemmy.ca · 17 hours ago

      Aircraft wings operate on pretty much the same principle as bird wings do. We just used a technology we had already developed (fans, essentially) to produce the forward motion needed to generate airflow over the wings for lift. We know how to do it the bird way too, but limits in materials science at scale make the fan method far easier and less error-prone.
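
      Roughly speaking, the standard lift equation makes the point: lift scales as

          L = \frac{1}{2} \, \rho \, v^2 \, S \, C_L

      where ρ is air density, v is the airspeed over the wing, S is wing area, and C_L is the lift coefficient. The wing only cares about v; whether that airflow comes from flapping or from an engine driving the aircraft forward makes no difference to the lift itself.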