Researchers used basic psychology to convince ChatGPT to do things it normally wouldn’t.

  • nymnympseudonym@lemmy.world · 7 days ago

    > we think it’s just words, but our brain will seamlessly weave inner monologue into concepts

    Are you familiar with latent space representation?

    Because yes, that’s how LRMs work: they cycle tokens through latent space multiple times before sending them to the upper layers and decoding them into human words.

    https://arxiv.org/abs/2412.06769
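
    Rough sketch of the mechanism (not the paper’s code; just vanilla GPT-2 via HuggingFace, and the prompt and step count are made up): the last hidden state is fed straight back in as the next input embedding instead of being decoded to a token, so the “thinking” stays in latent space until the very end.

    ```python
    # Sketch of latent-space reasoning ("continuous thought"):
    # recycle the final hidden state as the next input embedding
    # instead of decoding a token at every step.
    import torch
    from transformers import AutoTokenizer, AutoModel

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModel.from_pretrained("gpt2").eval()

    ids = tok("2 + 2 =", return_tensors="pt").input_ids
    embeds = model.get_input_embeddings()(ids)          # (1, seq_len, 768)

    with torch.no_grad():
        for _ in range(4):                               # a few latent "thought" steps
            out = model(inputs_embeds=embeds)
            last = out.last_hidden_state[:, -1:, :]      # final position's hidden state
            # feed it back as the next input embedding -- no token decoded yet
            embeds = torch.cat([embeds, last], dim=1)

    # only at the end project back to vocabulary space (GPT-2 ties input/output embeddings)
    logits = out.last_hidden_state[0, -1] @ model.get_input_embeddings().weight.T
    print(tok.decode([int(logits.argmax())]))
    ```

    The paper trains the model so those recycled hidden states become useful reasoning steps; plain GPT-2 hasn’t been, so the output here is only meant to show the plumbing.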