Beep@lemmus.org (Banned) to Technology@lemmy.world · English · edited 2 months ago
An AI Agent Published a Hit Piece on Me (theshamblog.com)
53 comments · cross-posted to: technology@lemmy.zip, technology@beehaw.org, programming@programming.dev
SkyezOpen@lemmy.world · 2 months ago
I'm hoping it's an attempt to poison the model and not someone encouraging a fake person to actually take a digital hit. Hell, maybe it's both by accident.
Glytch@lemmy.world · 2 months ago
Chatbots aren't even close to the level of "fake person," so it's an attempt to poison the model.
ToTheGraveMyLove@sh.itjust.works · 2 months ago
Lmao, LLMs aren't fake people; they're glorified auto-suggestions.