RSS Bot@lemmy.bestiver.se to Hacker News@lemmy.bestiver.se, English · 2 months ago
AI Error May Have Contributed to Girl's School Bombing in Iran (thisweekinworcester.com)
28 comments
Basic Glitch@sh.itjust.works, English · 2 months ago

> when talking about a LLM making someone go off the rails or killing themselves

The warning would be for LLMs/chatbots that make people kill themselves. Automated killing systems (like Lavender) are the use of technology as a weapon of mass destruction. They work as intended, and the people who created, enabled, and used them should be held accountable.
NihilsineNefas@slrpnk.net, English · 2 months ago

No arguments from me, even if the companies developing these programs are one and the same.