Kid@sh.itjust.works to Cybersecurity@sh.itjust.works · English · 14 days ago
Time Bandit ChatGPT jailbreak bypasses safeguards on sensitive topics
www.bleepingcomputer.com
andrewth09@lemmy.world · 13 days ago
I don’t understand why the researcher needed to contact the FBI to report this; just drop it in BugCrowd and call it a day. It’s a ChatGPT jailbreak, not a Debian zero-day.