misk@sopuli.xyz to Technology@lemmy.world · English · 11 months ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
233 comments
merc@sh.itjust.works · English · 11 months ago
Are you lost? This is ChatGPT, not Google. Also, it’s “their”.
WilliamTheWicked@lemmy.world · English · 11 months ago
Did you even read the explanation part of the article??? Thanks for the grammar correction while ignoring literally all context, though. You certainly put me in my place, milord.
kromem@lemmy.world · English · 11 months ago
What’s your beef with Google researchers probing the safety mechanisms of the SotA model? How was that evil?
andrai@feddit.de · English · 11 months ago
Now that Google spilled the beans, WilliamTheWicked can no longer extract contact information of females from the ChatGPT training data.