Psst … wanna jailbreak ChatGPT? Thousands of malicious prompts for sale
Thursday, 25 January 2024, 12:01, by TheRegister
Turns out it's pretty easy to make the model jump its own guardrails
Criminals are getting increasingly adept at crafting malicious AI prompts to get data out of ChatGPT, according to Kaspersky, which spotted 249 of these being offered for sale online during 2023.…
https://go.theregister.com/feed/www.theregister.com/2024/01/25/dark_web_chatgpt/