
Psst … wanna jailbreak ChatGPT? Thousands of malicious prompts for sale

Thursday, 25 January 2024, 12:01, by TheRegister
Turns out it's pretty easy to make the model jump its own guardrails
Criminals are getting increasingly adept at crafting malicious AI prompts to extract data from ChatGPT, according to Kaspersky, which spotted 249 such prompts offered for sale online during 2023.…
https://go.theregister.com/feed/www.theregister.com/2024/01/25/dark_web_chatgpt/
