'Skeleton Key' attack unlocks the worst of AI, says Microsoft
Friday, 28 June 2024, 08:38, by TheRegister
Simple jailbreak prompt can bypass safety guardrails on major models
Microsoft on Thursday published details about Skeleton Key – a technique that bypasses the guardrails used by makers of AI models to prevent their generative chatbots from creating harmful content…
https://go.theregister.com/feed/www.theregister.com/2024/06/28/microsoft_skeleton_key_ai_attack/