A new attack can extract hidden architectural details from GPT-4, PaLM, and other production models using nothing more than standard API access and roughly $20 in query costs.
1 min read · From notes.aimodels.fyi
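The source does not spell out how the attack works, but one known way such API-only extraction can succeed is a rank-based observation: a language model's final layer projects a hidden state of dimension d up to vocabulary-sized logits, so every logit vector the API returns lies in a d-dimensional subspace. Stacking enough logit vectors and counting the significant singular values then reveals the hidden dimension. The toy simulation below is a sketch of that idea under these assumptions, not a reconstruction of the actual attack; all sizes and variable names are illustrative.

```python
import numpy as np

# Assumed setup (illustrative, not the real attack): a secret projection
# matrix W maps hidden states of dimension hidden_dim to vocab_size logits.
rng = np.random.default_rng(0)
vocab_size, hidden_dim, n_queries = 1000, 64, 200

W = rng.standard_normal((vocab_size, hidden_dim))  # the model's secret final layer
H = rng.standard_normal((n_queries, hidden_dim))   # hidden states from n_queries prompts

# Each row is a full logit vector as an API might return it.
logits = H @ W.T  # shape: (n_queries, vocab_size)

# All logit vectors lie in a hidden_dim-dimensional subspace, so the
# numerical rank of the stacked matrix recovers the hidden dimension.
s = np.linalg.svd(logits, compute_uv=False)
est_dim = int((s > 1e-6 * s[0]).sum())
print(est_dim)  # recovers the hidden dimension: 64
```

The key point is that the attacker never needs weights or internals: enough full logit vectors from ordinary API queries suffice to pin down the output layer's dimensionality, which is why the estimated cost can be so low.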