LLM Keys - A Proposed Solution to Prompt Injection Attacks
Disclaimer: This is only an untested rough sketch of a solution, which I believe should work. I'm posting it mostly to crowdsource reasons why it wouldn't. It was motivated by Amjad Masad and Zvi conjecturing that prompt injection might be fundamentally unsolvable.

The situation

* we, as LLM creators, want to have the...