AI security and a16z: Crawling with badness
Major security issues revealed in audit of a16z reference LLM architecture
You might want to know what 🕷️security pests🕷️ the reference chatbot architecture from a16z is harboring before you use it… and what it means for the state of AI security!
Background and core concepts
Ask Astro is a chatbot that provides an end-to-end example of a Q&A LLM application, answering questions about Apache Airflow and Astronomer, the company behind it. To keep the responses as factual and accurate as possible, it’s generally best practice to use Retrieval Augmented Generation (RAG), which is exactly what Ask Astro does.
RAG is a handy technique for forcing your AI system to answer from a knowledge base of your choosing: instead of leaning on whatever the model memorized during training, the system retrieves relevant documents and feeds them to the LLM alongside the user’s question.
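To make that concrete, here’s a deliberately tiny sketch of the RAG flow. The knowledge base, the keyword-overlap retrieval, and the llm_answer placeholder are our own illustrative stand-ins, not Ask Astro’s actual implementation (a real system would use vector embeddings and a model API):

```python
# Minimal RAG sketch: retrieve relevant documents from a fixed knowledge
# base, then hand them to the LLM along with the user's question.
# Everything here is illustrative, not Ask Astro's real pipeline.

KNOWLEDGE_BASE = [
    "Apache Airflow schedules and monitors workflows defined as DAGs.",
    "Astronomer provides a managed platform, Astro, for running Airflow.",
    "Airflow tasks are defined with operators and arranged into DAGs.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def llm_answer(question: str, context: list[str]) -> str:
    """Build the augmented prompt; a real app would send this to a model API."""
    prompt = (
        "Answer using only the context below.\n\n"
        "Context:\n" + "\n".join(f"- {doc}" for doc in context) +
        f"\n\nQuestion: {question}\nAnswer:"
    )
    return prompt  # in a real app: return model.generate(prompt)

if __name__ == "__main__":
    question = "What does Apache Airflow do?"
    docs = retrieve(question)
    print(llm_answer(question, docs))
```

The security-relevant part of this pattern is exactly the retrieval step: whatever documents end up in the context get treated as trusted input by the model.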
It’s okay if you’ve never heard of Ask Astro or RAG, since that’s not the point.
The point is that you’ve likely heard of a16z. If you haven’t, you’re probably not in tech: a16z (short for Andreessen Horowitz) is a hallowed VC firm with a massive AI portfolio and a sterling reputation for providing some of the most solid startup resources in existence. So if these guys tell startups “here’s how you should build it,” startups are likely to listen.
Ask Astro reveals a lot about best-in-class AI “security” because it’s based on a16z’s blueprints.