We Added PII Masking to Our AI Stack. Here's Exactly What Happened.
Presidio and LiteLLM, deployed as a PII masking layer on an Ollama stack: every undocumented env var, every silent failure, and the one-liner that proves the DLP never fired on real traffic. Six confirmed findings, zero service failures.