PII Masking
LiteLLM supports Microsoft Presidio for PII masking.
Quick Start
Step 1. Add env

```shell
export PRESIDIO_ANALYZER_API_BASE="http://localhost:5002"
export PRESIDIO_ANONYMIZER_API_BASE="http://localhost:5001"
```
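If you don't have Presidio running yet, one way to start it locally is with Microsoft's official Docker images. The host ports below are an assumption chosen to match the env vars above (the containers listen on port 3000 internally):

```shell
# assumed setup: map analyzer to 5002 and anonymizer to 5001 on the host
docker run -d -p 5002:3000 mcr.microsoft.com/presidio-analyzer:latest
docker run -d -p 5001:3000 mcr.microsoft.com/presidio-anonymizer:latest
```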
Step 2. Set it as a callback in config.yaml

```yaml
litellm_settings:
  callbacks: ["presidio", ...] # e.g. ["presidio", custom_callbacks.proxy_handler_instance]
```
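For context, a minimal end-to-end config.yaml might look like the sketch below; the model name and key reference are placeholders, not a prescribed setup:

```yaml
model_list:
  - model_name: gpt-3.5-turbo        # placeholder model alias
    litellm_params:
      model: openai/gpt-3.5-turbo
      api_key: os.environ/OPENAI_API_KEY

litellm_settings:
  callbacks: ["presidio"]
```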
Step 3. Start proxy

```shell
litellm --config /path/to/config.yaml
```
This will mask the input going to the LLM provider.
Output parsing
LLM responses can sometimes contain the masked tokens.

For Presidio 'replace' operations, LiteLLM can check the LLM response and replace the masked tokens with the user-submitted values. Set `litellm.output_parse_pii = True` to enable this:

```yaml
litellm_settings:
  output_parse_pii: true
```
Expected Flow:

1. User Input: "hello world, my name is Jane Doe. My number is: 034453334"
2. LLM Input: "hello world, my name is [PERSON]. My number is: [PHONE_NUMBER]"
3. LLM Response: "Hey [PERSON], nice to meet you!"
4. User Response: "Hey Jane Doe, nice to meet you!"
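The flow above can be sketched in plain Python. This is an illustrative toy, not LiteLLM's or Presidio's actual implementation: the regexes below stand in for Presidio's entity detection, and the reverse map shows conceptually what `output_parse_pii` does when it restores user-submitted values in the response.

```python
import re

def mask_pii(text):
    """Toy stand-in for Presidio's 'replace' anonymizer: swap simple
    phone/name patterns for placeholder tokens, keeping a reverse map."""
    mapping = {}
    # naive phone detection: runs of 9+ digits (illustrative only)
    for match in re.findall(r"\b\d{9,}\b", text):
        mapping["[PHONE_NUMBER]"] = match
        text = text.replace(match, "[PHONE_NUMBER]")
    # naive person detection: "my name is First Last" (illustrative only)
    m = re.search(r"my name is ([A-Z][a-z]+ [A-Z][a-z]+)", text)
    if m:
        mapping["[PERSON]"] = m.group(1)
        text = text.replace(m.group(1), "[PERSON]")
    return text, mapping

def unmask_pii(response, mapping):
    """Conceptually what output_parse_pii does: substitute the
    user-submitted values back into the LLM's response."""
    for token, value in mapping.items():
        response = response.replace(token, value)
    return response

user_input = "hello world, my name is Jane Doe. My number is: 034453334"
llm_input, mapping = mask_pii(user_input)
# llm_input: "hello world, my name is [PERSON]. My number is: [PHONE_NUMBER]"

llm_response = "Hey [PERSON], nice to meet you!"
print(unmask_pii(llm_response, mapping))  # Hey Jane Doe, nice to meet you!
```

In the real proxy the token-to-value mapping is maintained per request by the Presidio callback; this sketch only shows the round trip.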