
Logging GCS, s3 Buckets

LiteLLM supports logging LLM input/output to the following cloud buckets:

Logging Proxy Input/Output to Google Cloud Storage Buckets

Log LLM input/output to Google Cloud Storage buckets

info

✨ This is an Enterprise-only feature. Get started with Enterprise here.

Usage

  1. Add gcs_bucket to LiteLLM config.yaml
model_list:
  - model_name: fake-openai-endpoint
    litellm_params:
      model: openai/my-fake-model
      api_key: my-fake-key
      api_base: https://openai-function-calling-workers.tasslexyz.workers.dev/

litellm_settings:
  callbacks: ["gcs_bucket"] # 👈 KEY CHANGE
  2. Set the required environment variables
GCS_BUCKET_NAME="<your-gcs-bucket-name>"
GCS_PATH_SERVICE_ACCOUNT="/Users/ishaanjaffer/Downloads/adroit-crow-413218-a956eef1a2a8.json" # Add path to service account.json
  3. Start the proxy
litellm --config /path/to/config.yaml
  4. Test it!
curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Content-Type: application/json' \
--data '{
    "model": "fake-openai-endpoint",
    "messages": [
        {
            "role": "user",
            "content": "what llm are you"
        }
    ]
}'
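The same test request can be sent from Python. A minimal sketch using only the standard library; the base URL and model name are assumptions matching the config above, and the proxy must be running for `send_test_request` to succeed:

```python
import json
import urllib.request

def build_chat_body(model: str, prompt: str) -> bytes:
    """Build the JSON body for a /chat/completions request to the LiteLLM proxy."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")

def send_test_request(base_url: str = "http://0.0.0.0:4000") -> dict:
    """POST a test chat completion to the proxy (call this while the proxy is running)."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=build_chat_body("fake-openai-endpoint", "what llm are you"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Inspect the request body without needing a running proxy
print(build_chat_body("fake-openai-endpoint", "what llm are you"))
```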

Expected Logs on GCS Buckets

Fields Logged on GCS Buckets

Example payload of a /chat/completion request logged on GCS

{
    "request_kwargs": {
        "model": "gpt-3.5-turbo",
        "messages": [
            {
                "role": "user",
                "content": "This is a test"
            }
        ],
        "optional_params": {
            "temperature": 0.7,
            "max_tokens": 10,
            "user": "ishaan-2",
            "extra_body": {}
        }
    },
    "response_obj": {
        "id": "chatcmpl-bd836a8c-89bc-4abd-bee5-e3f1ebfdb541",
        "choices": [
            {
                "finish_reason": "stop",
                "index": 0,
                "message": {
                    "content": "Hi!",
                    "role": "assistant",
                    "tool_calls": null,
                    "function_call": null
                }
            }
        ],
        "created": 1722868456,
        "model": "gpt-3.5-turbo",
        "object": "chat.completion",
        "system_fingerprint": null,
        "usage": {
            "prompt_tokens": 10,
            "completion_tokens": 20,
            "total_tokens": 30
        }
    },
    "start_time": "2024-08-05 07:34:16",
    "end_time": "2024-08-05 07:34:16"
}
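Because each log object is plain JSON, a small script can roll the logged fields up for usage reporting. A minimal standard-library sketch; the `LOG_OBJECT` below is an abridged copy of the example payload above, and `summarize` is an illustrative helper, not part of LiteLLM:

```python
import json

# Abridged copy of the example GCS log object shown above
LOG_OBJECT = """
{
    "request_kwargs": {"model": "gpt-3.5-turbo"},
    "response_obj": {
        "model": "gpt-3.5-turbo",
        "usage": {"prompt_tokens": 10, "completion_tokens": 20, "total_tokens": 30}
    },
    "start_time": "2024-08-05 07:34:16",
    "end_time": "2024-08-05 07:34:16"
}
"""

def summarize(payload: dict) -> dict:
    """Extract the fields most useful for per-model usage reporting."""
    usage = payload["response_obj"]["usage"]
    return {
        "model": payload["response_obj"]["model"],
        "prompt_tokens": usage["prompt_tokens"],
        "completion_tokens": usage["completion_tokens"],
        "total_tokens": usage["total_tokens"],
    }

summary = summarize(json.loads(LOG_OBJECT))
print(summary)  # total_tokens should be 30 for the example above
```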

Getting service_account.json from Google Cloud Console

  1. Go to Google Cloud Console
  2. Search for IAM & Admin
  3. Click on Service Accounts
  4. Select a Service Account
  5. Click on 'Keys' -> Add Key -> Create New Key -> JSON
  6. Save the JSON file and add the path to GCS_PATH_SERVICE_ACCOUNT
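Before pointing GCS_PATH_SERVICE_ACCOUNT at the downloaded file, a quick sanity check can catch a truncated or wrong file. A minimal standard-library sketch; the key names checked are standard fields Google includes in a service-account key file, and `check_service_account` is an illustrative helper:

```python
import json

# Standard fields present in a Google service-account key file
REQUIRED_KEYS = {"type", "project_id", "private_key", "client_email", "token_uri"}

def check_service_account(raw_json: str) -> list:
    """Return the missing required fields ([] means the file looks usable)."""
    data = json.loads(raw_json)
    missing = sorted(REQUIRED_KEYS - data.keys())
    return missing

# Hypothetical truncated key file, for illustration only
bad_file = '{"type": "service_account", "project_id": "my-project"}'
print(check_service_account(bad_file))  # lists the fields the file is missing
```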

Logging Proxy Input/Output - s3 Buckets

We will use the --config flag to set

  • litellm.success_callback = ["s3"]

This will log all successful LLM calls to your s3 bucket.

Step 1: Set AWS credentials in .env

AWS_ACCESS_KEY_ID = ""
AWS_SECRET_ACCESS_KEY = ""
AWS_REGION_NAME = ""

Step 2: Create a config.yaml file and set litellm_settings: success_callback

model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo

litellm_settings:
  success_callback: ["s3"]
  s3_callback_params:
    s3_bucket_name: logs-bucket-litellm # AWS bucket name for S3
    s3_region_name: us-west-2 # AWS region name for S3
    s3_aws_access_key_id: os.environ/AWS_ACCESS_KEY_ID # use os.environ/<variable name> to pass environment variables. This is the AWS Access Key ID for S3
    s3_aws_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY # AWS Secret Access Key for S3
    s3_path: my-test-path # [OPTIONAL] set the path in the bucket you want to write logs to
    s3_endpoint_url: https://s3.amazonaws.com # [OPTIONAL] S3 endpoint URL, if you want to use Backblaze/Cloudflare S3 buckets
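The os.environ/&lt;variable name&gt; convention in the config can be illustrated with a small resolver. A minimal sketch that mimics, rather than imports, LiteLLM's own resolution logic; the dict shape mirrors s3_callback_params above, and the env value is set here purely for illustration:

```python
import os

def resolve_env_refs(params: dict) -> dict:
    """Replace LiteLLM-style 'os.environ/VAR' values with the actual env values."""
    resolved = {}
    for key, value in params.items():
        if isinstance(value, str) and value.startswith("os.environ/"):
            var_name = value.split("/", 1)[1]
            resolved[key] = os.environ.get(var_name, "")
        else:
            resolved[key] = value
    return resolved

os.environ["AWS_ACCESS_KEY_ID"] = "example-access-key"  # illustration only
s3_callback_params = {
    "s3_bucket_name": "logs-bucket-litellm",
    "s3_aws_access_key_id": "os.environ/AWS_ACCESS_KEY_ID",
}
print(resolve_env_refs(s3_callback_params))
```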

Step 3: Start the proxy, make a test request

Start proxy

litellm --config config.yaml --debug

Test Request

curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Content-Type: application/json' \
--data '{
    "model": "Azure OpenAI GPT-4 East",
    "messages": [
        {
            "role": "user",
            "content": "what llm are you"
        }
    ]
}'

Your logs should now be available in the specified s3 bucket.