Step 4: LiteLLM Proxy Model Configuration

The core of the LiteLLM Proxy configuration is the model list, which defines the LLM models the proxy will manage. Sample configuration files are provided in the Helm chart for each major cloud provider.

Configuration File Locations

  • AWS Bedrock: litellm/config/litellm-aws-config.yaml
  • Azure OpenAI: litellm/config/litellm-azure-config.yaml
  • Google Vertex AI: litellm/config/litellm-gcp-config.yaml
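
All three sample files share the same `model_list` schema: each entry maps a client-facing alias (`model_name`) to the provider-specific settings LiteLLM needs (`litellm_params`). The snippet below is an illustrative Bedrock-style entry; the model IDs, alias names, and region are placeholders, not values taken from the chart's sample files:

```yaml
model_list:
  # Alias that clients use when calling the proxy
  - model_name: claude-3-sonnet
    litellm_params:
      # Provider-prefixed model ID understood by LiteLLM
      model: bedrock/anthropic.claude-3-sonnet-20240229-v1:0
      aws_region_name: us-east-1
  - model_name: titan-embed
    litellm_params:
      model: bedrock/amazon.titan-embed-text-v2:0
      aws_region_name: us-east-1
```

The Azure and Vertex AI sample files follow the same structure, differing only in the provider prefix on `model` and in the credential-related fields under `litellm_params`.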

Model Configuration

For detailed model configuration examples and provider-specific settings, see LiteLLM Model Configuration.

Next Steps

Continue to Deployment.