@BerriAI
The fastest way to take your LLM app to production
An example OpenAI /chat/completions endpoint
A list of good LLM benchmarks
Example Repo to deploy LiteLLM Proxy (AI Gateway) on GCP Cloud Run
An example Anthropic API endpoint
MCP server giving AI agents access to 100+ LLMs through LiteLLM
End-to-end testing suite for LiteLLM deployments - provider tests, performance metrics, and API validation
Registry of public custom code guardrails for the LiteLLM proxy server
Skill for AI agents to diagnose and fix CI/CD failures
Agent Skills for managing live LiteLLM proxy deployments — users, teams, keys, orgs, models, MCP servers, agents
A Crossplane provider for the LiteLLM Gateway (Proxy)
A reproducible benchmarking suite for measuring LiteLLM latency, throughput, and scalability under real-world workloads.
Mock OAuth 2.0 Token Exchange Server (RFC 8693) for testing LiteLLM OBO flow
A mock OAuth2 + MCP (Model Context Protocol) server for testing client_credentials flows. Useful for E2E testing LiteLLM proxy MCP OAuth2 M2M authentication.
A test project using the OpenAI Assistants API
Performance tracking and benchmarks for LiteLLM.
Mock MCP Server (MCP SDK) for testing LiteLLM OBO token exchange
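Several of the repositories above expose or test an OpenAI-compatible /chat/completions endpoint behind the LiteLLM proxy. A minimal sketch of the request body such an endpoint accepts (the model name and message content here are illustrative assumptions, not taken from any of these repos):

```python
import json

def build_chat_request(model: str, user_message: str) -> dict:
    """Build the JSON body for a POST to an OpenAI-compatible
    /chat/completions endpoint, e.g. one served by a LiteLLM proxy."""
    return {
        "model": model,  # assumed model name; the proxy maps it to a provider
        "messages": [
            {"role": "user", "content": user_message},
        ],
    }

# Example: serialize a request body for inspection.
body = build_chat_request("gpt-4o", "Hello")
print(json.dumps(body, indent=2))
```

The proxy forwards this body to whichever backend the configured model name routes to, which is why the same request shape works across the OpenAI and Anthropic example endpoints listed above.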