Powered by Payloop — LLM Cost Intelligence

Berri AI (@BerriAI)

Tier 2 (Warm) · Unclaimed Profile

The fastest way to take your LLM app to production

765 followers · 152 stars · On GitHub since 2022
founders@berri.ai · https://docs.litellm.ai/docs/

Analyzed 3d ago

Showing 30 of 30 repositories

litellm-pgvector
Python · 57

openai-proxy
AI score 28/100 · Python · 214 call sites
OpenAI · AWS Bedrock · gpt-4o · claude-4-sonnet · LiteLLM
Est. $80/mo

example_openai_endpoint

An example OpenAI /chat/completions endpoint

Python · 11
llm-benchmark

List of good LLM Benchmarks

8
example_litellm_gcp_cloud_run

Example Repo to deploy LiteLLM Proxy (AI Gateway) on GCP Cloud Run

Dockerfile · 7
proxy_load_tester
AI score 10/100 · Python · 62 call sites
AWS Bedrock · claude-4-sonnet
Est. $94/mo

simple_proxy_openai
AI score 18/100 · Python · 42 call sites
OpenAI · gpt-4o · LiteLLM
Est. $66/mo

litellm-backstage
TypeScript · 4
example_anthropic_endpoint

An example Anthropic API endpoint

Python · 4
litellm-agent-mcp

MCP server giving AI agents access to 100+ LLMs through LiteLLM

Python · 3
litellm-observatory

End-to-end testing suite for LiteLLM deployments - provider tests, performance metrics, and API validation

Python · 3
litellm-guardrails

Registry of public custom code guardrails for the litellm proxy server

Svelte · 3
Automated_Perf_Tests
Python · 3
ci-cd-fixer-skill

Skill for AI agents to diagnose and fix CI/CD failures

3
simple_litellm_proxy
AI score 18/100 · Python · 22 call sites
OpenAI · gpt-4o · LiteLLM
Est. $66/mo

fireworks-ai-cost-agent
Python · 2
litellm-skills

Agent Skills for managing live LiteLLM proxy deployments — users, teams, keys, orgs, models, MCP servers, agents

Shell · 2
openai-proxy-ab-testing
Python · 2
provider-litellm

LiteLLM Gateway (Proxy) crossplane provider

Go · 2
locust-load-tester
Python · 1
litellm-performance-benchmarks

A reproducible benchmarking suite for measuring LiteLLM latency, throughput, and scalability under real-world workloads.

Python · 1
mock-token-exchange-server

Mock OAuth 2.0 Token Exchange Server (RFC 8693) for testing LiteLLM OBO flow

Python · 1
mock-oauth2-mcp-server

A mock OAuth2 + MCP (Model Context Protocol) server for testing client_credentials flows. Useful for E2E testing LiteLLM proxy MCP OAuth2 M2M authentication.

Python · 1
serxng-deployment
Dockerfile · 1
n8n-nodes-litellm
assistants_test_project

Test project using the OpenAI Assistants API

LiteLLM-Performance

Performance tracking and benchmarks for LiteLLM.

error-logs-ui
JavaScript
litellm-linear-mcp
TypeScript
mock-mcp-server

Mock MCP Server (MCP SDK) for testing LiteLLM OBO token exchange

Python
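Several of the repositories above (openai-proxy, simple_proxy_openai, simple_litellm_proxy, example_litellm_gcp_cloud_run) center on the LiteLLM proxy, an OpenAI-compatible gateway placed in front of providers such as OpenAI and AWS Bedrock. As a rough sketch of that pattern, a minimal proxy config.yaml might look like the following; the model aliases match the models tagged on this profile, but the Bedrock model ID is a placeholder, not a value taken from any of these repos:

```yaml
# Hypothetical LiteLLM proxy config.yaml, not from the listed repos.
model_list:
  # Route requests for the alias "gpt-4o" to OpenAI.
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  # Route requests for the alias "claude-4-sonnet" to AWS Bedrock.
  - model_name: claude-4-sonnet
    litellm_params:
      model: bedrock/<claude-sonnet-model-id>  # placeholder Bedrock model ID
      aws_region_name: us-east-1
```

Any OpenAI-compatible client pointed at the proxy's base URL can then send gpt-4o and claude-4-sonnet traffic through the single gateway.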