Powered by Payloop — LLM Cost Intelligence

Berri AI

Tier 2 (Warm) · Unclaimed Profile

@BerriAI

The fastest way to take your LLM app to production

765 followers · 152 stars · On GitHub since 2022
founders@berri.ai · https://docs.litellm.ai/docs/

Analyzed 5h ago

Repos Analyzed: 30
With LLM Usage: 4
Total Call Sites: 10
LLM Providers: 2
Stack Complexity: moderate (28/100)
AI Providers: 2
Frameworks: 1
Architecture Overview

LLM Providers

OpenAI · AWS Bedrock

Models

gpt-4o · claude-4-sonnet

Frameworks

LiteLLM
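A stack like the one above (OpenAI and AWS Bedrock routed through the LiteLLM framework) is typically wired up with a LiteLLM proxy config. A minimal sketch, assuming a standard `config.yaml`; the deployment alias names and the Bedrock model ID are illustrative, not taken from Berri AI's repos:

```yaml
model_list:
  - model_name: gpt-4o            # alias clients call
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-4-sonnet   # alias clients call
    litellm_params:
      # Illustrative Bedrock model ID; substitute the actual one
      model: bedrock/anthropic.claude-sonnet-4-20250514-v1:0
      aws_region_name: us-east-1
```

With a config like this, callers hit one proxy endpoint and switch providers by changing only the `model` alias in the request.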

Interactive Cost Calculator

Estimate costs using published model pricing and your own volume inputs.

Model: gpt-4o

Daily API Calls: 1,000 (range 10–100K)

Avg Tokens per Request: 1,000 (range 100–10,000)

gpt-4o pricing: Input $2.50/1M tokens · Output $10.00/1M tokens

These estimates use published model pricing. Actual costs depend on your call volume, prompt length, and response size.
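The calculator's arithmetic is easy to reproduce. A minimal sketch in Python, assuming (purely as an illustration, not a published figure) that 80% of each request's tokens are input and 20% are output:

```python
# Published gpt-4o pricing (USD per 1M tokens), as shown above.
INPUT_PRICE_PER_M = 2.50
OUTPUT_PRICE_PER_M = 10.00

def daily_cost(daily_calls: int, avg_tokens_per_request: int,
               input_share: float = 0.8) -> float:
    """Estimated daily spend in USD.

    `input_share` is an assumed input/output token split; the real
    ratio depends on your prompts and response lengths.
    """
    total_tokens = daily_calls * avg_tokens_per_request
    input_tokens = total_tokens * input_share
    output_tokens = total_tokens * (1 - input_share)
    return (input_tokens / 1_000_000 * INPUT_PRICE_PER_M
            + output_tokens / 1_000_000 * OUTPUT_PRICE_PER_M)

# Calculator defaults: 1,000 calls/day at 1,000 tokens each.
print(f"${daily_cost(1_000, 1_000):.2f}/day")  # → $4.00/day
```

Note how sensitive the result is to the assumed split: with the same volume, an all-output workload would cost $10/day and an all-input workload $2.50/day.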

Repositories with LLM Usage (4)

openai-proxy · 28/100 moderate
OpenAI · AWS Bedrock · gpt-4o · claude-4-sonnet · LiteLLM
4 call sites · 0 files checked · 21

proxy_load_tester · 10/100 basic
AWS Bedrock · claude-4-sonnet
2 call sites · 0 files checked · 6

simple_proxy_openai · 18/100 basic
OpenAI · gpt-4o · LiteLLM
2 call sites · 0 files checked · 4

simple_litellm_proxy · 18/100 basic
OpenAI · gpt-4o · LiteLLM
2 call sites · 0 files checked · 2

Other repos checked (26)

litellm-pgvector, example_openai_endpoint, llm-benchmark, example_litellm_gcp_cloud_run, litellm-backstage, example_anthropic_endpoint, litellm-agent-mcp, litellm-observatory, litellm-guardrails, Automated_Perf_Tests, ci-cd-fixer-skill, fireworks-ai-cost-agent, litellm-skills, openai-proxy-ab-testing, provider-litellm, locust-load-tester, litellm-performance-benchmarks, mock-token-exchange-server, mock-oauth2-mcp-server, serxng-deployment, n8n-nodes-litellm, assistants_test_project, LiteLLM-Performance, error-logs-ui, litellm-linear-mcp, mock-mcp-server