Use Any LLM Across Multiple Platforms With Scorecard’s AI Proxy

Scorecard uses LiteLLM, an open-source AI proxy, to let you manage multiple LLM APIs and easily swap between AI models.

LiteLLM AI Proxy

Simplified LLM Management

The AI Proxy allows you to manage multiple LLM APIs using a standardized format. This makes it easy to switch between different platforms without rewriting your code.

  • Unified API: Call over 100 different LLMs across Azure, OpenAI, Cohere, Anthropic, and Replicate using the same API format.
  • Load Balancing and Fallbacks: Ensure high availability and efficient utilization of different LLM services with built-in load balancing and fallback mechanisms.
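To make the unified format concrete, here is a minimal sketch of the idea: every provider is called with the same OpenAI-style request shape, and a simple loop illustrates the kind of fallback ordering the proxy handles for you. The proxy URL, model names, and the `send` callable are illustrative placeholders, not part of Scorecard's or LiteLLM's actual API.

```python
# Sketch: one request format for every backend, plus a minimal
# fallback loop. The proxy address and model names are hypothetical.
PROXY_URL = "http://localhost:4000/chat/completions"  # placeholder

def build_request(model: str, prompt: str) -> dict:
    """Build the same OpenAI-compatible request shape for any provider."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def complete_with_fallbacks(send, prompt: str, models: list) -> str:
    """Try each model in order; return the first successful response.

    `send` stands in for any callable that posts a request dict to the
    proxy and returns the completion text, raising on failure.
    """
    last_error = None
    for model in models:
        try:
            return send(build_request(model, prompt))
        except Exception as exc:  # a real client would catch narrower errors
            last_error = exc
    raise RuntimeError(f"all models failed: {last_error}")
```

Swapping providers is then a one-string change, e.g. `build_request("claude-3-haiku", ...)` instead of `build_request("gpt-4o", ...)`, while the request shape and calling code stay identical.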

Full Control With Scorecard Self-Hosting Option

Want full control over your LLM management? Scorecard supports self-hosting the LiteLLM AI Proxy, giving you the flexibility to run the implementation according to your specific needs. Send us an email and we'd be happy to discuss!