Aurion Docs
API

Get LLM Models


GET
/api/v1/configuration/llm-models

Get available models for an LLM provider.

Story 4.5e (Dynamic Model List Fetching): fetches models from provider APIs with 5-minute Redis caching.

Note: Uses /llm-models instead of /llm/models to avoid route conflict with /llm/{config_id} path parameter matching.

Returns: `{ models: [{ id: string, name: string, created?: number | string }] }`
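The 5-minute caching described above can be sketched as a read-through cache keyed by provider. This is a minimal in-process stand-in for the documented Redis cache (the real implementation is not shown in this doc); `fetch` is a placeholder for the actual provider API call.

```python
import time

CACHE_TTL = 300  # 5 minutes, matching the documented Redis TTL

# provider -> (timestamp, models); a dict stands in for Redis here
_cache: dict[str, tuple[float, list[dict]]] = {}


def get_models(provider: str, fetch) -> list[dict]:
    """Return cached models for `provider`, refetching after CACHE_TTL seconds.

    `fetch` is a hypothetical callable that hits the provider's API and
    returns a list of model dicts.
    """
    now = time.monotonic()
    entry = _cache.get(provider)
    if entry is not None and now - entry[0] < CACHE_TTL:
        return entry[1]  # cache hit: skip the provider API call
    models = fetch(provider)
    _cache[provider] = (now, models)
    return models
```

Within the TTL window, repeated calls for the same provider return the cached list without touching the provider API.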

Authorization: Bearer <token>

In: header

Query Parameters

provider (string, required)

LLM provider: claude, gpt-4, gpt-5, deepseek, or groq

Match: `^(claude|gpt-4|gpt-5|deepseek|groq)$`
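The pattern above is anchored, so only the five exact provider names are accepted. A quick client-side pre-check using the same regex:

```python
import re

# Same anchored pattern as the documented query-parameter constraint
PROVIDER_PATTERN = re.compile(r"^(claude|gpt-4|gpt-5|deepseek|groq)$")


def is_valid_provider(provider: str) -> bool:
    """True if `provider` would pass the endpoint's pattern validation."""
    return PROVIDER_PATTERN.match(provider) is not None
```

Values that merely contain a valid name (e.g. `gpt-4-turbo`) fail validation, as do case variants.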

Response Body

application/json

curl -X GET "https://<your-host>/api/v1/configuration/llm-models?provider=claude" \
  -H "Authorization: Bearer <token>"
Success response (shape from Returns above; field values are placeholders):

{
  "models": [
    {
      "id": "string",
      "name": "string",
      "created": 0
    }
  ]
}

Validation error response:

{
  "detail": [
    {
      "loc": [
        "string"
      ],
      "msg": "string",
      "type": "string"
    }
  ]
}
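Putting the pieces together, a client request needs the documented path, the `provider` query parameter, and the Bearer header. A minimal sketch using only the standard library; `base_url` and `token` are deployment-specific placeholders, not values defined by this doc:

```python
import json
import urllib.parse
import urllib.request


def build_models_request(base_url: str, token: str, provider: str) -> urllib.request.Request:
    """Build a GET request for /api/v1/configuration/llm-models."""
    query = urllib.parse.urlencode({"provider": provider})
    return urllib.request.Request(
        f"{base_url}/api/v1/configuration/llm-models?{query}",
        headers={"Authorization": f"Bearer {token}"},
    )


def fetch_llm_models(base_url: str, token: str, provider: str) -> list[dict]:
    """Call the endpoint and return the `models` array from the response."""
    req = build_models_request(base_url, token, provider)
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return payload["models"]
```

An invalid `provider` value is rejected by the pattern constraint and produces the validation error shape shown above.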