get__ai_llm_stats

⚠️ TEST ENVIRONMENT ONLY

This API endpoint is currently available only in the TEST environment. It is not yet available in production.

GET /ai/llm/stats

Get AI LLM Statistics

Retrieve usage statistics for AI/LLM configurations. This endpoint provides comprehensive metrics about API usage, token consumption, and policy enforcement.

Query Filtering:

  • Optionally filter by aiLlmId to get statistics for a specific configuration

  • Without filters, returns aggregated statistics across all your accessible configurations

Metrics Provided:

  • Total API Calls: Number of API requests made to the AI provider

  • Total Tokens In: Tokens consumed in requests

  • Total Tokens Out: Tokens generated in responses

  • Total LLM Tokens In: Language model-specific input tokens

  • Total LLM Tokens Out: Language model-specific output tokens

  • Policy Violations: Count of policy violations detected

  • Alerts: Number of alerts triggered

  • Blocks: Number of blocked requests

Use Cases:

  • Monitor AI usage and costs

  • Track compliance with AI policies

  • Analyze token consumption patterns

  • Identify potential issues or violations
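The token counters returned by this endpoint can feed a simple cost estimate. A minimal sketch, assuming illustrative per-token prices (substitute your provider's actual rates); the field names follow the 200 example response below, and `estimateCost` is a hypothetical helper:

```typescript
// Hypothetical per-token prices in USD -- replace with your provider's real rates.
const PRICE_PER_TOKEN_IN = 0.000003;
const PRICE_PER_TOKEN_OUT = 0.000015;

// Subset of a stats entry; counters arrive as strings (see the schema note below).
interface TokenCounters {
  totalLlmTokensIn: string;
  totalLlmTokensOut: string;
}

// Estimate spend for one configuration's stats entry.
function estimateCost(s: TokenCounters): number {
  return Number(s.totalLlmTokensIn) * PRICE_PER_TOKEN_IN +
         Number(s.totalLlmTokensOut) * PRICE_PER_TOKEN_OUT;
}

const cost = estimateCost({
  totalLlmTokensIn: "43210",
  totalLlmTokensOut: "87654",
});
```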

TypeScript Client Library

public getAiLlmStats = async (aiLlmId?: string): Promise<AiLlmStatsResponse> => {
  const queryParams = new URLSearchParams();
  if (aiLlmId) {
    queryParams.append('aiLlmId', aiLlmId);
  }
  return this.makeRequest<AiLlmStatsResponse>(
    `ai/llm/stats${queryParams.toString() ? `?${queryParams.toString()}` : ''}`,
    'GET',
    null
  );
};
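The query-string handling in the method above can be isolated into a small helper for illustration. `buildStatsPath` is a hypothetical function that mirrors the snippet's logic: with no argument it yields the aggregate path, with an ID it appends the `aiLlmId` filter.

```typescript
// Sketch of how the optional aiLlmId filter maps onto the request path.
function buildStatsPath(aiLlmId?: string): string {
  const queryParams = new URLSearchParams();
  if (aiLlmId) {
    queryParams.append('aiLlmId', aiLlmId);
  }
  const qs = queryParams.toString();
  return `ai/llm/stats${qs ? `?${qs}` : ''}`;
}

buildStatsPath();
// "ai/llm/stats"
buildStatsPath('550e8400-e29b-41d4-a716-446655440000');
// "ai/llm/stats?aiLlmId=550e8400-e29b-41d4-a716-446655440000"
```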

Code Samples

# You can also use wget
curl -X GET 'https://backend.flashback.tech/ai/llm/stats?aiLlmId=550e8400-e29b-41d4-a716-446655440000' \
  -H 'Accept: application/json' \
  -H 'Authorization: Bearer {access-token}'

Parameters

| Name | In | Type | Required | Description |
|---|---|---|---|---|
| aiLlmId | query | string | false | Filter statistics by AI LLM configuration ID |

Example responses

200 Response

{
  "success": true,
  "stats": [
    {
      "aiLlmId": "550e8400-e29b-41d4-a716-446655440000",
      "totalApiCalls": "1523",
      "totalTokensIn": "45678",
      "totalTokensOut": "89012",
      "totalLlmTokensIn": "43210",
      "totalLlmTokensOut": "87654",
      "totalPolicyViolations": "3",
      "totalAlerts": "5",
      "totalBlocks": "2"
    }
  ]
}
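For TypeScript consumers, the 200 body can be modeled directly from the example above. The interface names `AiLlmStatsEntry` and `AiLlmStatsResponse` are illustrative; field names and types are taken from the example response, with all counters kept as strings:

```typescript
// Types inferred from the 200 example response; counter fields are strings
// (see the note under Response Schema). Interface names are illustrative.
interface AiLlmStatsEntry {
  aiLlmId: string;
  totalApiCalls: string;
  totalTokensIn: string;
  totalTokensOut: string;
  totalLlmTokensIn: string;
  totalLlmTokensOut: string;
  totalPolicyViolations: string;
  totalAlerts: string;
  totalBlocks: string;
}

interface AiLlmStatsResponse {
  success: boolean;
  stats: AiLlmStatsEntry[];
}

// The example body above, typed.
const example: AiLlmStatsResponse = {
  success: true,
  stats: [{
    aiLlmId: "550e8400-e29b-41d4-a716-446655440000",
    totalApiCalls: "1523",
    totalTokensIn: "45678",
    totalTokensOut: "89012",
    totalLlmTokensIn: "43210",
    totalLlmTokensOut: "87654",
    totalPolicyViolations: "3",
    totalAlerts: "5",
    totalBlocks: "2",
  }],
};
```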

Responses

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | Successfully retrieved statistics | Inline |
| 403 | Forbidden | Insufficient permissions | Inline |
| 500 | Internal Server Error | Failed to retrieve statistics | Inline |

Response Schema

Status Code 200

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | Operation success status |
| » stats | [object] | false | none | Array of statistics objects |
| »» aiLlmId | string | false | none | AI LLM configuration ID |
| »» totalApiCalls | string | false | none | Total number of API calls made |
| »» totalTokensIn | string | false | none | Total tokens consumed in requests |
| »» totalTokensOut | string | false | none | Total tokens generated in responses |
| »» totalLlmTokensIn | string | false | none | Total LLM-specific input tokens |
| »» totalLlmTokensOut | string | false | none | Total LLM-specific output tokens |
| »» totalPolicyViolations | string | false | none | Total policy violations detected |
| »» totalAlerts | string | false | none | Total alerts triggered |
| »» totalBlocks | string | false | none | Total requests blocked |

Note: Numeric values are returned as strings to support large numbers without precision loss.
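Because counters are strings, converting with `BigInt` rather than `Number` preserves values beyond `Number.MAX_SAFE_INTEGER` (2^53 - 1). A minimal sketch; `sumCounter` is a hypothetical helper, not part of the API:

```typescript
// Sum string-encoded counters without precision loss.
// BigInt represents integers beyond Number.MAX_SAFE_INTEGER exactly.
function sumCounter(values: string[]): bigint {
  return values.reduce((acc, v) => acc + BigInt(v), 0n);
}

// 9007199254740993 exceeds Number.MAX_SAFE_INTEGER; Number() would round it.
sumCounter(["1523", "9007199254740993"]); // 9007199254742516n
```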

Status Code 403

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | none |

Status Code 500

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | none |

To perform this operation, you must be authenticated by means of one of the following methods: BearerAuth
