get__aistats_daily

GET /aistats/daily

Get Daily AI Statistics

Get daily aggregated AI statistics for LLM usage, including token counts, API calls, policy violations, and performance metrics.

Parameters

| Name | In | Type | Required | Description |
|---|---|---|---|---|
| startDate | query | string(date) | false | Start date (ISO format) |
| endDate | query | string(date) | false | End date (ISO format) |
| repoId | query | string | false | Repository ID filter (comma-separated for multiple values) |
| aiLlmId | query | string | false | AI LLM ID filter (comma-separated for multiple values) |
| repoAiApiKeyId | query | string | false | Repository AI API Key ID filter (comma-separated for multiple values) |
| hosts | query | string | false | Host filter (comma-separated for multiple values) |
| llmType | query | string | false | LLM type filter (comma-separated for multiple values, e.g., "OPENAI", "ANTHROPIC") |
| llmModel | query | string | false | LLM model filter (comma-separated for multiple values, e.g., "gpt-4", "claude-3") |

TypeScript Client Library

```typescript
// Using the Flashback TypeScript client
import { FlashbackClient } from '@flashbacktech/flashbackclient';

const client = new FlashbackClient({
  accessToken: 'your-access-token'
});

// Get daily AI statistics with optional filters
try {
  const result = await client.getAiStatsDaily({
    startDate: new Date('2024-01-01'),
    endDate: new Date('2024-01-31'),
    repoId: ['repo-id-1', 'repo-id-2'],
    aiLlmId: ['llm-id-1'],
    repoAiApiKeyId: ['api-key-id-1'],
    hosts: ['host1.example.com', 'host2.example.com'],
    llmType: ['OPENAI', 'ANTHROPIC'],
    llmModel: ['gpt-4', 'claude-3']
  });
  console.log('Daily AI statistics:', result);
} catch (error) {
  console.error('Failed to retrieve daily AI statistics:', error);
}
```

Code Samples
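If you are not using the client library, the same request can be issued over plain HTTP. The sketch below is illustrative only: the base URL placeholder and the `Authorization: Bearer` header are assumptions, not confirmed by this page. Multi-value filters are sent as comma-separated query parameter values.

```typescript
// Raw-HTTP sketch (no client library). Assumptions: base URL placeholder
// and Bearer authorization header.
const BASE_URL = 'https://your-flashback-host'; // replace with your API base URL
const ACCESS_TOKEN = 'your-access-token';

// Multi-value filters are joined into comma-separated query parameter values.
const params = new URLSearchParams({
  startDate: '2024-01-01',
  endDate: '2024-01-31',
  repoId: ['repo-id-1', 'repo-id-2'].join(','),
  llmType: ['OPENAI', 'ANTHROPIC'].join(','),
  llmModel: ['gpt-4', 'claude-3'].join(',')
});

const response = await fetch(`${BASE_URL}/aistats/daily?${params.toString()}`, {
  headers: { Authorization: `Bearer ${ACCESS_TOKEN}` }
});

if (!response.ok) {
  // Error responses share the shape { success: false, message: string }
  const { message } = await response.json();
  throw new Error(`Request failed (${response.status}): ${message}`);
}

const { data } = await response.json(); // array of daily statistics records
console.log(`Received ${data.length} daily records`);
```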

Example responses

200 Response
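Illustrative payload matching the 200 response schema documented below; all values are made up.

```json
{
  "success": true,
  "data": [
    {
      "timestamp": 1704067200,
      "repoId": "repo-id-1",
      "aiLlmId": "llm-id-1",
      "repoAiApiKeyId": "api-key-id-1",
      "tokensIn": "152340",
      "tokensOut": "48210",
      "llmTokensIn": "150000",
      "llmTokensOut": "47000",
      "activeConversations": 12,
      "apiCalls": 340,
      "policyViolations": 3,
      "numAlerts": 2,
      "numBlocks": 1,
      "numSeverityLow": 1,
      "numSeverityMedium": 1,
      "numSeverityHigh": 1,
      "latency_ms": 812.5,
      "llmType": "openai",
      "llmModel": "gpt-4",
      "host": "api.openai.com"
    }
  ]
}
```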

Responses

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | Daily AI statistics | Inline |
| 400 | Bad Request | Invalid request parameters | Inline |
| 404 | Not Found | User not found | Inline |
| 500 | Internal Server Error | Internal server error | Inline |

Response Schema

Status Code 200

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | true | none | Indicates if the request was successful |
| » data | [object] | true | none | Array of daily AI statistics records |
| »» timestamp | integer | true | none | Unix timestamp (seconds) for the statistics record |
| »» repoId | string | true | none | Repository ID |
| »» aiLlmId | string | true | none | AI LLM ID |
| »» repoAiApiKeyId | string | true | none | Repository AI API Key ID |
| »» tokensIn | string | true | none | Total input tokens (as string to handle large numbers) |
| »» tokensOut | string | true | none | Total output tokens (as string to handle large numbers) |
| »» llmTokensIn | string | true | none | LLM input tokens (as string to handle large numbers) |
| »» llmTokensOut | string | true | none | LLM output tokens (as string to handle large numbers) |
| »» activeConversations | integer | true | none | Number of active conversations |
| »» apiCalls | integer | true | none | Number of API calls |
| »» policyViolations | integer | true | none | Number of policy violations |
| »» numAlerts | integer | true | none | Number of alerts triggered |
| »» numBlocks | integer | true | none | Number of blocked requests |
| »» numSeverityLow | integer | true | none | Number of policy violations with LOW severity |
| »» numSeverityMedium | integer | true | none | Number of policy violations with MEDIUM severity |
| »» numSeverityHigh | integer | true | none | Number of policy violations with HIGH severity |
| »» latency_ms | number | true | none | Average latency in milliseconds |
| »» llmType | string | true | none | Type of LLM provider (e.g., "openai", "anthropic") |
| »» llmModel | string | true | none | LLM model name (e.g., "gpt-4", "claude-3") |
| »» host | string | true | none | Host/domain of the LLM API endpoint |

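For reference, the 200 schema above can be written as TypeScript types. The interface names below are illustrative and are not part of the published client library.

```typescript
// Illustrative types derived from the 200 response schema above.
export interface DailyAiStatsRecord {
  timestamp: number;           // Unix timestamp (seconds)
  repoId: string;
  aiLlmId: string;
  repoAiApiKeyId: string;
  tokensIn: string;            // large token counts are serialized as strings
  tokensOut: string;
  llmTokensIn: string;
  llmTokensOut: string;
  activeConversations: number;
  apiCalls: number;
  policyViolations: number;
  numAlerts: number;
  numBlocks: number;
  numSeverityLow: number;
  numSeverityMedium: number;
  numSeverityHigh: number;
  latency_ms: number;          // average latency in milliseconds
  llmType: string;             // e.g. "openai", "anthropic"
  llmModel: string;            // e.g. "gpt-4", "claude-3"
  host: string;                // host/domain of the LLM API endpoint
}

export interface GetAiStatsDailyResponse {
  success: boolean;
  data: DailyAiStatsRecord[];
}
```

Because token counts are returned as strings to avoid overflow, convert them with `BigInt(...)` before aggregating rather than `Number(...)` to avoid precision loss.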
Status Code 400

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | true | none | Always false for error responses |
| » message | string | true | none | Error message |

Status Code 404

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | true | none | Always false for error responses |
| » message | string | true | none | Error message |

Status Code 500

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | true | none | Always false for error responses |
| » message | string | true | none | Error message |

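The 400, 404, and 500 responses share the same envelope; a minimal TypeScript sketch of that shape (the interface name is illustrative):

```typescript
// Shared shape of the 400, 404, and 500 error bodies.
export interface AiStatsErrorResponse {
  success: false;   // always false for error responses
  message: string;  // human-readable error message
}
```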