Configure an AI LLM
Before following this guide, we strongly recommend reading AI LLM to understand how provider configurations, repositories, AI API keys, and policies work together in Flashback. You can also review the AI LLM Management APIs if you want to automate setup.
This guide is experimental and may evolve with the platform. If something behaves differently in your workspace, please contact us on Discord.
Properties
Each AI LLM configuration uses the following properties:
Configuration Name (required) Human-readable label of the provider connection in your workspace.
AI LLM Type (required) Provider compatibility type used by Flashback to route and validate calls.
API Endpoint (required) Base URL of the provider endpoint (for example https://api.openai.com/v1).
API Key (optional) Extra access key for providers/setups that require one in addition to the secret.
API Secret (required) Main secret/token used to authenticate requests.
Workspace (required, auto-selected) The current workspace where the configuration is created.
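For illustration, the properties above can be collected into a configuration record like the sketch below. The field names and values here are assumptions for readability, not Flashback's actual API schema:

```python
# Sketch of an AI LLM configuration record. All field names are
# illustrative assumptions, not Flashback's actual API schema.
config = {
    "configuration_name": "prod-openai-gateway",  # required, human-readable label
    "ai_llm_type": "OPENAI",                      # required, provider compatibility type
    "api_endpoint": "https://api.openai.com/v1",  # required, provider base URL
    "api_key": None,                              # optional extra access key
    "api_secret": "sk-your-secret",               # required main secret/token
    "workspace": "my-workspace",                  # auto-selected from current context
}

# Required fields per the property list above.
required = ["configuration_name", "ai_llm_type", "api_endpoint", "api_secret", "workspace"]
missing = [field for field in required if not config.get(field)]
assert not missing, f"Missing required fields: {missing}"
```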
Access & security model
AI LLM credentials in Flashback follow the same security principles used for other connected resources:
Secrets are encrypted at rest.
Secrets are never returned in clear text by the APIs.
Access is scoped by workspace permissions.
Configuration names should be unique and explicit to avoid confusion during repository setup.
For the full security model (encryption/decryption behavior, scope, and best practices), see Security and Secret Encryption.
AI LLM Type
When creating a configuration, select the provider type that matches your endpoint:
OPENAI: OpenAI-compatible endpoints.
GOOGLE: Google AI/Gemini-compatible setup.
ANTHROPIC: Anthropic Claude-compatible setup.
AWS: AWS AI services (e.g., Bedrock-compatible usage through your chosen endpoint/proxy).
OTHER: Any custom endpoint compatible with one of the supported API patterns (commonly OpenAI-compatible on-prem/private providers).
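For orientation, the publicly documented base URLs for each provider type look roughly like this. These are illustrative defaults taken from the providers' own documentation, not values Flashback requires; AWS and OTHER depend entirely on your own endpoint or proxy:

```python
# Commonly documented public base URLs per provider type.
# Illustrative only: a gateway/proxy URL in your environment may differ,
# and AWS/OTHER have no single canonical endpoint.
TYPICAL_ENDPOINTS = {
    "OPENAI": "https://api.openai.com/v1",
    "GOOGLE": "https://generativelanguage.googleapis.com/v1beta",
    "ANTHROPIC": "https://api.anthropic.com/v1",
    "AWS": None,    # region/service specific, e.g. a Bedrock runtime endpoint or proxy
    "OTHER": None,  # your custom, usually OpenAI-compatible, endpoint
}
```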
All provider types currently use the same form fields in the UI:
Configuration Name
API Endpoint
API Key (optional)
API Secret
Workspace (auto-defined)
Instructions
Here is the step-by-step process to connect an AI LLM provider in Flashback Platform.
Fill in configuration fields
Complete the form with your provider values:
Configuration Name: a clear, unique name (example: prod-openai-gateway).
API Endpoint: the provider base URL.
API Key: optional, only if your provider requires this extra key.
API Secret: required provider secret/token.
Workspace: already selected from your current workspace context.
Double-check endpoint protocol (https://), path (/v1 when needed), and token format before saving. Most validation failures come from endpoint typos or invalid/expired secrets.
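The pre-save checks above can be sketched as a small sanity function. This is a minimal illustration of catching endpoint typos before saving, not something Flashback runs for you; it cannot prove the endpoint works, only flag common mistakes:

```python
from urllib.parse import urlparse

def check_endpoint(url: str) -> list[str]:
    """Return a list of likely problems with an API endpoint URL.

    Mirrors the manual advice above: https scheme, a real host,
    and no stray whitespace from copy/paste.
    """
    problems = []
    parsed = urlparse(url.strip())
    if parsed.scheme != "https":
        problems.append("endpoint should use https://")
    if not parsed.netloc:
        problems.append("endpoint has no host")
    if url != url.strip():
        problems.append("endpoint has leading/trailing whitespace")
    return problems

print(check_endpoint("http://api.openai.com/v1 "))
```

A valid value such as `https://api.openai.com/v1` returns an empty list; whether the `/v1` path is needed still depends on the provider.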
Troubleshooting tips
Validation fails with authentication error: verify the API Secret (and the API Key if used), and rotate the credentials if they are invalid or expired.
Provider returns endpoint/model errors: verify endpoint base URL and provider compatibility type.
Configuration not visible where expected: check current workspace and your workspace permissions.
Calls blocked by policy: review applied AI Policies at organization/workspace/repository scope.
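When authentication or endpoint errors persist, it can help to test the credentials directly against the provider, outside Flashback. For an OpenAI-compatible endpoint, listing models is a cheap smoke test; note that the `/models` path and Bearer-token header are OpenAI's convention, and other provider types authenticate differently:

```python
import urllib.request

def list_models_request(endpoint: str, secret: str) -> urllib.request.Request:
    """Build a GET /models request for an OpenAI-compatible endpoint.

    Bearer-token auth is OpenAI's convention; for example, Anthropic uses an
    x-api-key header and Google uses a key parameter instead.
    """
    url = endpoint.rstrip("/") + "/models"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {secret}"})

req = list_models_request("https://api.openai.com/v1", "sk-your-secret")
# To actually run the smoke test (requires network access and a valid secret):
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)  # 200 means the endpoint and secret are usable
```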