⚠️ TEST ENVIRONMENT ONLY
POST /ai/llm
Create AI LLM Configuration
Create a new AI/LLM provider configuration for your workspace. This endpoint allows you to configure connections to various AI and Large Language Model providers.
Supported AI Providers:
OPENAI: OpenAI (GPT-4, GPT-3.5, etc.)
GOOGLE: Google AI (Gemini, PaLM, etc.)
ANTHROPIC: Anthropic (Claude models)
AWS: Amazon Bedrock and AWS AI services
OTHER: Custom or other AI providers
Key Features:
Centralized AI provider credential management
Support for multiple AI providers per workspace
Encrypted storage of API keys and secrets
Integration with Flashback repositories for AI-powered features
Security:
API keys and secrets are encrypted before storage
Credentials are never returned in API responses (only masked values)
Workspace-level access controls apply
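To illustrate the masking behavior described above, here is a minimal sketch; the `mask_secret` helper and its exact mask format are assumptions for illustration, not something this API documents:

```python
# Hypothetical sketch of credential masking as it might appear in API
# responses; the exact format is an assumption, not documented behavior.
def mask_secret(secret: str, visible: int = 4) -> str:
    """Replace all but the last `visible` characters with asterisks."""
    if len(secret) <= visible:
        return "*" * len(secret)
    return "*" * (len(secret) - visible) + secret[-visible:]

print(mask_secret("sk-proj-xxxxxxxxxxxx"))  # → ****************xxxx
```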
Validation:
Configuration name must be unique within your workspace
Endpoint URL format is validated
Credentials can be validated after creation using the validate endpoint
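The rules above can also be mirrored client-side to fail fast before a request is sent. This is a hypothetical pre-check (the `precheck` helper and its error strings are not part of the API; the server remains authoritative, and name uniqueness can only be checked server-side):

```python
# Hypothetical client-side pre-check mirroring the documented validation
# rules; the server's own validation is the source of truth.
from urllib.parse import urlparse

ALLOWED_AI_TYPES = {"OPENAI", "GOOGLE", "ANTHROPIC", "AWS", "OTHER"}

def precheck(config: dict) -> list[str]:
    """Return a list of problems found in a create-AI-LLM request body."""
    errors = []
    for field in ("name", "aiType", "endpoint", "secret", "workspaceId"):
        if not config.get(field):
            errors.append(f"missing required field: {field}")
    if config.get("aiType") and config["aiType"] not in ALLOWED_AI_TYPES:
        errors.append(f"unsupported aiType: {config['aiType']}")
    endpoint = config.get("endpoint", "")
    if endpoint and urlparse(endpoint).scheme not in ("http", "https"):
        errors.append(f"endpoint is not a valid URL: {endpoint}")
    return errors
```

An empty list means the body passed every local check; anything else can be surfaced to the user before a round trip to the server.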
TypeScript Client Library
```typescript
public createAiLlm = async (data: CreateAiLlmRequest): Promise<CreateAiLlmResponse> => {
  return this.makeRequest<CreateAiLlmResponse>('ai/llm', 'POST', data);
};
```

Code Samples
```shell
# You can also use wget
curl -X POST https://backend.flashback.tech/ai/llm \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -H 'Authorization: Bearer {access-token}' \
  -d '{
    "name": "My OpenAI Config",
    "aiType": "OPENAI",
    "endpoint": "https://api.openai.com/v1",
    "secret": "sk-proj-xxxxxxxxxxxx",
    "workspaceId": "workspace-123"
  }'
```

```http
POST https://backend.flashback.tech/ai/llm HTTP/1.1
Host: backend.flashback.tech
Content-Type: application/json
Accept: application/json
```

```javascript
const inputBody = {
  "name": "My OpenAI Config",
  "aiType": "OPENAI",
  "endpoint": "https://api.openai.com/v1",
  "secret": "sk-proj-xxxxxxxxxxxx",
  "workspaceId": "workspace-123"
};
const headers = {
  'Content-Type': 'application/json',
  'Accept': 'application/json',
  'Authorization': 'Bearer {access-token}'
};

fetch('https://backend.flashback.tech/ai/llm', {
  method: 'POST',
  body: JSON.stringify(inputBody),
  headers: headers
})
  .then(res => res.json())
  .then(body => console.log(body));
```

```ruby
require 'rest-client'
require 'json'

headers = {
  'Content-Type' => 'application/json',
  'Accept' => 'application/json',
  'Authorization' => 'Bearer {access-token}'
}

payload = {
  name: 'My OpenAI Config',
  aiType: 'OPENAI',
  endpoint: 'https://api.openai.com/v1',
  secret: 'sk-proj-xxxxxxxxxxxx',
  workspaceId: 'workspace-123'
}

result = RestClient.post 'https://backend.flashback.tech/ai/llm',
                         payload.to_json,
                         headers
p JSON.parse(result)
```

```python
import requests

headers = {
    'Content-Type': 'application/json',
    'Accept': 'application/json',
    'Authorization': 'Bearer {access-token}'
}

payload = {
    'name': 'My OpenAI Config',
    'aiType': 'OPENAI',
    'endpoint': 'https://api.openai.com/v1',
    'secret': 'sk-proj-xxxxxxxxxxxx',
    'workspaceId': 'workspace-123'
}

r = requests.post('https://backend.flashback.tech/ai/llm', headers=headers, json=payload)
print(r.json())
```

```php
<?php
require 'vendor/autoload.php';

$headers = array(
    'Content-Type' => 'application/json',
    'Accept' => 'application/json',
    'Authorization' => 'Bearer {access-token}',
);

$client = new \GuzzleHttp\Client();

// Request body.
$request_body = array(
    'name' => 'My OpenAI Config',
    'aiType' => 'OPENAI',
    'endpoint' => 'https://api.openai.com/v1',
    'secret' => 'sk-proj-xxxxxxxxxxxx',
    'workspaceId' => 'workspace-123',
);

try {
    $response = $client->request('POST', 'https://backend.flashback.tech/ai/llm', array(
        'headers' => $headers,
        'json' => $request_body,
    ));
    print_r($response->getBody()->getContents());
} catch (\GuzzleHttp\Exception\BadResponseException $e) {
    // Handle exception or API errors.
    print_r($e->getMessage());
}
```

```java
URL obj = new URL("https://backend.flashback.tech/ai/llm");
HttpURLConnection con = (HttpURLConnection) obj.openConnection();
con.setRequestMethod("POST");
con.setRequestProperty("Content-Type", "application/json");
con.setRequestProperty("Accept", "application/json");
con.setRequestProperty("Authorization", "Bearer {access-token}");
con.setDoOutput(true);

String body = "{\"name\":\"My OpenAI Config\",\"aiType\":\"OPENAI\","
    + "\"endpoint\":\"https://api.openai.com/v1\","
    + "\"secret\":\"sk-proj-xxxxxxxxxxxx\",\"workspaceId\":\"workspace-123\"}";
try (OutputStream os = con.getOutputStream()) {
    os.write(body.getBytes(StandardCharsets.UTF_8));
}

int responseCode = con.getResponseCode();
BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
String inputLine;
StringBuilder response = new StringBuilder();
while ((inputLine = in.readLine()) != null) {
    response.append(inputLine);
}
in.close();
System.out.println(response.toString());
```

```go
package main

import (
	"bytes"
	"net/http"
)

func main() {
	headers := map[string][]string{
		"Content-Type":  {"application/json"},
		"Accept":        {"application/json"},
		"Authorization": {"Bearer {access-token}"},
	}

	jsonReq := `{"name":"My OpenAI Config","aiType":"OPENAI","endpoint":"https://api.openai.com/v1","secret":"sk-proj-xxxxxxxxxxxx","workspaceId":"workspace-123"}`
	data := bytes.NewBuffer([]byte(jsonReq))

	req, err := http.NewRequest("POST", "https://backend.flashback.tech/ai/llm", data)
	if err != nil {
		panic(err)
	}
	req.Header = headers

	client := &http.Client{}
	resp, err := client.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	// ...
}
```

Body parameter
```json
{
  "name": "My OpenAI Config",
  "aiType": "OPENAI",
  "endpoint": "https://api.openai.com/v1",
  "key": "optional-access-key",
  "secret": "sk-proj-xxxxxxxxxxxx",
  "workspaceId": "workspace-123"
}
```

Parameters
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| body | body | object | true | none |
| » name | body | string | true | Human-readable name for the AI LLM configuration |
| » aiType | body | string | true | Type of AI provider |
| » endpoint | body | string | true | API endpoint URL for the AI provider |
| » key | body | string | false | Access key or API key (optional, provider-dependent) |
| » secret | body | string | true | Secret key or API secret for authentication |
| » workspaceId | body | string | true | Workspace ID this configuration belongs to |
Enumerated Values

| Parameter | Value |
|---|---|
| » aiType | OPENAI |
| » aiType | GOOGLE |
| » aiType | ANTHROPIC |
| » aiType | AWS |
| » aiType | OTHER |
Example responses

200 Response

```json
{
  "success": true,
  "aiLlmId": "550e8400-e29b-41d4-a716-446655440000",
  "message": "AI LLM configuration created successfully"
}
```

Responses
Response Schema

Status Code 200

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | Operation success status |
| » aiLlmId | string | false | none | Unique identifier for the created AI LLM config |
| » message | string | false | none | Success message |

Status Code 400

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | none |

Status Code 403

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | none |

Status Code 500

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | none |
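A minimal sketch of consuming these response shapes client-side; the field names (`success`, `aiLlmId`, `message`) follow the schemas above, but the `handle_create_response` helper itself is hypothetical:

```python
# Hypothetical helper mapping the documented response bodies to a result;
# field names come from the 200/400/403/500 schemas above.
def handle_create_response(status: int, body: dict) -> str:
    if status == 200 and body.get("success"):
        return body["aiLlmId"]  # id of the newly created configuration
    # 400, 403 and 500 all share the {"success": false, "message": ...} shape
    raise RuntimeError(f"create failed ({status}): {body.get('message')}")
```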
To perform this operation, you must be authenticated by means of one of the following methods: BearerAuth