post__ai_llm_{aillmId}_validate
⚠️ TEST ENVIRONMENT ONLY
POST /ai/llm/{id}/validate
Validate AI LLM Configuration
Validates an AI/LLM provider configuration by making a test API call to the configured endpoint. This verifies that the credentials and configuration are correct and working.
Validation Process:
- Makes a test request to the configured AI provider endpoint
- Verifies that the credentials (API key/secret) are valid
- Confirms the endpoint is reachable and responding correctly
- Returns detailed feedback about the validation result
Use Cases:
- Verify configuration after creation or update
- Troubleshoot connection issues
- Confirm credentials are still valid
- Test endpoint connectivity before using it in production
Important Notes:
- This operation makes an actual API call to the AI provider
- It may consume a small number of tokens/credits from your AI provider account
- Validation results are returned in the response message
- It does not modify the configuration; it only tests it
TypeScript Client Library
Code Samples
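The original client-library sample was not captured here. As a minimal sketch, the endpoint can be called directly with `fetch`; the base URL (`https://api.example.com`), the token source, and the function name `validateAiLlmConfig` are illustrative assumptions, not part of the official TypeScript client.

```typescript
// Minimal sketch using fetch; base URL, token handling, and function name are
// illustrative assumptions, not the official TypeScript client library.
interface ValidateResponse {
  success: boolean; // true if the AI configuration validated successfully
  message: string;  // detailed validation result message
}

async function validateAiLlmConfig(id: string, token: string): Promise<ValidateResponse> {
  const response = await fetch(`https://api.example.com/ai/llm/${id}/validate`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`, // BearerAuth, per the authentication note below
      Accept: "application/json",
    },
  });

  // 403/404/500 responses also carry { success, message }; surface them as errors.
  const body = (await response.json()) as ValidateResponse;
  if (!response.ok) {
    throw new Error(`Validation request failed (${response.status}): ${body.message}`);
  }
  return body;
}
```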
Parameters
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique identifier of the AI LLM configuration |
Example responses
200 Response (Success)
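The original example payload was not captured; the body below follows the documented response schema, with an illustrative `message` value.

```json
{
  "success": true,
  "message": "AI LLM configuration validated successfully"
}
```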
200 Response (Failure)
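Likewise, an illustrative failure body following the same schema (the validation ran, but the configuration was found invalid):

```json
{
  "success": false,
  "message": "Validation failed: the configured API key was rejected by the provider"
}
```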
Responses
Response Schema
Status Code 200
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | true | none | Indicates if validation was successful |
| » message | string | true | none | Detailed validation result message |
Note: A 200 status code indicates the validation operation completed successfully. Check the success field to determine if the AI configuration itself is valid. A false value with a 200 status means the validation ran but found issues with the configuration.
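Because a 200 response can still carry `success: false`, callers should branch on the `success` field rather than on the HTTP status alone. A hypothetical usage of the sketch above (the configuration ID and token are placeholders):

```typescript
// Hypothetical usage (inside an async function); a 200 response still
// requires checking `success` before treating the configuration as valid.
const result = await validateAiLlmConfig("llm-config-123", "YOUR_API_TOKEN");
if (result.success) {
  console.log("Configuration is valid:", result.message);
} else {
  console.warn("Validation ran but found issues:", result.message);
}
```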
Status Code 403
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | none |
Status Code 404
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | none |
Status Code 500
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | none |
To perform this operation, you must be authenticated by means of one of the following methods: BearerAuth