
⚠️ TEST ENVIRONMENT ONLY

This API endpoint is currently available only in the TEST environment. It is not yet available in production.

POST /conversation/{conversationId}/prompt

Send Prompt to Conversation

Send a user prompt to an AI conversation. This endpoint processes the prompt through the configured AI provider and returns the assistant's response, maintaining conversation context for multi-turn interactions.

Key Features:

  • Sends user prompts to the AI conversation engine

  • Maintains conversation context across multiple turns

  • Integrates with repository AI API keys for authentication

  • Enforces AI policies and governance rules

  • Tracks token usage and updates conversation statistics

  • Returns assistant responses in real-time or streaming format

Important Notes:

  • Users must have access to the conversation's workspace

  • The conversation must exist and not be deleted

  • Policy enforcement occurs during prompt processing

  • Token usage is automatically tracked and updated

  • Responses are subject to policy validation (log, alert, or block)

Security:

  • Access is validated against workspace permissions

  • Only users with workspace read access can send prompts

  • Policies are evaluated before and during prompt processing

  • Violations are logged and tracked for compliance

Integration:

This endpoint integrates with external conversation API engines to process prompts and generate responses. The system handles:

  • Routing to the appropriate AI provider

  • Context management across conversation turns

  • Policy enforcement and violation detection

  • Token counting and usage tracking
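The "log, alert, or block" policy outcomes described above could be modeled as follows. This is an illustrative sketch only; the `PolicyAction` type, `PolicyResult` shape, and `isPromptAllowed` helper are hypothetical names, not part of the actual server implementation.

```typescript
// Hypothetical model of policy enforcement outcomes.
// None of these names come from the real API.
type PolicyAction = 'log' | 'alert' | 'block';

interface PolicyResult {
  action: PolicyAction;
  reason: string;
}

// Decide whether a prompt may proceed given evaluated policy results.
// Only 'block' stops processing; 'log' and 'alert' record the
// violation for compliance but let the prompt through.
function isPromptAllowed(results: PolicyResult[]): boolean {
  return !results.some((r) => r.action === 'block');
}
```

In this sketch, a blocked prompt would surface to the caller as an error response, while logged or alerted violations are tracked server-side without interrupting the conversation.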

TypeScript Client Library

public sendPrompt = async (
  conversationId: string,
  data: SendPromptRequest
): Promise<SendPromptResponse> => {
  return this.makeRequest<SendPromptResponse>(`conversation/${conversationId}/prompt`, 'POST', data);
};
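For illustration, the request the client method issues can be sketched as below. The `SendPromptRequest` and `SendPromptResponse` shapes are inferred from the body parameter and 200 response schema documented on this page (they are not official type exports), and `buildPromptRequest` is a hypothetical helper, not part of the client library.

```typescript
// Request/response shapes inferred from this page's documentation.
interface SendPromptRequest {
  prompt: string;
}

interface SendPromptResponse {
  success?: boolean;
  message?: string;
}

// Assemble the pieces of the HTTP request that sendPrompt issues.
// Base URL taken from the code sample on this page.
function buildPromptRequest(conversationId: string, data: SendPromptRequest) {
  return {
    url: `https://backend.flashback.tech/conversation/${conversationId}/prompt`,
    method: 'POST' as const,
    headers: {
      'Content-Type': 'application/json',
      Accept: 'application/json',
    },
    body: JSON.stringify(data),
  };
}
```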

Code Samples

curl -X POST https://backend.flashback.tech/conversation/{conversationId}/prompt \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -H 'Authorization: Bearer {access-token}' \
  -d '{"prompt": "What is the best practice for securing API keys?"}'

Body parameter

{
  "prompt": "What is the best practice for securing API keys?"
}

Parameters

| Name | In | Type | Required | Description |
|---|---|---|---|---|
| conversationId | path | string | true | Unique identifier of the conversation |
| body | body | object | true | none |
| » prompt | body | string | true | User prompt text to send to the AI |

Example responses

200 Response

{
  "success": true,
  "message": "Prompt sent successfully"
}

Responses

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | Prompt sent successfully | Inline |
| 400 | Bad Request | Validation error or invalid input | Inline |
| 403 | Forbidden | Insufficient permissions | Inline |
| 404 | Not Found | Conversation not found | Inline |
| 500 | Internal Server Error | Failed to send prompt | Inline |
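The status codes above can be mapped to their documented descriptions in client code. This is purely an illustrative helper (`describePromptStatus` is a hypothetical name); per the schemas below, error bodies also carry a `message` string with more detail.

```typescript
// Map this endpoint's documented status codes to their descriptions.
function describePromptStatus(status: number): string {
  switch (status) {
    case 200:
      return 'Prompt sent successfully';
    case 400:
      return 'Validation error or invalid input';
    case 403:
      return 'Insufficient permissions';
    case 404:
      return 'Conversation not found';
    case 500:
      return 'Failed to send prompt';
    default:
      return 'Unexpected status';
  }
}
```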

Response Schema

Status Code 200

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | Operation success status |
| » message | string | false | none | Success message |

Note: Additional response fields may be added when the external API integration is fully implemented, such as response content, token counts, model information, and streaming metadata.

Status Code 400

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | Error message |

Status Code 403

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | Error message |

Status Code 404

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | Error message |

Status Code 500

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » success | boolean | false | none | none |
| » message | string | false | none | Error message |

To perform this operation, you must be authenticated by means of one of the following methods: BearerAuth
