
Conversation (Chat)

The Conversation APIs enable you to create, manage, and interact with AI conversations within the Flashback platform. These APIs provide the foundation for AI-powered chat interactions, allowing users to hold multi-turn conversations with AI models in the context of a repository.

Overview

AI Conversations are contextual containers that manage interactions between users and AI models. Each conversation is associated with a repository and maintains a complete history of user prompts and AI responses, enabling context-aware, multi-turn dialogues.

Key Concepts

  • Conversations: Containers for a series of AI interactions

  • Messages: Individual prompts (user) and responses (assistant) within a conversation

  • Repository Association: Each conversation is tied to a specific repository

  • Context Management: Conversations maintain context across multiple turns

  • Policy Enforcement: All interactions are subject to AI governance policies

Available Endpoints

Conversation Management

Method | API Reference | Description
------ | ------------- | -----------
POST   | /conversation | Create a new conversation.
GET    | /conversation | List conversations with filtering and pagination.

Conversation Interaction

Method | API Reference                            | Description
------ | ---------------------------------------- | -----------
POST   | /conversation/{conversationId}/prompt    | Send a prompt to a conversation.
GET    | /conversation/{conversationId}/messages  | Get all messages from a conversation.

Common Use Cases

1. Creating and Using a Conversation
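
A minimal sketch of this flow against the raw HTTP endpoints is shown below. The base URL, the bearer-token header, the request body fields (repositoryId, prompt), and the response field names (such as conversation.id) are assumptions for illustration only; consult the API reference pages for the exact schemas.

```typescript
// Sketch: create a conversation, send a prompt, then read back the messages.
// BASE_URL, TOKEN, body field names, and response field names are assumptions.
const BASE_URL = "https://api.flashback.example"; // replace with your API host
const TOKEN = "<your-api-token>";
const headers = {
  "Content-Type": "application/json",
  Authorization: `Bearer ${TOKEN}`,
};

async function createAndPrompt(repositoryId: string, prompt: string) {
  // 1. Create a conversation tied to a repository
  const createRes = await fetch(`${BASE_URL}/conversation`, {
    method: "POST",
    headers,
    body: JSON.stringify({ repositoryId }),
  });
  const conversation = await createRes.json();

  // 2. Send a prompt to the new conversation
  await fetch(`${BASE_URL}/conversation/${conversation.id}/prompt`, {
    method: "POST",
    headers,
    body: JSON.stringify({ prompt }),
  });

  // 3. Retrieve the full history of user prompts and assistant responses
  const messagesRes = await fetch(
    `${BASE_URL}/conversation/${conversation.id}/messages`,
    { headers },
  );
  return messagesRes.json();
}
```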

2. Listing Conversations
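
The sketch below reuses the BASE_URL and headers constants from the previous example. The query parameter names (workspaceId, repositoryId, page, limit) are assumptions for illustration; check the GET /conversation reference for the supported filters.

```typescript
// Sketch: list conversations with optional filters and pagination.
// Query parameter names are assumed, not confirmed by this page.
async function listConversations(opts: {
  workspaceId?: string;
  repositoryId?: string;
  page?: number;
  limit?: number;
}) {
  const params = new URLSearchParams();
  if (opts.workspaceId) params.set("workspaceId", opts.workspaceId);
  if (opts.repositoryId) params.set("repositoryId", opts.repositoryId);
  if (opts.page) params.set("page", String(opts.page));
  if (opts.limit) params.set("limit", String(opts.limit));

  const res = await fetch(`${BASE_URL}/conversation?${params}`, { headers });
  return res.json();
}
```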

3. Multi-Turn Conversation
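
Because the conversation maintains context on the server side, a multi-turn dialogue is simply a sequence of prompts against the same conversation ID. The sketch below reuses BASE_URL and headers from the first example; the prompt body field remains an assumption.

```typescript
// Sketch: multiple turns against a single conversation; later prompts can
// reference earlier answers because the conversation keeps the context.
async function multiTurn(conversationId: string, prompts: string[]) {
  for (const prompt of prompts) {
    const res = await fetch(`${BASE_URL}/conversation/${conversationId}/prompt`, {
      method: "POST",
      headers,
      body: JSON.stringify({ prompt }),
    });
    console.log("assistant:", await res.json());
  }
}

// e.g. multiTurn("conv-123", [
//   "Summarize this repository's architecture.",
//   "Which of those components handles authentication?",
// ]);
```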

Access Control

Conversation access is controlled by workspace permissions:

Organization Administrators

  • Can view all conversations in their organization

  • Can filter by any workspace or repository

  • Full access to conversation history

Workspace Administrators

  • Can view all conversations in their managed workspaces

  • Can filter conversations within their workspace scope

  • Can view messages from conversations in their workspaces

Regular Users

  • Can only view their own conversations

  • Can create conversations in accessible workspaces

  • Can send prompts to their own conversations

  • Limited to conversations they created

Integration with External Services

The conversation APIs integrate with external conversation API engines to:

  • Process Prompts: Send user prompts to AI providers and receive responses

  • Retrieve Messages: Fetch conversation history from the conversation store

  • Maintain Context: Manage conversation context across multiple turns

  • Track Usage: Monitor token consumption and API usage

Note: The prompt sending and message retrieval endpoints require integration with external conversation API engines. These integrations handle the actual AI interactions and message storage.

Token Tracking

Conversations automatically track token usage:

  • tokensIn: Total input tokens (user prompts)

  • tokensOut: Total output tokens (AI responses)

Token counts are updated with each interaction and can be used for:

  • Usage monitoring and billing

  • Cost analysis

  • Performance optimization

  • Resource planning
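
As a small illustration, the sketch below sums tokensIn and tokensOut across a set of conversations, for example those returned by GET /conversation. The tokensIn/tokensOut field names come from this page; the surrounding record shape is an assumption.

```typescript
// Sketch: aggregate token usage across conversation records.
// Only tokensIn / tokensOut are documented here; the rest of the shape is assumed.
interface ConversationUsage {
  id: string;
  tokensIn: number;  // total input tokens (user prompts)
  tokensOut: number; // total output tokens (AI responses)
}

function summarizeUsage(conversations: ConversationUsage[]) {
  return conversations.reduce(
    (totals, c) => ({
      tokensIn: totals.tokensIn + c.tokensIn,
      tokensOut: totals.tokensOut + c.tokensOut,
    }),
    { tokensIn: 0, tokensOut: 0 },
  );
}
```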

Policy Enforcement

All conversation interactions are subject to AI governance policies:

  • Pre-Processing: Policies are evaluated before sending prompts

  • During Processing: Real-time policy checks during AI interactions

  • Post-Processing: Response validation against policies

  • Violation Tracking: Policy violations are logged and tracked

Policy actions (log, alert, block) are applied based on policy configuration.
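
When a policy's action is block, clients should expect the prompt request to be rejected. The sketch below (reusing BASE_URL and headers from the first example) assumes a blocked prompt surfaces as a non-2xx response with a JSON error body; the exact status code and payload shape are not specified on this page.

```typescript
// Sketch: surface a policy rejection to the caller instead of retrying blindly.
// The status-code and error-body behavior shown here is an assumption.
async function sendPromptChecked(conversationId: string, prompt: string) {
  const res = await fetch(`${BASE_URL}/conversation/${conversationId}/prompt`, {
    method: "POST",
    headers,
    body: JSON.stringify({ prompt }),
  });

  if (!res.ok) {
    const error = await res.json().catch(() => null);
    throw new Error(`Prompt rejected (status ${res.status}): ${JSON.stringify(error)}`);
  }
  return res.json();
}
```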

Best Practices

1. Conversation Lifecycle
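
Since conversations maintain context across turns, it usually makes sense to reuse one conversation per repository or task rather than creating a new one for every prompt. A minimal sketch of that pattern, reusing BASE_URL and headers from the earlier example (the repositoryId body field and id response field are assumptions):

```typescript
// Sketch: cache one conversation per repository so context carries across turns.
const conversationCache = new Map<string, string>(); // repositoryId -> conversationId

async function conversationFor(repositoryId: string): Promise<string> {
  const cached = conversationCache.get(repositoryId);
  if (cached) return cached;

  const res = await fetch(`${BASE_URL}/conversation`, {
    method: "POST",
    headers,
    body: JSON.stringify({ repositoryId }), // body field name is an assumption
  });
  const conversation = await res.json();
  conversationCache.set(repositoryId, conversation.id);
  return conversation.id;
}
```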

2. Error Handling
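
A generic sketch of defensive error handling around the HTTP calls: retry transient failures once, fail fast on client errors. The status-code semantics are standard HTTP assumptions, not Flashback-specific guarantees.

```typescript
// Sketch: basic retry wrapper for conversation API calls.
async function withRetry<T>(call: () => Promise<Response>, retries = 1): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    let res: Response;
    try {
      res = await call();
    } catch (err) {
      lastError = err; // network failure: retry
      continue;
    }
    if (res.ok) return (await res.json()) as T;
    if (res.status >= 500) {
      lastError = new Error(`Server error ${res.status}`); // transient: retry
      continue;
    }
    // Client errors (4xx) will not succeed on retry; fail immediately.
    throw new Error(`Request failed with status ${res.status}`);
  }
  throw lastError;
}

// e.g. const conversations = await withRetry(() =>
//   fetch(`${BASE_URL}/conversation`, { headers }));
```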

3. Pagination
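
When listing conversations, page through the results rather than requesting everything at once. The sketch below reuses BASE_URL and headers from the first example and assumes page/limit query parameters with an array response per page; adjust to the actual pagination scheme in the GET /conversation reference.

```typescript
// Sketch: walk GET /conversation page by page until the result set is exhausted.
// The page/limit parameters and per-page response shape are assumptions.
async function listAllConversations(limit = 50) {
  const all: unknown[] = [];
  for (let page = 1; ; page++) {
    const res = await fetch(
      `${BASE_URL}/conversation?page=${page}&limit=${limit}`,
      { headers },
    );
    const batch: unknown[] = await res.json();
    all.push(...batch);
    if (batch.length < limit) break; // last page reached
  }
  return all;
}
```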

TypeScript Client Library

The Flashback TypeScript client provides convenient methods for all conversation operations:
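
The sketch below only illustrates what client usage might look like. The package name, class name, and method names (FlashbackClient, conversations.create, conversations.prompt, conversations.messages) are hypothetical; consult the client library's own documentation for the actual API.

```typescript
// Hypothetical usage sketch; names below are illustrative only and may not
// match the actual Flashback TypeScript client.
import { FlashbackClient } from "@flashback/client"; // package name is an assumption

const client = new FlashbackClient({ apiKey: "<your-api-key>" });

const conversation = await client.conversations.create({ repositoryId: "repo-123" });
const reply = await client.conversations.prompt(conversation.id, "Explain this repository.");
const history = await client.conversations.messages(conversation.id);
```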

Next Steps

  1. Create your first conversation for a repository

  2. Send prompts and interact with AI models

  3. Retrieve conversation history

  4. Monitor token usage and costs

  5. Set up policies to govern AI interactions
