Conversation API
The Conversation APIs enable you to create, manage, and interact with AI conversations within the Flashback platform. They provide the foundation for AI-powered chat, allowing users to hold multi-turn conversations with AI models through repositories.
Overview
AI Conversations are contextual containers that manage interactions between users and AI models. Each conversation is associated with a repository and maintains a complete history of user prompts and AI responses, enabling context-aware, multi-turn dialogues.
Key Concepts
Conversations: A container for a series of AI interactions
Messages: Individual prompts (user) and responses (assistant) within a conversation
Repository Association: Each conversation is tied to a specific repository
Context Management: Conversations maintain context across multiple turns
Policy Enforcement: All interactions are subject to AI governance policies
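The object shapes used in the examples on this page can be summarized as TypeScript interfaces. The field names below are taken from the code samples further down; treat this as a sketch, not the authoritative schema:

```typescript
// Sketch of the shapes used in this page's examples.
// Field names come from the samples below; this is not a full schema.
interface Creator {
  name: string;
  lastName: string;
}

interface Conversation {
  id: string;
  createdAt: string;   // creation timestamp
  tokensIn: number;    // total input tokens (user prompts)
  tokensOut: number;   // total output tokens (AI responses)
  creator?: Creator;   // optional in list responses
}

interface Message {
  role: "user" | "assistant"; // prompt vs. response
  content: string;
}
```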
Available Endpoints
Conversation Management
POST /conversation - Create a new conversation
GET /conversation - List conversations with filtering and pagination
Conversation Interaction
POST /conversation/{conversationId}/prompt - Send a prompt to a conversation
GET /conversation/{conversationId}/messages - Get all messages from a conversation
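The list endpoint accepts filter and pagination parameters as a query string. A minimal sketch of building that query string, assuming the parameter names match the client options shown in the examples below (workspaceId, userId, from, to, take, skip):

```typescript
// Builds the path + query string for GET /conversation.
// Parameter names are assumed to mirror the client options on this page.
function buildConversationQuery(params: {
  workspaceId?: string;
  userId?: string;
  from?: string;
  to?: string;
  take?: number;
  skip?: number;
}): string {
  const qs = new URLSearchParams();
  for (const [key, value] of Object.entries(params)) {
    if (value !== undefined) qs.set(key, String(value)); // skip unset filters
  }
  const query = qs.toString();
  return query ? `/conversation?${query}` : "/conversation";
}
```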
Common Use Cases
1. Creating and Using a Conversation
// Create a new conversation
const conversation = await client.createConversation({
repoId: "repo-123"
});
console.log(`Created conversation: ${conversation.conversationId}`);
// Send a prompt to the conversation
const response = await client.sendPrompt(conversation.conversationId, {
prompt: "What is the best practice for securing API keys?"
});
// Retrieve conversation messages
const messages = await client.getConversationMessages(conversation.conversationId);
messages.messages.forEach(msg => {
console.log(`${msg.role}: ${msg.content}`);
});
2. Listing Conversations
// Get all conversations in a workspace
const conversations = await client.getConversations({
workspaceId: "workspace-123",
take: 50,
skip: 0
});
// Filter by date range
const recentConversations = await client.getConversations({
workspaceId: "workspace-123",
from: "2024-01-01",
to: "2024-01-31"
});
// Get conversations for a specific user
const userConversations = await client.getConversations({
userId: "user-456"
});
// Display conversations
conversations.conversations.forEach(conv => {
console.log(`Conversation ${conv.id}`);
console.log(` Created: ${conv.createdAt}`);
console.log(` Tokens In: ${conv.tokensIn}, Tokens Out: ${conv.tokensOut}`);
console.log(` Creator: ${conv.creator?.name} ${conv.creator?.lastName}`);
});
3. Multi-Turn Conversation
const conversationId = "conv-123";
// First turn
await client.sendPrompt(conversationId, {
prompt: "Explain REST API principles"
});
// Second turn (conversation maintains context)
await client.sendPrompt(conversationId, {
prompt: "How does that apply to microservices?"
});
// Third turn
await client.sendPrompt(conversationId, {
prompt: "Give me a practical example"
});
// Retrieve full conversation history
const messages = await client.getConversationMessages(conversationId);
console.log(`Conversation has ${messages.messages.length} messages`);
Access Control
Conversation access is controlled by workspace permissions:
Organization Administrators
Can view all conversations in their organization
Can filter by any workspace or repository
Full access to conversation history
Workspace Administrators
Can view all conversations in their managed workspaces
Can filter conversations within their workspace scope
Can view messages from conversations in their workspaces
Regular Users
Can only view their own conversations
Can create conversations in accessible workspaces
Can send prompts to their own conversations
Limited to conversations they created
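The scoping rules above can be sketched as a client-side guard. The role names and helper below are illustrative only, not part of the API:

```typescript
// Illustrative roles mirroring the three access tiers described above.
type Role = "org_admin" | "workspace_admin" | "user";

interface ConversationSummary {
  id: string;
  creatorId: string;
  workspaceId: string;
}

// Hypothetical guard: who may view a given conversation.
function canViewConversation(
  role: Role,
  userId: string,
  managedWorkspaceIds: string[],
  conv: ConversationSummary,
): boolean {
  switch (role) {
    case "org_admin":
      return true; // all conversations in the organization
    case "workspace_admin":
      return managedWorkspaceIds.includes(conv.workspaceId); // managed workspaces only
    case "user":
      return conv.creatorId === userId; // own conversations only
  }
}
```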
Integration with External Services
The conversation APIs integrate with external conversation API engines to:
Process Prompts: Send user prompts to AI providers and receive responses
Retrieve Messages: Fetch conversation history from the conversation store
Maintain Context: Manage conversation context across multiple turns
Track Usage: Monitor token consumption and API usage
Note: The prompt sending and message retrieval endpoints require integration with external conversation API engines. These integrations handle the actual AI interactions and message storage.
Token Tracking
Conversations automatically track token usage:
tokensIn: Total input tokens (user prompts)
tokensOut: Total output tokens (AI responses)
Token counts are updated with each interaction and can be used for:
Usage monitoring and billing
Cost analysis
Performance optimization
Resource planning
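For cost analysis, the tokensIn/tokensOut counters can be aggregated and priced. The per-1K-token prices below are placeholders; real rates depend on the AI provider and model:

```typescript
// Placeholder per-1K-token prices; substitute your provider's real rates.
const PRICE_IN_PER_1K = 0.003;
const PRICE_OUT_PER_1K = 0.006;

interface ConversationUsage {
  tokensIn: number;
  tokensOut: number;
}

// Estimated cost of a single conversation from its token counters.
function estimateCost(conv: ConversationUsage): number {
  return (conv.tokensIn / 1000) * PRICE_IN_PER_1K +
         (conv.tokensOut / 1000) * PRICE_OUT_PER_1K;
}

// Sum usage across a list of conversations (e.g. one workspace page).
function totalUsage(convs: ConversationUsage[]): ConversationUsage {
  return convs.reduce(
    (acc, c) => ({
      tokensIn: acc.tokensIn + c.tokensIn,
      tokensOut: acc.tokensOut + c.tokensOut,
    }),
    { tokensIn: 0, tokensOut: 0 },
  );
}
```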
Policy Enforcement
All conversation interactions are subject to AI governance policies:
Pre-Processing: Policies are evaluated before sending prompts
During Processing: Real-time policy checks during AI interactions
Post-Processing: Response validation against policies
Violation Tracking: Policy violations are logged and tracked
Policy actions (log, alert, block) are applied based on policy configuration.
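When a policy's action is block, the prompt call fails rather than returning a response, so callers should handle that outcome explicitly. A minimal sketch, assuming a hypothetical error shape with a status and a policy code (the actual response format may differ):

```typescript
// Hypothetical error shape for a policy-blocked prompt.
// The real API's error format may differ.
interface PolicyError {
  status: number;
  code?: string;   // assumed machine-readable code, e.g. "POLICY_BLOCKED"
  message: string;
}

// Turns a failed prompt call into a user-facing explanation.
function describePolicyOutcome(err: PolicyError): string {
  if (err.code === "POLICY_BLOCKED") {
    // Assumed convention: blocked prompts surface a policy code.
    return `Prompt blocked by policy: ${err.message}`;
  }
  if (err.status === 403) {
    return "Access denied: check workspace permissions";
  }
  return `Request failed (${err.status}): ${err.message}`;
}
```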
Best Practices
1. Conversation Lifecycle
// Create conversation
const conv = await client.createConversation({ repoId: "repo-123" });
// Use conversation
await client.sendPrompt(conv.conversationId, { prompt: "Hello" });
// Retrieve history when needed
const messages = await client.getConversationMessages(conv.conversationId);
// List conversations for management
const allConvs = await client.getConversations({ workspaceId: "workspace-123" });
2. Error Handling
try {
await client.sendPrompt(conversationId, { prompt: "Test prompt" });
} catch (error) {
if (error.status === 403) {
console.error("Access denied: Check workspace permissions");
} else if (error.status === 404) {
console.error("Conversation not found");
} else if (error.status === 500) {
console.error("Server error: Try again later");
}
}
3. Pagination
async function getAllConversations(workspaceId: string) {
const allConversations = [];
let skip = 0;
const take = 50;
while (true) {
const response = await client.getConversations({
workspaceId,
take,
skip
});
allConversations.push(...response.conversations);
if (response.conversations.length < take) {
break; // No more conversations
}
skip += take;
}
return allConversations;
}
TypeScript Client Library
The Flashback TypeScript client provides convenient methods for all conversation operations:
import { FlashbackClient } from '@flashbacktech/flashbackclient';
const client = new FlashbackClient({
apiKey: 'your-api-key',
baseUrl: 'https://backend.flashback.tech'
});
// Create conversation
const conversation = await client.createConversation({
repoId: "repo-123"
});
// Send prompt
await client.sendPrompt(conversation.conversationId, {
prompt: "Your question here"
});
// Get conversations
const conversations = await client.getConversations({
workspaceId: "workspace-123"
});
// Get messages
const messages = await client.getConversationMessages("conversation-id");
Related Documentation
AI LLM Management APIs - Configure AI provider connections
AI API Keys - Manage repository-specific API keys
AI Policy APIs - Set up governance policies for AI interactions
Repository APIs - Manage repositories
Next Steps
Create your first conversation for a repository
Send prompts and interact with AI models
Retrieve conversation history
Monitor token usage and costs
Set up policies to govern AI interactions