
Quickstart with Our Gateway

This quickstart focuses on AWS-first examples for both Cloud Storage and AI LLM workflows. You can adapt the same setup pattern to other compatible providers supported by Flashback.

Use these two micro tutorials:


Start Quickly with Cloud Storage

Follow this flow to upload your first object through Flashback.

Prerequisites

  • Python 3.9+

  • boto3 installed

  • A Flashback workspace with access to Storage, Repositories, and API Keys

  • An AWS S3 bucket you can connect (direct credentials or delegated access)

1. Sign in

Sign in at platform.flashback.tech with your account.

If you do not have an account yet, you can request a demo.

2. Configure your first bucket

You can use delegated access (recommended) or direct credentials for a quick test.

  • Go to Storage → Buckets.

  • Click Add Bucket.

  • Fill in the bucket form:

    • Name: My First Bucket

    • Storage Type: S3

    • Bucket: your exact AWS S3 bucket name

    • Access Key / Secret Key: AWS access key pair (or delegated role info)

    • Endpoint: leave empty for AWS, or set a custom S3 endpoint

    • Region: required when endpoint is empty

  • Click Create.

Detailed guide: Configure a Bucket.

3. Create a repository

  • Go to Repositories.

  • Click Add Repository.

  • Fill:

    • Name: my-first-s3-repository

    • API Type: S3

  • In Storage Buckets, attach My First Bucket.

  • Click Create.

Detailed guide: Create a Repository.

4. Generate an API key

In your repository:

  • Open Inventory in Repositories. Choose your newly created repository.

  • In the API Key table, click Add and select Buckets.

  • Set a label such as quickstart-write.

  • Choose an access mode: READ, WRITE, or ADMIN. The upload step below requires write access.

  • Copy the secret immediately (it is shown once).

5. Run storage snippets

Install dependency:

pip install boto3

Create a shared config file:

# flashback_s3_config.py
import os
import boto3
from botocore.client import Config

# Repository endpoint and API key pair generated in the previous steps.
FLASHBACK_S3_ENDPOINT = os.environ["FLASHBACK_S3_ENDPOINT"]
FLASHBACK_S3_KEY_ID = os.environ["FLASHBACK_S3_KEY_ID"]
FLASHBACK_S3_SECRET = os.environ["FLASHBACK_S3_SECRET"]

session = boto3.session.Session(
    aws_access_key_id=FLASHBACK_S3_KEY_ID,
    aws_secret_access_key=FLASHBACK_S3_SECRET,
)

# Point the S3 client at the Flashback gateway instead of AWS directly.
s3_client = session.client(
    service_name="s3",
    endpoint_url=FLASHBACK_S3_ENDPOINT,
    config=Config(signature_version="s3v4"),
)

List buckets exposed by your repository:

# list_buckets.py
from flashback_s3_config import s3_client

response = s3_client.list_buckets()
print("Buckets available through Flashback:")
for bucket in response.get("Buckets", []):
    print(f" - {bucket['Name']}")

Upload a file:

# upload_file.py
import os
from pathlib import Path
from flashback_s3_config import s3_client

bucket_name = os.environ["FLASHBACK_BUCKET_NAME"]
file_path = Path(os.environ["LOCAL_FILE_PATH"])
object_key = file_path.name

s3_client.upload_file(
    Filename=str(file_path),
    Bucket=bucket_name,
    Key=object_key,
)

print(f"Uploaded '{object_key}' to bucket '{bucket_name}'.")
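Optionally, read the object back to confirm the upload went through the gateway. This is a minimal sketch reusing the same config module and environment variables as above; the `downloaded_` filename prefix is just an illustrative choice:

```python
# download_file.py
import os
from flashback_s3_config import s3_client

bucket_name = os.environ["FLASHBACK_BUCKET_NAME"]
object_key = os.path.basename(os.environ["LOCAL_FILE_PATH"])

# Fetch the object uploaded in the previous snippet and save a local copy.
s3_client.download_file(
    Bucket=bucket_name,
    Key=object_key,
    Filename=f"downloaded_{object_key}",
)

print(f"Downloaded '{object_key}' from bucket '{bucket_name}'.")
```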

Example environment variables:

export FLASHBACK_S3_ENDPOINT="https://s3-us-east-1.aws.flashback.tech"
export FLASHBACK_S3_KEY_ID="<your-repository-api-key-id>"
export FLASHBACK_S3_SECRET="<your-repository-api-secret>"
export FLASHBACK_BUCKET_NAME="<bucket-attached-to-repository>"
export LOCAL_FILE_PATH="./sample.txt"

Start Quickly with AI LLM

This quickstart shows an AWS-oriented AI setup with Flashback acting as the unified gateway.

Prerequisites

  • Python 3.9+

  • openai SDK installed

  • A Flashback workspace with access to AI LLM, Repositories, and API Keys

  • AWS model access enabled in your account (for example via Amazon Bedrock)

1. Add an AI LLM provider (AWS)

  • Go to AI → AI LLM.

  • Click Add AI LLM.

  • Select AI LLM Type: AWS.

  • Fill:

    • Configuration Name: aws-llm-primary

    • API Endpoint: your AWS-compatible endpoint used by Flashback

    • API Secret: credential/token used by that endpoint

    • API Key: fill only if your endpoint requires it

  • Click Create configuration.

Detailed guide: Configure an AI LLM.

2. Create an AI repository

  • Go to Repositories → Add Repository.

  • Set:

    • Name: my-first-ai-repository

    • API Type: AI

  • In provider/resource selection, attach aws-llm-primary.

  • Save the repository.

3. Generate an AI API key

In the AI repository:

  • Open Inventory in Repositories. Choose your newly created repository.

  • In the API Key table, click Add and select AI LLMs.

  • Set a label such as quickstart-write.

  • Copy and store the secret immediately.

4. Run AI snippets

Install dependency:
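As listed in the prerequisites, the only dependency is the openai SDK:

```shell
pip install openai
```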

Call the repository endpoint with the OpenAI-compatible SDK:
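The sketch below assumes the AI repository exposes an OpenAI-compatible chat completions endpoint, so the standard SDK client can be pointed at it via `base_url`. The environment variable names and the prompt are illustrative placeholders, not official names:

```python
# flashback_ai_chat.py
import os
from openai import OpenAI

# Point the OpenAI SDK at the Flashback AI repository endpoint
# and authenticate with the repository API key from step 3.
client = OpenAI(
    base_url=os.environ["FLASHBACK_AI_ENDPOINT"],
    api_key=os.environ["FLASHBACK_AI_API_KEY"],
)

response = client.chat.completions.create(
    model=os.environ["FLASHBACK_AI_MODEL"],
    messages=[{"role": "user", "content": "Say hello through Flashback."}],
)

print(response.choices[0].message.content)
```

The model identifier you pass should be one exposed by the AWS provider attached to your repository (for example, a Bedrock model ID).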

Example environment variables:
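These names mirror the storage example above and are placeholders; substitute your repository's endpoint, API secret, and a model identifier available in your account:

```shell
export FLASHBACK_AI_ENDPOINT="<your-ai-repository-endpoint>"
export FLASHBACK_AI_API_KEY="<your-repository-api-secret>"
export FLASHBACK_AI_MODEL="<model-id-exposed-by-your-repository>"
```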
