Upload/Download a File

This guide demonstrates how to interact with the Flashback Platform to upload and download files from your application backend.

Prerequisites

  • Python 3.9+

  • A Flashback Repository with a valid API key (READ or WRITE) – see Create a Repository.

  • Network access to your Bridge endpoints.

Install Required Packages

pip install boto3 google-cloud-storage azure-storage-blob

AWS S3

1. Configuration

Set up the S3 client with your Flashback endpoint and credentials:

# flashback_aws_config.py
import boto3
from botocore.client import Config

# Replace with your Flashback credentials and endpoint
ENDPOINT = "https://s3-us-east-1-aws.flashback.tech"
# Use the READ/WRITE API key credentials from your Repository
API_KEY_ID = "YOUR_API_KEY_ID"
API_SECRET = "YOUR_API_SECRET"

session = boto3.session.Session(
    aws_access_key_id=API_KEY_ID,
    aws_secret_access_key=API_SECRET
)

s3_client = session.client(
    service_name="s3",
    endpoint_url=ENDPOINT,
    config=Config(signature_version="s3v4")
)
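
Optionally, verify the client before moving on. A minimal sanity check, assuming your API key is allowed to list buckets through the Bridge:

# aws_check.py (hypothetical helper; not a required step)
from flashback_aws_config import s3_client

# List the buckets visible through the Bridge to confirm the endpoint
# and credentials are wired up correctly.
response = s3_client.list_buckets()
for bucket in response.get("Buckets", []):
    print(bucket["Name"])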

2. Upload a File

Upload a local file to your specified bucket:

# aws_upload.py
from flashback_aws_config import s3_client

# The S3 bucket, GCS bucket, or Azure container listed in your Repository
BUCKET_NAME = "your-bucket-name"
FILE_PATH = "path/to/local/file.txt"
OBJECT_NAME = FILE_PATH.split("/")[-1]

s3_client.upload_file(
    Filename=FILE_PATH,
    Bucket=BUCKET_NAME,
    Key=OBJECT_NAME
)
print(f"Uploaded {OBJECT_NAME} to {BUCKET_NAME}")

3. Download a File

Download an object from your repository to a local path:

# aws_download.py
import os

from flashback_aws_config import s3_client

# The S3 bucket, GCS bucket, or Azure container listed in your Repository
BUCKET_NAME = "your-bucket-name"
OBJECT_NAME = "file.txt"
DEST_PATH = "downloads/file.txt"

os.makedirs("downloads", exist_ok=True)  # the destination directory must exist

s3_client.download_file(
    Bucket=BUCKET_NAME,
    Key=OBJECT_NAME,
    Filename=DEST_PATH
)
print(f"Downloaded {OBJECT_NAME} to {DEST_PATH}")

Google Cloud Storage

1. Configuration

Initialize a GCS client pointing to your Flashback Bridge endpoint:

# flashback_gcs_config.py
from google.cloud import storage
from google.oauth2 import service_account

# Replace with your Flashback credentials and endpoint
ENDPOINT = "https://s3-us-east-1.aws.flashback.tech"
# Use the READ/WRITE API key credentials from your Repository
CLIENT_EMAIL = "YOUR_CLIENT_EMAIL"
PRIVATE_KEY = "YOUR_PRIVATE_KEY"

credentials = service_account.Credentials.from_service_account_info({
    "type": "service_account",
    "client_email": CLIENT_EMAIL,
    "private_key": PRIVATE_KEY,
    # token_uri is required by the credentials loader
    "token_uri": "https://oauth2.googleapis.com/token",
})

client = storage.Client(
    project="your-project-id",  # the client library requires a project name
    credentials=credentials,
    client_options={"api_endpoint": ENDPOINT}
)
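
As with S3, you can optionally verify the client before continuing. A minimal check, assuming your API key permits listing buckets through the Bridge:

# gcs_check.py (hypothetical helper; not a required step)
from flashback_gcs_config import client

# Iterate the buckets visible through the Bridge to confirm connectivity.
for bucket in client.list_buckets():
    print(bucket.name)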

2. Upload a File as a Blob

Upload a blob to the specified GCS bucket:

# gcs_upload.py
from flashback_gcs_config import client

# The S3 bucket, GCS bucket, or Azure container listed in your Repository
BUCKET_NAME = "your-bucket-name"
FILE_PATH = "path/to/local/file.txt"
OBJECT_NAME = FILE_PATH.split("/")[-1]

bucket = client.bucket(BUCKET_NAME)
blob = bucket.blob(OBJECT_NAME)
blob.upload_from_filename(FILE_PATH)
print(f"Uploaded {OBJECT_NAME} to {BUCKET_NAME}")

3. Download a File as a Blob

Download a blob from your repository:

# gcs_download.py
import os

from flashback_gcs_config import client

# The S3 bucket, GCS bucket, or Azure container listed in your Repository
BUCKET_NAME = "your-bucket-name"
OBJECT_NAME = "file.txt"
DEST_PATH = "downloads/file.txt"

os.makedirs("downloads", exist_ok=True)  # the destination directory must exist

bucket = client.bucket(BUCKET_NAME)
blob = bucket.blob(OBJECT_NAME)
blob.download_to_filename(DEST_PATH)
print(f"Downloaded {OBJECT_NAME} to {DEST_PATH}")

Azure Blob Storage

1. Configuration

Create an Azure BlobServiceClient using your Flashback endpoint:

# flashback_azure_config.py
from azure.storage.blob import BlobServiceClient

# Replace with your Flashback endpoint and credentials
ENDPOINT = "https://s3-us-east-1.aws.flashback.tech"
# Use the READ/WRITE API key credentials from your Repository
CREDENTIAL = "YOUR_CREDENTIAL"

client = BlobServiceClient(
    account_url=ENDPOINT,
    credential=CREDENTIAL
)
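
A quick way to confirm the client is talking to your Bridge, assuming your credential permits listing containers:

# azure_check.py (hypothetical helper; not a required step)
from flashback_azure_config import client

# Enumerate the containers visible through the Bridge.
for container in client.list_containers():
    print(container.name)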

2. Upload a File as a Blob

Upload a local file as a blob to your container:

# azure_upload.py
from flashback_azure_config import client

# The S3 bucket, GCS bucket, or Azure container listed in your Repository
CONTAINER_NAME = "your-container-name"
FILE_PATH = "path/to/local/file.txt"
BLOB_NAME = FILE_PATH.split("/")[-1]

blob_client = client.get_blob_client(container=CONTAINER_NAME, blob=BLOB_NAME)
with open(FILE_PATH, "rb") as data:
    blob_client.upload_blob(data, overwrite=True)  # replace any existing blob with the same name
print(f"Uploaded {BLOB_NAME} to {CONTAINER_NAME}")

3. Download a File as a Blob

Download a blob from your container to a local path:

# azure_download.py
import os

from flashback_azure_config import client

# The S3 bucket, GCS bucket, or Azure container listed in your Repository
CONTAINER_NAME = "your-container-name"
BLOB_NAME = "file.txt"
DEST_PATH = "downloads/file.txt"

os.makedirs("downloads", exist_ok=True)  # the destination directory must exist

blob_client = client.get_blob_client(container=CONTAINER_NAME, blob=BLOB_NAME)
with open(DEST_PATH, "wb") as file:
    data = blob_client.download_blob()
    file.write(data.readall())
print(f"Downloaded {BLOB_NAME} to {DEST_PATH}")

Next Steps

  • Explore additional storage API operations: delete, copy, multipart uploads, and more (see the sketch after this list).

  • Adapt these snippets and integrate them into your applications.
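
As a starting point, here is a minimal sketch of delete and copy using the S3 client configured above; it assumes the Bridge supports the standard CopyObject and DeleteObject operations:

# aws_manage.py (illustrative; assumes CopyObject/DeleteObject are supported)
from flashback_aws_config import s3_client

BUCKET_NAME = "your-bucket-name"

# Copy an object within the bucket under a new key.
s3_client.copy_object(
    Bucket=BUCKET_NAME,
    Key="file-copy.txt",
    CopySource={"Bucket": BUCKET_NAME, "Key": "file.txt"},
)

# Delete the original object.
s3_client.delete_object(Bucket=BUCKET_NAME, Key="file.txt")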
