S3 Gateway

Integrations

Using the S3 Gateway with popular tools and libraries

The Shelby S3 Gateway works with any S3-compatible tool or library. This guide covers popular integrations with step-by-step examples.

rclone

rclone is a command-line tool for managing files on cloud storage.

Configuration

Add a new remote to your rclone config (~/.config/rclone/rclone.conf):

~/.config/rclone/rclone.conf
[shelby]
type = s3
provider = Other
access_key_id = AKIAIOSFODNN7EXAMPLE
secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
endpoint = http://localhost:9000
force_path_style = true
region = shelbyland

Or configure interactively:

rclone config

Usage Examples

# List all buckets
rclone lsd shelby:

# List files in a bucket (a Shelby account address)
rclone ls shelby:0x0694a79e492d268acf0c6c0b01f42654ac050071a343ebc4226cb6717d63e4ea

# Download a file
rclone copy shelby:0x0694.../path/to/file.txt ./local-folder/

# Sync a directory
rclone sync shelby:0x0694.../data/ ./local-backup/
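
rclone can also mount a bucket as a local filesystem (this requires FUSE on Linux, macFUSE on macOS, or WinFsp on Windows). Since this guide only covers reads, a read-only mount is the safer sketch:

# Mount a bucket read-only (the mount point must already exist)
rclone mount shelby:0x0694.../ ./mnt/shelby --read-only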

DuckDB

DuckDB can query files directly from S3-compatible storage, including Parquet, CSV, and JSON files.

Configuration

-- Install and load the httpfs extension
INSTALL httpfs;
LOAD httpfs;

-- Configure S3 settings
SET s3_endpoint = 'localhost:9000';
SET s3_access_key_id = 'AKIAIOSFODNN7EXAMPLE';
SET s3_secret_access_key = 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY';
SET s3_use_ssl = false;
SET s3_url_style = 'path';
SET s3_region = 'shelbyland';
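
If you are on DuckDB 0.10 or newer, the same settings can be stored as a secret instead of session variables; a sketch (verify the parameter names against your DuckDB version):

-- Alternative: configure S3 access via the secrets manager (DuckDB 0.10+)
CREATE SECRET shelby (
    TYPE S3,
    KEY_ID 'AKIAIOSFODNN7EXAMPLE',
    SECRET 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
    ENDPOINT 'localhost:9000',
    URL_STYLE 'path',
    USE_SSL false,
    REGION 'shelbyland'
);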

Query Examples

-- Query a Parquet file
SELECT * FROM read_parquet(
  's3://0x0694a79e492d268acf0c6c0b01f42654ac050071a343ebc4226cb6717d63e4ea/data/dataset.parquet'
);

-- Query a CSV file
SELECT * FROM read_csv(
  's3://0x0694.../reports/sales.csv',
  header = true
);

-- Query multiple files with glob pattern
SELECT * FROM read_parquet('s3://0x0694.../data/*.parquet');
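
If you query the same remote file repeatedly, it can be worth materializing it into a local table first; this is plain DuckDB SQL, nothing Shelby-specific:

-- Copy a remote Parquet file into a local table for repeated queries
CREATE TABLE dataset_local AS
SELECT * FROM read_parquet('s3://0x0694.../data/dataset.parquet');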

AWS CLI

The AWS CLI works with the gateway when you point it at a custom endpoint with --endpoint-url.

Configuration

Create a profile for Shelby in ~/.aws/credentials:

~/.aws/credentials
[shelby]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

And in ~/.aws/config:

~/.aws/config
[profile shelby]
region = shelbyland
output = json
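
The examples below pass --endpoint-url explicitly. If you are on AWS CLI v2.13 or newer, you can instead set the endpoint in the profile and drop the flag (older versions ignore this setting):

~/.aws/config
[profile shelby]
region = shelbyland
output = json
endpoint_url = http://localhost:9000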

Usage Examples

# List buckets
aws --profile shelby --endpoint-url http://localhost:9000 s3 ls

# List objects in a bucket
aws --profile shelby --endpoint-url http://localhost:9000 \
  s3 ls s3://0x0694a79e492d268acf0c6c0b01f42654ac050071a343ebc4226cb6717d63e4ea/

# Download a file
aws --profile shelby --endpoint-url http://localhost:9000 \
  s3 cp s3://0x0694.../file.txt ./file.txt

# Download a file into a local directory
aws --profile shelby --endpoint-url http://localhost:9000 \
  s3 cp s3://0x0694.../data/report.pdf ./downloads/

AWS SDK (JavaScript/TypeScript)

Use the AWS SDK with a custom endpoint configuration:

import { S3Client, ListBucketsCommand, GetObjectCommand } from "@aws-sdk/client-s3";

const client = new S3Client({
  endpoint: "http://localhost:9000",
  region: "shelbyland",
  credentials: {
    accessKeyId: "AKIAIOSFODNN7EXAMPLE",
    secretAccessKey: "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
  },
  forcePathStyle: true, // Required for non-AWS endpoints
});

// List buckets
const buckets = await client.send(new ListBucketsCommand({}));
console.log(buckets.Buckets);

// Get an object
const object = await client.send(new GetObjectCommand({
  Bucket: "0x0694a79e492d268acf0c6c0b01f42654ac050071a343ebc4226cb6717d63e4ea",
  Key: "path/to/file.txt",
}));

const content = await object.Body?.transformToString();
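
transformToString buffers the whole object in memory. In Node.js, Body is a readable stream, so larger files can be piped straight to disk; a sketch reusing the client created above (the object key is illustrative):

import { createWriteStream } from "node:fs";
import { pipeline } from "node:stream/promises";
import { Readable } from "node:stream";

// Stream a large object directly to a local file instead of buffering it
const download = await client.send(new GetObjectCommand({
  Bucket: "0x0694a79e492d268acf0c6c0b01f42654ac050071a343ebc4226cb6717d63e4ea",
  Key: "path/to/large-file.bin",
}));

await pipeline(download.Body as Readable, createWriteStream("./large-file.bin"));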

AWS SDK (Python - boto3)

import boto3

# Create S3 client
s3 = boto3.client(
    's3',
    endpoint_url='http://localhost:9000',
    aws_access_key_id='AKIAIOSFODNN7EXAMPLE',
    aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
    region_name='shelbyland',
)

# List buckets
response = s3.list_buckets()
for bucket in response['Buckets']:
    print(bucket['Name'])

# Download a file
s3.download_file(
    '0x0694a79e492d268acf0c6c0b01f42654ac050071a343ebc4226cb6717d63e4ea',
    'path/to/file.txt',
    'local-file.txt'
)
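
list_buckets only enumerates buckets; to list the objects inside a bucket, use a paginator so truncated listings are handled automatically (this reuses the s3 client created above):

# List every object in a bucket, following pagination automatically
paginator = s3.get_paginator('list_objects_v2')
for page in paginator.paginate(
    Bucket='0x0694a79e492d268acf0c6c0b01f42654ac050071a343ebc4226cb6717d63e4ea'
):
    for obj in page.get('Contents', []):
        print(obj['Key'], obj['Size'])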

Cyberduck

Cyberduck is a GUI file browser for cloud storage.

Configuration

  1. Open Cyberduck and click Open Connection
  2. Select Amazon S3 from the dropdown
  3. Configure the connection:

     Field              Value
     Server             localhost
     Port               9000
     Access Key ID      AKIAIOSFODNN7EXAMPLE
     Secret Access Key  wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

  4. Click More Options and set:

    • Uncheck Use SSL
    • Set Path Style to Path
  5. Click Connect

MinIO Client (mc)

The MinIO Client is another CLI option for S3-compatible storage.

Configuration

# Add the Shelby gateway as an alias
mc alias set shelby http://localhost:9000 \
  AKIAIOSFODNN7EXAMPLE \
  wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

Usage Examples

# List buckets
mc ls shelby

# List objects
mc ls shelby/0x0694a79e492d268acf0c6c0b01f42654ac050071a343ebc4226cb6717d63e4ea

# Download a file
mc cp shelby/0x0694.../file.txt ./local-file.txt

# Get file info
mc stat shelby/0x0694.../file.txt
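
mc can also mirror an entire bucket (or a prefix) to a local directory, similar to rclone sync:

# Mirror a bucket to a local directory
mc mirror shelby/0x0694.../ ./local-backup/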

Troubleshooting

"InvalidAccessKeyId" Error

Ensure your access key ID matches exactly what's configured in shelby.config.ts.

"SignatureDoesNotMatch" Error

Common causes:

  • Region mismatch - ensure client region matches server.region in config
  • Clock skew - ensure your system clock is accurate (within 15 minutes)
  • Wrong secret key

"NoSuchBucket" Error

The bucket (Shelby account address) is not configured in your bucketProvider. Add the account address to your shelby.config.ts.

Connection Refused

Ensure the gateway is running (pnpm dev) and listening on the expected port.
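
A quick way to check is to hit the endpoint directly; any HTTP response (even an S3 error document) means something is listening, while "connection refused" means nothing is bound to that port:

# Confirm the gateway is reachable
curl -i http://localhost:9000/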

SSL/TLS Errors

For local development, use http:// instead of https://, or run the gateway with HTTPS enabled (see Configuration).