ACS + Rclone Quickstart Guide

Learn how to upload and download files to ACS Object Storage using Rclone.

Overview

This guide walks through installing Rclone, configuring it for ACS, and then using it to upload, download, list, delete, and sync files between your machine and ACS Object Storage.

Rclone is an open-source command-line tool that works with over 70 storage providers, including any S3-compatible system like ACS. With Rclone, you can quickly transfer large datasets, checkpoints, and logs, sync local directories with ACS buckets, and automate backups or workflows without extra SDKs.

Setup (One-Time)

1. Install Rclone

install.sh
# On Ubuntu/Debian (distribution packages may lag behind the latest release)
sudo apt update && sudo apt install rclone
# Or install the latest release with rclone's official script:
# curl https://rclone.org/install.sh | sudo bash
# Verify installation
rclone version

2. Configure ACS Connection

config.sh
# Create config directory
mkdir -p ~/.config/rclone
# Create ACS configuration
cat > ~/.config/rclone/rclone.conf << 'EOF'
[acs]
type = s3
provider = Other
access_key_id = <YOUR-ACCESS-KEY-ID>
secret_access_key = <YOUR-SECRET-ACCESS-KEY>
endpoint = https://acceleratedprod.com
region = global
v2_auth = false
force_path_style = false
EOF

Replace <YOUR-ACCESS-KEY-ID> and <YOUR-SECRET-ACCESS-KEY> with your ACS credentials.
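If you would rather keep credentials out of the config file, rclone can also read a remote's settings from RCLONE_CONFIG_&lt;REMOTE&gt;_&lt;OPTION&gt; environment variables. A sketch defining the same acs remote that way (note the placeholders must be quoted in a shell, since bare angle brackets are redirections):

```shell
# Define the "acs" remote via environment variables instead of rclone.conf;
# rclone maps RCLONE_CONFIG_ACS_<OPTION> onto the [acs] config keys.
export RCLONE_CONFIG_ACS_TYPE=s3
export RCLONE_CONFIG_ACS_PROVIDER=Other
export RCLONE_CONFIG_ACS_ACCESS_KEY_ID='<YOUR-ACCESS-KEY-ID>'
export RCLONE_CONFIG_ACS_SECRET_ACCESS_KEY='<YOUR-SECRET-ACCESS-KEY>'
export RCLONE_CONFIG_ACS_ENDPOINT=https://acceleratedprod.com

# The remote then works exactly like the file-based one:
# rclone lsd acs:
```

This is convenient in CI jobs, where secrets are usually injected as environment variables anyway.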

3. Test Connection

test.sh
# List available buckets
rclone lsd acs:

If successful, you'll see your ACS buckets.

Uploading Files

Transfer files from your local machine to ACS buckets. Use copy to upload while keeping the original files intact.

Single File Upload

Upload a single file to your ACS bucket with progress tracking and optimized transfer settings.

upload_file.sh
rclone copy /path/to/file.bin acs:your-bucket/ \
--ignore-checksum \
--progress \
--transfers 8 \
--stats 30s
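Rclone retries failed operations on its own (the --retries flag, default 3), but uploads launched from scripts sometimes need to survive a crashed process as well. A minimal generic wrapper, assuming nothing beyond a POSIX-ish shell (retry_cmd is a local helper, not an rclone feature, and the bucket name in the usage line is a placeholder):

```shell
# retry_cmd ATTEMPTS DELAY CMD... — run CMD up to ATTEMPTS times,
# sleeping DELAY seconds between tries; returns 0 on first success.
retry_cmd() {
  local attempts=$1 delay=$2 i
  shift 2
  for i in $(seq 1 "$attempts"); do
    "$@" && return 0
    echo "attempt $i failed; retrying in ${delay}s" >&2
    sleep "$delay"
  done
  return 1
}

# Example (bucket name is a placeholder):
# retry_cmd 3 30 rclone copy /path/to/file.bin acs:your-bucket/ --progress
```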

Directory Upload

Upload an entire directory and its contents recursively with higher concurrency for faster transfers.

upload_dir.sh
rclone copy /path/to/directory/ acs:your-bucket/ \
--ignore-checksum \
--progress \
--transfers 16 \
--stats 30s

Downloading Files

Retrieve files from ACS buckets to your local machine. Downloads preserve modification times, and re-running the same copy skips files that are already up to date, so an interrupted transfer can be resumed at the file level.

Single File Download

Download a specific file from your ACS bucket to a local directory.

download_file.sh
rclone copy acs:your-bucket/file.bin /local/path/ \
--ignore-checksum \
--progress \
--transfers 8 \
--stats 30s

Directory Download

Download an entire directory structure from ACS with all subdirectories and files.

download_dir.sh
rclone copy acs:your-bucket/ /path/to/directory/ \
--ignore-checksum \
--progress \
--transfers 16 \
--stats 30s

Listing Files and Directories

Explore and inspect your ACS storage contents. List commands help you navigate buckets and understand storage structure.

List Buckets

Show all available buckets in your ACS account.

list_buckets.sh
rclone lsd acs:

List Files in a Bucket

Display files and directories within a specific bucket using different detail levels.

list_files.sh
rclone ls acs:your-bucket/
rclone lsl acs:your-bucket/ # Detailed (size, date)
rclone lsd acs:your-bucket/ # Directories only
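The long listing is also easy to post-process: the first column of rclone lsl is the object size in bytes, so a short awk pipeline totals a bucket's usage. The sample lines below stand in for real lsl output, and total_bytes is just a local helper, not an rclone command:

```shell
# Sum the size column of `rclone lsl` output (bytes are in column 1).
total_bytes() { awk '{ sum += $1 } END { print sum + 0 }'; }

# Sample lines in lsl format; in practice run:
#   rclone lsl acs:your-bucket/ | total_bytes
printf '%s\n' \
  '  1048576 2024-01-02 10:00:00.000000000 data/a.bin' \
  '   524288 2024-01-02 10:05:00.000000000 data/b.bin' \
  | total_bytes
# → 1572864
```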

List Files with Filters

Find specific files using patterns, age, or size criteria to narrow down large datasets.

list_filtered.sh
rclone ls acs:your-bucket/ --include "*.pt"
rclone ls acs:your-bucket/ --max-age 7d
rclone ls acs:your-bucket/ --min-size 1G
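When the same patterns are reused across commands, they can live in a filter file and be applied with rclone's --filter-from flag. Rules are evaluated top-down and the first match wins; the file name and patterns below are only examples:

```text
# filters.txt — keep checkpoints, drop everything else
+ *.pt
+ *.ckpt
- *
```

Apply it with, for example, rclone ls acs:your-bucket/ --filter-from filters.txt; the same flag works for copy, sync, and delete.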

Deleting Files and Directories

Remove unwanted files and directories from your ACS storage. Use caution with delete operations as they are permanent.

Delete Single File

Remove a specific file from your ACS bucket permanently.

delete_file.sh
rclone deletefile acs:your-bucket/path/to/file.bin

Delete Directory

Remove directories and their contents. Use 'delete' to keep the directory or 'purge' to remove everything.

delete_dir.sh
rclone delete acs:your-bucket/ # Delete contents only
rclone purge acs:your-bucket/ # Delete directory + contents

Delete with Filters

Selectively delete files based on patterns, age, or other criteria to clean up storage efficiently.

delete_filtered.sh
rclone delete acs:your-bucket/logs/ --include "*.log" --min-age 30d
rclone delete acs:your-bucket/ --include "*.tmp"
rclone rmdirs acs:your-bucket/ # Remove empty dirs

Safe Delete

Preview what will be deleted before executing. Always use --dry-run first to avoid accidental data loss.

safe_delete.sh
rclone delete acs:your-bucket/ --dry-run -v
rclone delete acs:your-bucket/ # Run for real

Syncing Local and Remote Directories

Keep directories synchronized between your local machine and ACS. Sync operations can be one-way or bidirectional.

One-Way Sync (Local → ACS)

Make ACS match your local directory exactly. Files not in the source will be deleted from ACS.

sync_one_way.sh
rclone sync /path/to/directory/ acs:your-bucket/ \
--ignore-checksum \
--progress \
--transfers 8 \
--stats 30s

One-Way Sync (ACS → Local)

Download and sync from ACS to your local directory. Local files will be updated to match ACS exactly.

sync_from_acs.sh
rclone sync acs:your-bucket/ /path/to/directory/ \
--ignore-checksum \
--progress \
--transfers 8 \
--stats 30s

Two-Way Sync

Synchronize changes in both directions. Files added, modified, or deleted on either side are reflected on the other, keeping both locations in sync. Note that the very first bisync run must include the --resync flag to establish a baseline; later runs can drop it.

sync_two_way.sh
rclone bisync /path/to/directory/ acs:your-bucket/ \
--ignore-checksum \
--progress

Sync with Exclusions

Synchronize directories while excluding specific file types or directories like temporary files and caches.

sync_exclude.sh
rclone sync /path/to/directory/ acs:your-bucket/ \
--ignore-checksum \
--progress \
--exclude "*.tmp" \
--exclude "*.log" \
--exclude ".cache/" \
--exclude "__pycache__/"
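The overview mentioned automating backups, and a sync like the one above drops naturally into cron. An example crontab entry (the schedule, bucket name, and log path are all assumptions to adapt):

```text
# crontab entry (edit with `crontab -e`): sync ~/experiments to ACS
# every night at 02:00, logging to a file for later inspection.
0 2 * * * /usr/bin/rclone sync /home/user/experiments/ acs:your-bucket/backups/ --log-file /home/user/rclone-sync.log --log-level INFO
```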

Transferring Data from Another Object Store to ACS

Migrate data directly between different cloud storage providers without downloading to local storage first.

Setup Multiple Storage Endpoints

Configure both your source storage and ACS as separate remotes in the same Rclone config; Rclone can then copy data between any two configured remotes.

multi_config.conf
# Add to ~/.config/rclone/rclone.conf
[source]
type = s3
provider = Other
access_key_id = YOUR_SOURCE_ACCESS_KEY
secret_access_key = YOUR_SOURCE_SECRET_KEY
endpoint = YOUR_SOURCE_ENDPOINT_URL
[acs]
type = s3
provider = Other
access_key_id = YOUR_ACS_ACCESS_KEY
secret_access_key = YOUR_ACS_SECRET_KEY
endpoint = https://acceleratedprod.com

Transfer Data Between Storage Systems

Copy data directly from one cloud provider to ACS without intermediate local storage.

transfer.sh
rclone copy source:bucket/path/ acs:my-bucket/destination/ \
--ignore-checksum \
--progress \
--transfers 8 \
--stats 30s

Performance Optimization

Tune Rclone settings for maximum transfer speed based on your file sizes and network conditions.
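A reasonable starting point is to pick flags from the average object size: multi-threaded transfers for a few large objects, wide parallelism for many small ones. The helper and thresholds below are illustrative assumptions, not rclone defaults; suggest_flags is a local function:

```shell
# suggest_flags TOTAL_BYTES FILE_COUNT — print tuning flags based on
# the average object size. Thresholds are illustrative assumptions.
suggest_flags() {
  local total_bytes=$1 file_count=$2 avg
  [ "$file_count" -gt 0 ] || file_count=1
  avg=$(( total_bytes / file_count ))
  if [ "$avg" -ge $(( 1024 * 1024 * 1024 )) ]; then          # >= 1 GiB average
    echo "--transfers 8 --buffer-size 128M --multi-thread-streams 4"
  elif [ "$avg" -le $(( 1024 * 1024 )) ]; then               # <= 1 MiB average
    echo "--transfers 32 --checkers 64 --buffer-size 16M"
  else
    echo "--transfers 16 --buffer-size 32M"
  fi
}

# 50 GiB across 10 files -> large-file settings
suggest_flags $(( 50 * 1024 * 1024 * 1024 )) 10
# → --transfers 8 --buffer-size 128M --multi-thread-streams 4
```

You can get the inputs from rclone size acs:your-bucket/, which reports both the object count and the total bytes.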

For Large Files (>10GB)

Optimize for large file transfers with larger buffers and multi-threaded streams for maximum throughput.

large_files.sh
rclone copy source dest \
--ignore-checksum \
--progress \
--transfers 8 \
--buffer-size 128M \
--multi-thread-streams 4 \
--stats 30s

For Many Small Files

Optimize for many small files with higher concurrency and more checker threads to reduce overhead.

small_files.sh
rclone copy source dest \
--ignore-checksum \
--progress \
--transfers 32 \
--checkers 64 \
--buffer-size 16M \
--stats 30s

Max Performance

Aggressive settings for maximum speed when network and system resources allow. Use with caution on constrained systems.

max_performance.sh
rclone copy source dest \
--ignore-checksum \
--progress \
--transfers 32 \
--checkers 64 \
--buffer-size 128M \
--multi-thread-streams 8 \
--use-mmap \
--stats 10s