S3Express is a command line software utility for Windows. Easily upload, query, and back up files and folders to Amazon S3 storage based on multiple flexible criteria. Create custom batch scripts, list Amazon S3 files or entire folders, filter them with conditions, run queries, and change object metadata and ACLs. Quickly upload only new or changed files using multipart uploads and concurrent threads.
S3Express Main Features
Very compact, small footprint: the entire program is less than 5 MB. Self-contained in a single executable, S3Express requires no additional libraries or software to run.
List S3 objects with conditional filters - manage S3 objects' metadata and ACLs - upload files using multipart uploads and multiple concurrent threads - upload only new or changed files (incremental backup) - delete multiple S3 objects - copy S3 objects - server-side and local encryption supported - and much more...
S3Express is ideal for batch scripts, automated uploads and backups to Amazon S3, and for performing custom queries on Amazon S3 objects.
All S3Express operations are multithreaded (fast), automatically retried (network-failure resistant) and interruptible (all commands can be stopped and restarted at any time). Connections to Amazon S3 are made using secure HTTP (HTTPS), an encrypted version of the HTTP protocol, to protect your files while they're in transit to and from Amazon S3. You will not find many S3 command line tools that can do that!
Works on all versions of Windows including all Windows Servers.
It can easily handle several million files and files as large as 100GB or more in multipart mode.
Sold, actively supported and maintained by TGRMN Software.
It's Easy and Quick to Install. Download the free 21-day trial and start using S3Express today.
Latest Version: 1.5.10 - July 9, 2019 - Release History
System Requirements: All Windows versions supported, including Windows 10, Windows 8 (and 8.1), Windows 7, Windows Vista, Windows XP, Windows Server 2019, 2016, 2012, 2008, 2003, incl. R2 releases.
Detailed Feature List
· All S3Express operations are multithreaded to achieve maximum speed.
· All S3Express operations are automatically retried on error (after X seconds and up to X times; both customizable).
· All S3Express operations are interruptible and restartable by simply pressing the 'ESC' key.
· All S3Express connections to Amazon S3 are made using secure HTTP (HTTPS) to protect your files while they're in transit to and from Amazon S3 servers.
· Command line variables supported.
· Scripting via the command line supported.
· Unicode compatible: S3Express supports file and object names in all the world's alphabets and scripts.
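The automatic-retry behavior described above can be sketched in Python. This is an illustrative sketch only, not S3Express internals; the function and parameter names here are made up, and the retry delay and count stand in for the "X seconds / X times" settings mentioned above:

```python
import time

def with_retries(operation, retries=3, delay_seconds=2):
    """Run `operation`, retrying on failure up to `retries` extra times,
    waiting `delay_seconds` between attempts (both values customizable,
    mirroring the retry settings described above)."""
    last_error = None
    for attempt in range(1 + retries):
        try:
            return operation()
        except Exception as err:  # a real tool would catch network errors only
            last_error = err
            if attempt < retries:
                time.sleep(delay_seconds)
    raise last_error

# Example: a hypothetical upload that fails twice, then succeeds.
calls = {"n": 0}
def flaky_upload():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network failure")
    return "uploaded"

print(with_retries(flaky_upload, retries=3, delay_seconds=0))  # → uploaded
```

Because the retry loop wraps the whole operation, a transient network failure is invisible to the calling script as long as one of the attempts succeeds.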
List S3 Objects
· List objects in one or more S3 buckets and optionally show metadata and ACL for each object.
· List objects only in a specified subfolder or recursively list all objects in all subfolders.
· Optionally include all versions of an object in the listing for buckets that have versioning enabled.
· Include / exclude objects from the listing based on name, size, metadata, ACL, storage-class, encryption status, etc.
· Filter listing using regular expressions or basic wildcards.
· Show listing summary only and group S3 objects by extension, date, subfolder and more.
· For example, list all objects with the 'cache-control' header not set, or with 'cache-control' not equal to a specified value.
· For example, list only public objects, only private objects, or only objects with a specified ACL.
· For example, list all objects whose size is larger than a specified value.
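The conditional filtering described above can be sketched with Python's standard library. The object records and field names below are made up for illustration (S3Express reads this information from S3 itself), but the wildcard, regex, size, and ACL conditions mirror the listing filters described above:

```python
import fnmatch
import re

# A hypothetical in-memory listing, for illustration only.
objects = [
    {"key": "img/photo1.jpg",  "size": 5_242_880, "acl": "public-read"},
    {"key": "img/photo2.png",  "size": 120_000,   "acl": "private"},
    {"key": "docs/report.pdf", "size": 9_000_000, "acl": "private"},
]

def filter_listing(objs, name_glob=None, name_regex=None, min_size=None, acl=None):
    """Keep only objects matching every supplied condition:
    wildcard name, regular-expression name, minimum size, ACL."""
    result = []
    for o in objs:
        if name_glob and not fnmatch.fnmatch(o["key"], name_glob):
            continue
        if name_regex and not re.search(name_regex, o["key"]):
            continue
        if min_size is not None and o["size"] <= min_size:
            continue
        if acl and o["acl"] != acl:
            continue
        result.append(o)
    return result

# List only objects larger than 1 MiB...
large = filter_listing(objects, min_size=1_048_576)
# ...or only private PDFs, selected by basic wildcard.
private_pdfs = filter_listing(objects, name_glob="*.pdf", acl="private")
```

Combining several independent conditions this way is what lets a single listing command answer questions like "which objects are both public and larger than a given size".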
Upload Files to S3
· Upload multiple files and whole directories to Amazon S3.
· Uploads are fully restartable in case of failure.
· Uploads can be automatically retried on error (after X seconds and up to X times; both customizable).
· Optimized parallel file transfers (multiple threads) to speed-up uploads.
· Upload files using multipart uploads (with correct MD5 value) and multiple threads, so that large uploads can be restarted at any time from where they were left, if interrupted.
· Server-side and/or client-side file encryption supported.
· Keep the existing ACLs and/or metadata when overwriting existing S3 files.
· Throttle maximum bandwidth to use in Kilobytes per second.
· Select files to upload based on name, extension, size, subfolder, time, etc.
· Copy objects instead of re-uploading when a matching object is already on S3, so that renaming a file does not require re-uploading it.
· Move local files to Amazon S3 after successful upload, based on flexible criteria, e.g. file age, name, size etc.
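The "multipart uploads (with correct MD5 value)" bullet above refers to Amazon S3's documented ETag convention for multipart uploads: the ETag is not the MD5 of the whole file, but the MD5 of the concatenated per-part MD5 digests, followed by a dash and the part count. The sketch below computes that value locally; it illustrates S3's convention, not S3Express's internal implementation:

```python
import hashlib

def multipart_etag(data: bytes, part_size: int) -> str:
    """Compute the ETag Amazon S3 assigns to a multipart upload:
    MD5 of the concatenated per-part MD5 digests, plus '-<part count>'."""
    part_digests = [
        hashlib.md5(data[i:i + part_size]).digest()
        for i in range(0, len(data), part_size)
    ]
    combined = hashlib.md5(b"".join(part_digests)).hexdigest()
    return f"{combined}-{len(part_digests)}"

payload = b"x" * (10 * 1024 * 1024)              # 10 MiB of data
etag = multipart_etag(payload, 5 * 1024 * 1024)  # uploaded in 5 MiB parts
print(etag)  # 32 hex digits, then '-2' for the two parts
```

Because each part has its own checksum, an interrupted multipart upload can be verified part by part and resumed from the last completed part rather than restarted from zero.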
Incremental Backup to S3
· Upload only new and changed files. With this type of upload you can perform fast, incremental backups to S3: only files that are new or have changed, compared to the files already on S3, are uploaded. If a file is renamed locally, the corresponding S3 file is copied rather than re-uploaded, saving time and bandwidth.
· Optionally, if a file is deleted locally, the corresponding S3 file can be removed or archived.
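The selection step behind an incremental backup can be sketched as follows. The data structures and the function name are hypothetical; comparing size and modification time against the remote listing is one common "changed" test (comparing checksums is another), and S3Express's actual comparison criteria may differ:

```python
def select_for_upload(local, remote):
    """Return names of local files that are new or changed relative to the
    remote listing. Each mapping is name -> (size, mtime)."""
    to_upload = []
    for name, (size, mtime) in local.items():
        if name not in remote:
            to_upload.append(name)           # new file: not on S3 yet
        elif remote[name] != (size, mtime):
            to_upload.append(name)           # changed file: size/mtime differ
    return sorted(to_upload)

local_files  = {"a.txt": (100, 1111), "b.txt": (200, 2222), "c.txt": (300, 3333)}
remote_files = {"a.txt": (100, 1111), "b.txt": (250, 2223)}

print(select_for_upload(local_files, remote_files))  # → ['b.txt', 'c.txt']
```

Unchanged files ('a.txt' above) are skipped entirely, which is what makes repeated backups of a large tree fast: only the delta is transferred.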
Manage S3 Metadata
· Show all or just specific object metadata.
· Set, reset, replace one or multiple objects' metadata.
· Preview all operations before proceeding.
· For example, set multiple objects' 'cache-control' header to a certain value.
· For example, apply server-side encryption to existing S3 objects.
Manage S3 ACLs
· Show all or specific object ACLs.
· Set, reset, replace one or multiple objects' ACLs.
· Preview all operations before proceeding.
· For example, set multiple objects to public access.
· For example, set objects whose name starts with 'R' to private access.
· For example, make sure all objects in a folder or bucket are set to private access.
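The "preview before proceeding" workflow for ACL changes, such as the starts-with-'R' example above, can be sketched as a plan step separated from the apply step. The function below is a hypothetical illustration (only the canned ACL name 'private' is real Amazon S3 vocabulary):

```python
def plan_acl_changes(keys, prefix, new_acl):
    """Build a preview of key -> ACL assignments for keys starting with
    `prefix`, to be reviewed before any change is actually applied."""
    return {k: new_acl for k in keys if k.startswith(prefix)}

keys = ["Readme.txt", "Report.pdf", "data.csv"]
plan = plan_acl_changes(keys, "R", "private")
print(plan)  # → {'Readme.txt': 'private', 'Report.pdf': 'private'}
```

Separating planning from applying is what makes a bulk ACL change safe: the full list of affected objects is visible before a single permission is touched.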
Delete S3 Files
· Delete one or multiple files from Amazon S3.
· Filter files to delete based on name, extension, size, ACL, metadata, time.
· Stop on first error.
· Preview before deleting.
· Multithreaded deletion for maximum speed.
· See the del (delete) command.
Copy S3 Files
· Copy Amazon S3 objects.
· Keep metadata and ACLs of copied objects.
· See the copy command.
Restore S3 Files from GLACIER
· Restore archived objects from GLACIER to S3.
· See the restore command.
... and much more!