Examples

The following examples show some of the S3Express functionality.

Buckets

Create a bucket.

mkbkt mybucket

Remove a bucket (bucket must be empty).

rmbkt mybucket

Empty a bucket (remove all objects within bucket).

del mybucket/*  (or cd mybucket followed by del *)
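To also delete objects in all subfolders of the bucket, the -s option (used with del in the Delete Objects examples further below) can be added, e.g.

del mybucket/* -s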

List Objects

List all buckets.

ls

List all objects in 'mybucket'.

ls mybucket  (or cd mybucket followed by just ls)

List all objects in 'mybucket' including in all subfolders.

ls mybucket -s  (or cd mybucket followed by just ls -s)

List all objects in 'mybucket', including in all subfolders, that have the .txt extension. Include MD5 values in the output.

ls mybucket/*.txt -s -md5

List all objects in 'mybucket', including in all subfolders, that have the .txt extension. Include MD5 values, metadata and ACL in the output, in extended format (-ext). Note that showing metadata and/or ACL is slower, as each object must be queried separately.

ls mybucket/*.txt -s -md5 -showmeta -showacl -ext

List all objects in 'mybucket', including in all subfolders, that are server-side encrypted.

ls mybucket -s -inclenc

List all objects in 'mybucket', but not in subfolders, that have the header cache-control:max-age set to 60. Show metadata in the output (-showmeta).

ls mybucket -cond:"extract_value(cache-control,'max-age')=60" -showmeta

List all objects in subfolder 'mysubfolder' of 'mybucket' that do not have the cache-control header. Show metadata in the output (-showmeta).

ls mybucket/mysubfolder/ -cond:"cache-control=''" -showmeta

List all objects in subfolder 'mysubfolder' of 'mybucket' that have the cache-control header. Show metadata in the output (-showmeta).

ls mybucket/mysubfolder/ -cond:"cache-control !='' " -showmeta

List all objects in subfolder 'mysubfolder' of 'mybucket' that are larger than 5 megabytes.

ls mybucket/mysubfolder/ -cond:"s3_sizeMB>5"

List all objects in subfolder 'mysubfolder' of 'mybucket' that are larger than 5 megabytes and have the extension .txt, .gif or .jpg.

ls mybucket/mysubfolder/ -cond:"s3_sizeMB>5" -include:"*.txt|*.jpg|*.gif"

List all objects in 'mybucket', including in all subfolders, that do not start with a, b or c (using regular expressions).

ls mybucket -s -rexclude:"^(a.*)|(b.*)|(c.*)"

Note that in the example above -rexclude matches against the object name. To match against the entire object path, use the s3_path variable, e.g.

ls mybucket -s -cond:"s3_path regex_matches '^(a.*)|(b.*)|(c.*)' = false"

List all objects in 'mybucket', including in all subfolders, that are not 'private' ('private' means that owner gets FULL_CONTROL and no one else has access rights).

ls mybucket -s -cond:"s3_acl_is_private = false"

List all objects in 'mybucket', including in all subfolders, that are not 'public-read' ('public-read' means that owner gets FULL_CONTROL and the AllUsers group gets READ access).

ls mybucket -s -cond:"s3_acl_is_public_read = false"

List all objects in 'mybucket', including in all subfolders, and group output by object's extension.

ls mybucket -s -grp:ext

Show a summary of all objects in 'mybucket', including in all subfolders, which have the cache-control:max-age value greater than 0, and group the output by cache-control header value. Do not show each object, just a summary (-sum parameter).

ls mybucket -s -sum -grp:cache-control -cond:"extract_value(cache-control,'max-age')>0"

Put Objects (Uploads)

Upload all files that are in c:\folder\ to mybucket.

put c:\folder\ mybucket

Upload the file c:\folder\file.txt to mybucket/subfolder/.

put c:\folder\file.txt mybucket/subfolder/

Upload all *.txt files in c:\folder\ to mybucket/subfolder/.

put c:\folder\*.txt mybucket/subfolder/

Upload all files in c:\folder\ and its subfolders to mybucket using 3 parallel threads and multipart uploads. Throttle bandwidth to 50Kb/s. Make all uploaded files 'public-read' and set cache-control header to max-age=60.

put c:\folder\ mybucket -s -t:3 -mul -maxb:50 -cacl:public-read -meta:"cache-control:max-age=60"

Upload all files from c:\folder\ to mybucket and apply S3 server-side encryption for all uploaded files.

put c:\folder\ mybucket -e

Upload all files from c:\folder\ to mybucket. Before uploading, apply client-side local encryption.

put c:\folder\ mybucket -le

Upload non-empty files from c:\folder\ to mybucket and keep metadata and ACL of files that are overwritten.

put c:\folder\ mybucket -cond:"size <> 0" -keep

Upload only changed or new files from c:\folder\ to mybucket and keep metadata and ACL of files that are overwritten.
Changed files are files whose content differs, i.e. they have a different MD5 hash. New files are files that are not yet present in the S3 bucket.
The options -onlynew (upload only new files), -onlynewer (upload only files that have a newer timestamp) and -onlyexisting (re-upload only files that are already present on S3) are also available.

put c:\folder\ mybucket -onlydiff -keep
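The alternative options mentioned above are used the same way. For instance, combining -onlynewer with -keep (both shown on this page) uploads only local files with a newer timestamp, preserving metadata and ACL of the S3 files that are overwritten, e.g.

put c:\folder\ mybucket -onlynewer -keep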

Upload only changed or new files from c:\folder\ to mybucket. Purge (=delete) S3 files in mybucket that are no longer present in c:\folder\. Keep console output to a minimum (-minoutput).

put c:\folder\ mybucket -s -onlydiff -purge -minoutput

Upload all *.jpg and *.gif files from c:\folder\ to mybucket, only if they already exist on S3.

put c:\folder\ mybucket -include:*.jpg|*.gif -onlyexisting

Upload all *.jpg and *.gif files from c:\folder\ to mybucket, only if the files already exist on S3. Simulation (=preview) only: show the list of files that would be uploaded.

put c:\folder\ mybucket -include:*.jpg|*.gif -onlyexisting -sim

Move all *.jpg and *.gif files from c:\folder\ to mybucket.

put c:\folder\ mybucket -include:*.jpg|*.gif -move

Delete Objects / Copy Objects

Delete files in mybucket, including subfolders, that have cache-control:max-age > 0 in their metadata.

del mybucket/* -s -cond:"extract_value(cache-control,'max-age') > 0"

Delete files in mybucket, including subfolders, that are empty.

del mybucket/* -s -cond:"size = 0"

Delete files in mybucket, including subfolders, with name starting with 'a'. Stop deleting files as soon as an error occurs.

del mybucket/* -s -cond:"name starts_with 'a'" -stoponerror

Delete previous versions of files in mybucket, including subfolders, that are older than 6 months.

del mybucket/* -s -onlyprev -cond:"s3_age_months>6"

Copy the file mybucket/myfile.txt to mybucket/subfolder/myfile.txt, copying metadata and ACL from the source to the target (-keep).

copy mybucket/myfile.txt mybucket/subfolder/myfile.txt -keep

Metadata and Permissions (ACLs)

Set the headers cache-control:max-age=60 and x-amz-meta-test:yes on all files in mybucket/subfolder/.

setmeta mybucket/subfolder/* -meta:"cache-control:max-age=60|x-amz-meta-test:yes"

Set the header x-amz-meta-test=yes on all files in mybucket/subfolder/ that have the extension *.exe or *.rpt.

setmeta mybucket/subfolder/* -meta:"x-amz-meta-test=yes" -include:"*.exe|*.rpt"

Set the server-side encryption header (= encrypt files) on all files in mybucket that are larger than 5 MB and do not have the extension *.exe or *.rpt.

setmeta mybucket/* -e:+ -cond:"size_mb > 5" -exclude:"*.exe|*.rpt"

Get the metadata of file.txt and show ALL metadata headers.

getmeta file.txt

Get the metadata of file.txt, but show only the cache-control header in the output.

getmeta file.txt -showmeta:"cache-control"

Get the metadata of file.txt, but show only the cache-control header and the x-amz-server-side-encryption header in the output.

getmeta file.txt -showmeta:"cache-control|x-amz-server-side-encryption"

Set canned ACL 'private' to all jpg files in mybucket, including in subfolders of mybucket.

setacl mybucket/*.jpg -s -cacl:private

Set canned ACL 'public-read-write' to all txt files in mybucket, including in subfolders of mybucket.

setacl mybucket/*.txt -s -cacl:public-read-write

Grant read access on all files in mybucket to emailAddress=xyz@amazon.com and emailAddress=abc@amazon.com.

setacl mybucket/* -grant-read:"emailAddress=xyz@amazon.com, emailAddress=abc@amazon.com"

Get ACL of object.txt and show AllUsers permissions in the output.

getacl object.txt -showacl:allusers

List all objects in 'mybucket', including in all subfolders, that are not 'public-read' ('public-read' means that owner gets FULL_CONTROL and the AllUsers group gets READ access). The ls command is used.

ls mybucket -s -cond:"s3_acl_is_public_read = false"

Other

Save S3 authorization in S3Express:

saveauth FASWQDSDSSSZXAS1SA AsFZEDy2BQfFSFzFfgKyyOF/xCaRcK4RMc