# open-jumpco/s3-plugin

Gradle plugin that uploads and downloads S3 objects.
Simple Gradle plugin that uploads and downloads S3 objects. This is a fork of mygrocerydeals/gradle-s3-plugin, which no longer appears to be under active development. It has been updated to work with Gradle version 6 and later and converted to pure Java.
Add the following to your `build.gradle` file:

```groovy
plugins {
    id 'io.jumpco.open.gradle.s3' version '1.4.3'
}
```

See the Gradle plugin page for other versions.
When performing uploads you need to provide `s3.region` as follows:

```groovy
s3 {
    region = 'us-east-1'
}
```

By default, the S3 plugin searches for credentials in the same order as the AWS default credentials provider chain.
You can specify a profile by setting the project's `s3.profile`, or `s3.awsAccessKeyId` and `s3.awsSecretAccessKey`. The provided access key and secret will take precedence.

```groovy
s3 {
    profile = 'my-profile'
    awsAccessKeyId = '12345678'
    awsSecretAccessKey = 'my-secret'
}
```

Setting the environment variables `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` is one way to provide your S3 credentials. See the AWS documentation for details on credentials.
The `s3.region` property can optionally be set to define the AWS region if one has not been set in the authentication profile. It can also be used to override the default region set in the AWS credentials provider.

```groovy
s3 {
    region = 'us-east-1'
}
```

The `s3.bucket` property sets a default S3 bucket that is common to all tasks. This can be useful if all S3 tasks operate against the same Amazon S3 bucket.
```groovy
s3 {
    bucket = 'my.default.bucketname'
}

s3Uploads {
    jobName {
        key = 'target-filename'
        file = 'source-filename'
    }
}

s3Downloads {
    dlJob {
        keyPrefix = 'folder'
        destDir = 'targetDir'
    }
}

task myDownload(type: io.jumpco.open.gradle.s3.S3Download) {
    keyPrefix = 'folder'
    destDir = 'targetDir'
}

task myUpload(type: io.jumpco.open.gradle.s3.S3Upload) {
    key = 'target-filename'
    file = 'source-filename'
    compareContent = true
}

task myUploadFiles(type: io.jumpco.open.gradle.s3.S3Upload) {
    keyPrefix = 'target-folder'
    files = fileTree('my-folder').matching {
        include("**/*.zip")
    }
    compareContent = false
}
```

**Note:**
Use the fully qualified name for the tasks:
- `io.jumpco.open.gradle.s3.S3Upload`
- `io.jumpco.open.gradle.s3.S3Download`
Job descriptions result in generated tasks with the job name as a prefix:

- `dlJob` will result in the task `dlJobDownloadTask`
- `jobName` will result in the task `jobNameUploadTask`
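As a sketch (the job name, prefix, and directory here are illustrative assumptions), a job declaration and the task it generates:

```groovy
s3Downloads {
    dlJob {
        keyPrefix = 'folder'
        destDir = 'targetDir'
    }
}
// generates a task named dlJobDownloadTask, which can be run with:
//   ./gradlew dlJobDownloadTask
```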
## S3Upload

Uploads one or more files to S3. This task has two modes of operation: single file upload and directory upload (including recursive upload of all child subdirectories).

Properties that apply to both modes:
- `bucket` - S3 bucket to use *(optional, defaults to the project `s3` configured `bucket`)*
- `awsAccessKeyId` - AWS Access Key *(optional, defaults to the project `s3` configured `awsAccessKeyId`)*
- `awsSecretAccessKey` - AWS Access Secret *(optional, defaults to the project `s3` configured `awsSecretAccessKey`)*
- `overwrite` - *(optional, default is `false`)* if `true`, the S3 object will be overwritten if it already exists.
- `compareContent` - *(optional, default is `false`)* if `true`, the S3 object will be downloaded and compared to the local content; if they differ, the file will be uploaded.
- `skipError` - *(optional, default is `false`)* if `true`, the task will not fail on missing objects.
- `batchSize` - *(optional, default is 100)* used to improve upload speed.
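A sketch combining the common properties above (the bucket name, prefix, paths, and `batchSize` value are illustrative assumptions, not recommendations):

```groovy
task uploadArtifacts(type: io.jumpco.open.gradle.s3.S3Upload) {
    bucket = 'my.artifacts.bucket'   // hypothetical bucket; falls back to s3.bucket if omitted
    keyPrefix = 'releases'           // S3 prefix to upload under
    files = fileTree('build/libs')   // local files to upload
    overwrite = true                 // replace existing objects
    skipError = true                 // do not fail the build on missing objects
    batchSize = 50                   // tune batch size for upload speed
}
```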
For a single file upload:
- `key` - key of the S3 object to create
- `file` - path of the file to be uploaded
- `overwrite` - *(optional, default is `false`)* if `true`, the S3 object will be overwritten if it already exists
- `compareContent` - *(optional, default is `false`)* if `true`, the S3 object will be downloaded and compared to the local content; if they differ, the file will be uploaded.
By default `S3Upload` does not overwrite the S3 object if it already exists. Set `overwrite` to `true` to upload the file even if it exists.
For a directory upload:
- `keyPrefix` - root S3 prefix under which to create the uploaded contents
- `sourceDir` - local directory containing the contents to be uploaded
A directory upload will always overwrite existing content if it already exists under the specified S3 prefix.
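A minimal sketch of a directory upload using the properties above (the task name, prefix, and directory are illustrative assumptions):

```groovy
task uploadSite(type: io.jumpco.open.gradle.s3.S3Upload) {
    keyPrefix = 'site'         // S3 prefix to create the contents under
    sourceDir = 'build/site'   // local directory whose contents are uploaded recursively
}
```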
For a FileCollection upload:
- `keyPrefix` - root S3 prefix under which to create the uploaded contents
- `files` - a `FileCollection` that can be created using `files()` or `fileTree()`
## S3Download

Downloads one or more S3 objects. This task has two modes of operation: single file download and recursive download.

Properties that apply to both modes:
- `bucket` - S3 bucket to use *(optional, defaults to the project `s3` configured `bucket`)*
- `awsAccessKeyId` - AWS Access Key *(optional, defaults to the project `s3` configured `awsAccessKeyId`)*
- `awsSecretAccessKey` - AWS Access Secret *(optional, defaults to the project `s3` configured `awsSecretAccessKey`)*
For a single file download:
- `key` - key of the S3 object to download
- `file` - local path of the file to save the download to
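A minimal sketch of a single file download (the task name, key, and path are illustrative assumptions):

```groovy
task fetchConfig(type: io.jumpco.open.gradle.s3.S3Download) {
    key = 'config/app.properties'    // S3 object key to fetch
    file = 'build/app.properties'    // local path to save it to
}
```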
For a recursive download:
- `keyPrefix` - S3 prefix of the objects to download
- `destDir` - local directory to download the objects to
**Note:** Recursive downloads create a sparse directory tree containing the full `keyPrefix` under `destDir`. So with an S3 bucket containing the object keys:

```
top/foo/bar
top/README
```

a recursive download:

```groovy
s3Downloads {
    downloadRecursive {
        keyPrefix = 'top/foo/'
        destDir = 'local-dir'
    }
}
```

results in this local tree:

```
local-dir/
└── top
    └── foo
        └── bar
```

So only files under `top/foo` are downloaded, but their full S3 paths are appended to the `destDir`. This differs from the behavior of the AWS CLI `aws s3 cp --recursive` command, which prunes the root of the downloaded objects. Use the flexible Gradle `Copy` task to prune the tree after downloading it.
For example:
```groovy
def localTree = 'path/to/some/location'

task downloadRecursive(type: io.jumpco.open.gradle.s3.S3Download) {
    bucket = 's3-bucket-name'
    keyPrefix = "${localTree}"
    destDir = "${buildDir}/download-root"
}

// prune and re-root the downloaded tree, removing the keyPrefix
task copyDownload(type: Copy, dependsOn: downloadRecursive) {
    from "${buildDir}/download-root/${localTree}"
    into "${buildDir}/pruned-tree"
}
```
Downloads report percentage progress at the Gradle `INFO` level. Run Gradle with the `-i` option to see download progress.