open-jumpco/s3-plugin
Simple Gradle plugin that uploads and downloads S3 objects. This is a fork of mygrocerydeals/gradle-s3-plugin, which no longer appears to be under active development. It has been updated to work with Gradle version 6 and later and converted to pure Java.

Setup

Add the following to your `build.gradle` file:

```groovy
plugins {
    id 'io.jumpco.open.gradle.s3' version '1.4.3'
}
```

Versioning

See the Gradle plugin page for other versions.

Usage

AWS Configuration

When performing uploads you need to provide `s3.region` as follows:

```groovy
s3 {
    region = 'us-east-1'
}
```

Authentication

By default, the S3 plugin searches for credentials in the same order as the AWS default credentials provider chain.

You can specify a profile by setting the project's `s3.profile`, or supply `s3.awsAccessKeyId` and `s3.awsSecretAccessKey` directly.

An explicitly provided access key and secret take precedence over the profile.

```groovy
s3 {
    profile = 'my-profile'
    awsAccessKeyId = '12345678'
    awsSecretAccessKey = 'my-secret'
}
```

Setting the environment variables `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` is one way to provide your S3 credentials. See the AWS Docs for details on credentials.
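For example, in a Unix shell the credentials can be exported before invoking any of the plugin's tasks (the key values below are placeholders):

```shell
# Placeholder credentials; replace with real values, or use a profile instead.
export AWS_ACCESS_KEY_ID=AKIAEXAMPLE
export AWS_SECRET_ACCESS_KEY=example-secret-key
```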

Amazon EC2 Region

The `s3.region` property can optionally be set to define the Amazon EC2 region if one has not been set in the authentication profile. It can also be used to override the default region set in the AWS credentials provider.

```groovy
s3 {
    region = 'us-east-1'
}
```

Default S3 Bucket

The `s3.bucket` property sets a default S3 bucket that is common to all tasks. This can be useful if all S3 tasks operate against the same Amazon S3 bucket.

```groovy
s3 {
    bucket = 'my.default.bucketname'
}
```

Sample Tasks

```groovy
s3Uploads {
    jobName {
        key = 'target-filename'
        file = 'source-filename'
    }
}

s3Downloads {
    dlJob {
        keyPrefix = 'folder'
        destDir = 'targetDir'
    }
}

task myDownload(type: io.jumpco.open.gradle.s3.S3Download) {
    keyPrefix = 'folder'
    destDir = 'targetDir'
}

task myUpload(type: io.jumpco.open.gradle.s3.S3Upload) {
    key = 'target-filename'
    file = 'source-filename'
    compareContent = true
}

task myUploadFiles(type: io.jumpco.open.gradle.s3.S3Upload) {
    keyPrefix = 'target-folder'
    files = fileTree('my-folder').matching {
        include("**/*.zip")
    }
    compareContent = false
}
```

Note

Use the fully qualified name for the tasks:

  • `io.jumpco.open.gradle.s3.S3Upload`
  • `io.jumpco.open.gradle.s3.S3Download`

Job definitions result in tasks named with the job name as a prefix:

  • dlJob will result in the task `dlJobDownloadTask`
  • jobName will result in the task `jobNameUploadTask`
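With the sample jobs above, the generated tasks can be invoked directly from the command line (a sketch; assumes the plugin is applied in the project):

```shell
./gradlew dlJobDownloadTask
./gradlew jobNameUploadTask
```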

s3Uploads

Uploads one or more files to S3. This task has three modes of operation: single file upload, directory upload (including recursive upload of all child subdirectories), and `FileCollection` upload. Properties that apply to all modes:

  • `bucket` - S3 bucket to use (optional; defaults to the project's configured `s3.bucket`)
  • `awsAccessKeyId` - AWS access key (optional; defaults to the project's configured `s3.awsAccessKeyId`)
  • `awsSecretAccessKey` - AWS access secret (optional; defaults to the project's configured `s3.awsSecretAccessKey`)
  • `overwrite` - (optional; default is `false`) if `true`, the S3 object will be overwritten if it already exists.
  • `compareContent` - (optional; default is `false`) if `true`, the S3 object will be downloaded and compared to the local content; the file is uploaded only if the content differs.
  • `skipError` - (optional; default is `false`) if `true`, the task will not fail on missing objects.
  • `batchSize` - (optional; default is 100) batch size used to improve upload speed.

For a single file upload:

  • `key` - key of the S3 object to create
  • `file` - path of the file to be uploaded
  • `overwrite` - (optional; default is `false`) if `true`, the S3 object will be overwritten if it already exists
  • `compareContent` - (optional; default is `false`) if `true`, the S3 object will be downloaded and compared to the local content; the file is uploaded only if the content differs.

By default `S3Upload` does not overwrite the S3 object if it already exists. Set `overwrite` to `true` to upload the file even if it exists.

For a directory upload:

  • `keyPrefix` - root S3 prefix under which to create the uploaded contents
  • `sourceDir` - local directory containing the contents to be uploaded

A directory upload will always overwrite existing content if it already exists under the specified S3 prefix.
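The directory mode is not covered by the samples above; a minimal sketch, assuming a hypothetical bucket name and a local `build/docs` directory:

```groovy
task uploadDocs(type: io.jumpco.open.gradle.s3.S3Upload) {
    bucket = 'my-bucket'       // hypothetical bucket; overrides the default s3.bucket
    keyPrefix = 'site/docs'    // root S3 prefix created for the upload
    sourceDir = 'build/docs'   // local directory uploaded recursively
}
```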

For a FileCollection upload:

  • `keyPrefix` - root S3 prefix under which to create the uploaded contents
  • `files` - a `FileCollection` that can be created using `files()` or `fileTree()`

s3Downloads

Downloads one or more S3 objects. This task has two modes of operation: single file download and recursive download. Properties that apply to both modes:

  • `bucket` - S3 bucket to use (optional; defaults to the project's configured `s3.bucket`)
  • `awsAccessKeyId` - AWS access key (optional; defaults to the project's configured `s3.awsAccessKeyId`)
  • `awsSecretAccessKey` - AWS access secret (optional; defaults to the project's configured `s3.awsSecretAccessKey`)

For a single file download:

  • `key` - key of the S3 object to download
  • `file` - local path of the file to save the download to
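A single-object download task might look like this (bucket, key, and path are placeholders):

```groovy
task downloadConfig(type: io.jumpco.open.gradle.s3.S3Download) {
    bucket = 'my-bucket'             // hypothetical bucket
    key = 'config/app.properties'    // S3 object to fetch
    file = 'build/app.properties'    // local path to save the download to
}
```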

For a recursive download:

  • `keyPrefix` - S3 prefix of the objects to download
  • `destDir` - local directory to download the objects to

Note:

Recursive downloads create a sparse directory tree containing the full `keyPrefix` under `destDir`. So with an S3 bucket containing the object keys:

```
top/foo/bar
top/README
```

a recursive download:

```groovy
s3Downloads {
    downloadRecursive {
        keyPrefix = 'top/foo/'
        destDir = 'local-dir'
    }
}
```

results in this local tree:

```
local-dir/
└── top
    └── foo
        └── bar
```

So only files under `top/foo` are downloaded, but their full S3 paths are appended to `destDir`. This is different from the behavior of the AWS CLI `aws s3 cp --recursive` command, which prunes the root of the downloaded objects. Use the flexible Gradle `Copy` task to prune the tree after downloading it.

For example:

```groovy
def localTree = 'path/to/some/location'

task downloadRecursive(type: io.jumpco.open.gradle.s3.S3Download) {
    bucket = 's3-bucket-name'
    keyPrefix = "${localTree}"
    destDir = "${buildDir}/download-root"
}

// prune and re-root the downloaded tree, removing the keyPrefix
task copyDownload(type: Copy, dependsOn: downloadRecursive) {
    from "${buildDir}/download-root/${localTree}"
    into "${buildDir}/pruned-tree"
}
```

Progress Reporting

Downloads report percentage progress at the Gradle INFO log level. Run Gradle with the `-i` option to see download progress.

License

MIT License
