# awsS3Upload

Uploads a specified file or directory into a given AWS S3 bucket.
## Description

Uploads a specified file or directory as S3 objects into a given AWS S3 bucket. If an uploaded file already exists in the S3 bucket, it is overwritten with the latest version.
## Usage

We recommend defining the values of step parameters in a `.pipeline/config.yml` file. In that case, calling the step is essentially reduced to defining the step name.

The step can be called either in an orchestrator-specific way (e.g. via a Jenkins library step) or on the command line.

```groovy
library('piper-lib-os')
awsS3Upload script: this
```

```sh
piper awsS3Upload
```
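As a sketch, a minimal `.pipeline/config.yml` for this step could look like the following (the credentials ID and file path are placeholder values):

```yaml
steps:
  awsS3Upload:
    awsCredentialsId: "AWS_Credentials"  # placeholder Jenkins credentials ID
    filePath: "test.txt"                 # placeholder file or directory to upload
```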
## Prerequisites

- Before you can use the `awsS3Upload` step, you must have an Amazon account. See How do I create and activate a new AWS account? for details.
- You will need AWS access keys for your S3 bucket. Access keys consist of an access key ID and a secret access key, which are used to sign the programmatic requests that you make to AWS. You can create them using the AWS Management Console.
- The access keys must allow the action "s3:PutObject" on the specified S3 bucket.
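As a sketch, an IAM policy granting the required action could look like the following (the bucket name is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-example-bucket/*"
    }
  ]
}
```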
## Set up the AWS Credentials

To make your AWS credentials available to the Jenkins library, store them as Jenkins credentials of type "Secret Text". The "Secret Text" must be in JSON format and contain "access_key_id", "secret_access_key", "bucket", and "region".

For example:
```json
{
  "access_key_id": "FJNAKNCLAVLRNBLAVVBK",
  "bucket": "vro-artloarj-ltnl-nnbv-ibnh-lbnlsnblltbn",
  "secret_access_key": "123467895896646438486316436kmdlcvreanvjk",
  "region": "eu-central-1"
}
```
Additional keys in the JSON string are not a problem: they are automatically detected and skipped.
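This behavior can be illustrated with a short Python sketch (the credential values below are placeholders, not real keys): only the required keys are read, and any extra keys are ignored.

```python
import json

# Hypothetical credentials JSON containing one extra key.
raw = """{
  "access_key_id": "EXAMPLEKEYID",
  "secret_access_key": "examplesecretkey",
  "bucket": "my-example-bucket",
  "region": "eu-central-1",
  "comment": "extra keys like this one are simply skipped"
}"""

REQUIRED = ("access_key_id", "secret_access_key", "bucket", "region")

data = json.loads(raw)
# Keep only the required keys; anything else is ignored.
creds = {key: data[key] for key in REQUIRED}
print(sorted(creds))
```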
## About Files/Directories to Upload

With the `awsS3Upload` step you can upload single files as well as whole directories into your S3 bucket. File formats do not matter, and directory structures are preserved.

Note: File paths must be specified in UNIX format, i.e. with "/" as the path separator.
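If your paths come from a Windows environment, you can normalize the separators before passing them to the step. A minimal Python sketch (the path is a hypothetical example):

```python
from pathlib import PureWindowsPath

# Hypothetical Windows-style path; the step expects "/" as the separator.
win_path = r"build\artifacts\app.mtar"
unix_path = PureWindowsPath(win_path).as_posix()
print(unix_path)  # build/artifacts/app.mtar
```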
## Parameters

### Overview - Step

| Name | Mandatory | Additional information |
| --- | --- | --- |
| filePath | yes | |
| jsonCredentialsAWS | (yes) | pass via ENV or Jenkins credentials (`awsCredentialsId`) |
| script | (yes) | reference to Jenkins main pipeline script |
| verbose | no | activates debug output |
### Overview - Execution Environment

Orchestrator-specific only

These parameters are relevant for orchestrator usage and are not considered when using the command-line option.

| Name | Mandatory | Additional information |
| --- | --- | --- |
### Details

#### filePath

Name/path of the file which should be uploaded.

| Scope | Details |
| --- | --- |
| Aliases | - |
| Type | string |
| Mandatory | yes |
| Default | `$PIPER_filePath` (if set) |
| Secret | no |
| Configuration scope | |
| Resource references | commonPipelineEnvironment: reference to: `mtarFilePath` |
#### jsonCredentialsAWS

JSON string credentials to access the AWS S3 bucket.

| Scope | Details |
| --- | --- |
| Aliases | - |
| Type | string |
| Mandatory | yes |
| Default | `$PIPER_jsonCredentialsAWS` (if set) |
| Secret | yes |
| Configuration scope | |
| Resource references | Jenkins credential id: `awsCredentialsId` |
#### script

Jenkins-specific: used for proper environment setup.

The common script environment of the running Jenkinsfile. Typically, the reference to the script calling the pipeline step is provided with the `this` parameter, as in `script: this`. This allows the function to access the `commonPipelineEnvironment` for retrieving, e.g., configuration parameters.

| Scope | Details |
| --- | --- |
| Aliases | - |
| Type | Jenkins Script |
| Mandatory | yes |
| Default | |
| Secret | no |
| Configuration scope | |
| Resource references | none |
#### verbose

Verbose output.

| Scope | Details |
| --- | --- |
| Aliases | - |
| Type | bool |
| Mandatory | no |
| Default | `false` |
| Possible values | `true`, `false` |
| Secret | no |
| Configuration scope | |
| Resource references | none |
#### awsCredentialsId

Jenkins-specific: used for proper environment setup. See using credentials for details.

Jenkins "Secret Text" credentials ID containing the JSON to authenticate to the AWS S3 bucket.

| Scope | Details |
| --- | --- |
| Aliases | - |
| Type | string |
| Configuration scope | |
## Example

```groovy
awsS3Upload(
    script: this,
    awsCredentialsId: "AWS_Credentials",
    filePath: "test.txt"
)
```