S3 bucket permissions #750


Merged
NimRegev merged 2 commits into master from s3-bucket-permissions
Jul 4, 2023
4 changes: 3 additions & 1 deletion in _data/home-content.yml
@@ -46,7 +46,9 @@
- title: Docker Registries
localurl: /docs/integrations/docker-registries/
- title: Secret Storage
localurl: /docs/integrations/secret-storage/
localurl: /docs/integrations/secret-storage/
- title: Cloud Storage
localurl: /docs/integrations/cloud-storage/
- title: Helm
localurl: /docs/integrations/helm/
- title: Argo CD
2 changes: 2 additions & 0 deletions in _data/nav.yml
@@ -254,6 +254,8 @@
url: "/digital-ocean-container-registry"
- title: Other Registries
url: "/other-registries"
- title: Cloud Storage
url: "/cloud-storage"
- title: Secret Storage
url: "/secret-storage"
- title: Hashicorp Vault
56 changes: 39 additions & 17 deletions in _docs/integrations/amazon-web-services.md
@@ -7,22 +7,29 @@ toc: true

Codefresh has native support for AWS in the following areas:

- [Connecting to Amazon registries]({{site.baseurl}}/docs/integrations/docker-registries/amazon-ec2-container-registry/)
- [Deploying to Amazon EKS]({{site.baseurl}}/docs/integrations/kubernetes/#adding-eks-cluster)
- [Using Amazon S3 for Test reports]({{site.baseurl}}/docs/testing/test-reports/#connecting-an-s3-bucket)
- [Using Amazon S3 for Helm charts]({{site.baseurl}}/docs/deployments/helm/helm-charts-and-repositories/)
- [Amazon container registries: ECR](#amazon-container-registries)
- [Amazon Kubernetes clusters: EKS](#amazon-kubernetes-clusters)
- Amazon S3 buckets:
- [For Test reports](#amazon-s3-bucket-for-test-reports)
- [For Helm charts](#amazon-s3-bucket-for-helm-charts)

See also [other Amazon deployments](#other-amazon-deployments).

## Using Amazon ECR
## Amazon Container Registries

Amazon Container Registries are fully compliant with the Docker registry API that Codefresh follows. Follow the instruction under [Amazon EC2 Container Registry]({{site.baseurl}}/docs/integrations/docker-registries/amazon-ec2-container-registry/) to connect.
Amazon Container Registries are fully compliant with the Docker registry API that Codefresh follows.

Codefresh supports integration with Amazon ECR.
To connect, follow the instructions described in [Amazon EC2 Container Registry]({{site.baseurl}}/docs/integrations/docker-registries/amazon-ec2-container-registry/).

Once the registry is added, you can use the [standard push step]({{site.baseurl}}/docs/pipelines/steps/push/) in your pipelines. See [working with Docker registries]({{site.baseurl}}/docs/ci-cd-guides/working-with-docker-registries/) for more information.
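
As an illustration, a push step referencing a connected ECR integration might look like the sketch below (the names `my-ecr-registry` and `build_step` are placeholders for your own integration and build-step names):

```yaml
push_to_ecr:
  title: Pushing image to ECR
  type: push
  # reference to the build step that produced the candidate image
  candidate: ${{build_step}}
  tag: latest
  # name of the ECR registry integration as configured in Codefresh
  registry: my-ecr-registry
```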

## Deploying to Amazon Kubernetes
## Amazon Kubernetes clusters

Codefresh has native support for connecting an EKS cluster in the [cluster configuration screen]({{site.baseurl}}/docs/integrations/kubernetes/#connect-a-kubernetes-cluster).
Codefresh has native support for connecting an EKS cluster through the integration options for Kubernetes in Pipeline Integrations.
See [Adding an EKS cluster]({{site.baseurl}}/docs/integrations/kubernetes/#adding-eks-cluster) in [Kubernetes pipeline integrations]({{site.baseurl}}/docs/integrations/kubernetes/).

<!-- ask Kostis which is correct?
{%
include image.html
lightbox="true"
@@ -32,12 +39,24 @@ alt="Connecting an Amazon cluster"
caption="Connecting a Amazon cluster"
max-width="40%"
%}

-->
{%
include image.html
lightbox="true"
file="/images/integrations/kubernetes/eks-cluster-option.png"
url="/images/integrations/kubernetes/eks-cluster-option.png"
alt="Connecting an Amazon EKS cluster"
caption="Connecting an Amazon EKS cluster"
max-width="40%"
%}
Once the cluster is connected, you can use any of the [available deployment options]({{site.baseurl}}/docs/deployments/kubernetes/) for Kubernetes clusters. You also get access to all other Kubernetes dashboards such as the [cluster dashboard]({{site.baseurl}}/docs/deployments/kubernetes/manage-kubernetes/) and the [environment dashboard]({{site.baseurl}}/docs/deployments/kubernetes/environment-dashboard/).
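
As a sketch, once the EKS cluster is connected, a pipeline deploy step can target it by the name you gave it during integration (all names below, such as `my-eks-cluster` and `my-service`, are placeholders):

```yaml
deploy_to_eks:
  title: Deploying to EKS
  type: deploy
  kind: kubernetes
  # the name you gave the EKS cluster when connecting it in Codefresh
  cluster: my-eks-cluster
  namespace: default
  # the Kubernetes service to update with the new image
  service: my-service
  candidate:
    image: ${{build_step}}
    registry: my-ecr-registry
```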

## Storing test reports in Amazon S3 bucket
## Amazon S3 bucket for test reports

Codefresh has native support for storing test reports in different storage buckets, including Amazon S3.
You can connect an Amazon S3 storage account to Codefresh through the Cloud Storage options in Pipeline Integrations.


Codefresh has native support for test reports. You can store the reports on Amazon S3.

{% include
image.html
@@ -49,11 +68,15 @@ caption="Amazon cloud storage"
max-width="60%"
%}

See the full documentation for [test reports]({{site.baseurl}}/docs/testing/test-reports/).
For detailed instructions on setting up the integration with your Amazon S3 storage account in Codefresh, see [Cloud storage integrations for pipelines]({{site.baseurl}}/docs/integrations/cloud-storage/). To create and store test reports through Codefresh pipelines, see [Creating test reports]({{site.baseurl}}/docs/testing/test-reports/).

## Amazon S3 bucket for Helm charts

You can also connect an Amazon S3 bucket as a Helm repository through the Helm Repository integration options in Pipeline Integrations.

## Using Amazon S3 for storing Helm charts
For detailed instructions, see [Helm charts and repositories]({{site.baseurl}}/docs/deployments/helm/helm-charts-and-repositories/).
Once you connect your Helm repository, you can use it in any [Codefresh pipeline with the Helm step]({{site.baseurl}}/docs/deployments/helm/using-helm-in-codefresh-pipeline/).

You can connect an Amazon S3 bucket as a Helm repository in the [integrations screen]({{site.baseurl}}/docs/deployments/helm/helm-charts-and-repositories/).

{% include
image.html
@@ -65,12 +88,11 @@ caption="Using Amazon for Helm charts"
max-width="80%"
%}

Once you connect your Helm repository, you can use it in any [Codefresh pipeline with the Helm step]({{site.baseurl}}/docs/deployments/helm/using-helm-in-codefresh-pipeline/).


## Traditional Amazon deployments
## Other Amazon deployments

For any other Amazon deployment you can use the [Amazon CLI from a Docker image](https://hub.docker.com/r/amazon/aws-cli){:target="\_blank"} in a [freestyle step]({{site.baseurl}}/docs/pipelines/steps/freestyle/).
For any other Amazon deployment, you can use the [Amazon CLI from a Docker image](https://hub.docker.com/r/amazon/aws-cli){:target="\_blank"} in a [freestyle step]({{site.baseurl}}/docs/pipelines/steps/freestyle/).

`YAML`
{% highlight yaml %}
2 changes: 1 addition & 1 deletion in _docs/integrations/argocd.md
@@ -5,7 +5,7 @@ group: integrations
toc: true
---

>Important:
>**IMPORTANT**:
We are planning to deprecate the ArgoCD agent for Codefresh pipelines. It has now been replaced with the GitOps Runtime, which offers a superset of the agent's functionality and is better integrated with the Codefresh dashboards.

201 changes: 201 additions & 0 deletions in _docs/integrations/cloud-storage.md
@@ -0,0 +1,201 @@
---
title: "Cloud Storage pipeline integrations"
description: "How to use Codefresh with Cloud Storage providers"
group: integrations
toc: true
---

Codefresh integrations with cloud storage providers provide a convenient solution for storing test reports.
With Codefresh, you can easily configure your pipelines to store test reports in your preferred Cloud Storage provider, such as Amazon S3, Google Cloud Storage, Azure, and MinIO.

For every cloud storage provider, you need to first create a storage bucket in your storage provider account, connect the account with Codefresh to create an integration, and configure your pipelines to [create and upload test reports]({{site.baseurl}}/docs/testing/test-reports/).

## Connecting your storage account to Codefresh

When you connect your storage provider account to Codefresh, Codefresh creates subfolders in the storage bucket for every build, with the build IDs as folder names. Test reports generated for a build are uploaded to the respective folder. The same bucket can store test reports from multiple pipeline builds.

1. In the Codefresh UI, on the toolbar, click the Settings icon, and then from the sidebar select **Pipeline Integrations**.
1. Scroll down to **Cloud Storage**, and click **Configure**.


{% include
image.html
lightbox="true"
file="/images/pipeline/test-reports/cloud-storage-integrations.png"
url="/images/pipeline/test-reports/cloud-storage-integrations.png"
alt="Cloud storage Integrations"
caption="Cloud storage Integrations"
max-width="80%"
%}

{:start="3"}
1. Click **Add Cloud Storage**, and select your cloud provider for test report storage.
1. Define settings for your cloud storage provider, as described in the sections that follow.

## Connecting a Google bucket

**In Google**

1. Create a bucket either from the Google cloud console or the `gsutil` command line tool.
See the [official documentation](https://cloud.google.com/storage/docs/creating-buckets#storage-create-bucket-console){:target="\_blank"} for the exact details.

**In Codefresh**
1. [Connect your storage account](#connecting-your-storage-account-to-codefresh) and select **Google Cloud Storage**.

{% include
image.html
lightbox="true"
file="/images/pipeline/test-reports/cloud-storage-google.png"
url="/images/pipeline/test-reports/cloud-storage-google.png"
alt="Google cloud storage"
caption="Google cloud storage"
max-width="80%"
%}

{:start="2"}
1. Define the settings:
* Select **OAuth2** as the connection method, which is the easiest way.
* Enter an arbitrary name for your integration.
* Select **Allow access to read and write into storage** as Codefresh needs to both write to and read from the bucket.
1. Click **Save**.
1. When Codefresh asks for extra permissions from your Google account, accept the permissions.

The integration is ready. You will use the name of the integration as an environment variable in your Codefresh pipeline.

> **NOTE**:
An alternative authentication method is to use **JSON Config** with a [Google service account key](https://console.cloud.google.com/apis/credentials/serviceaccountkey){:target="\_blank"}.
In that case, download the JSON file locally and paste its contents in the **JSON config** field.
For more information, see the [official documentation](https://cloud.google.com/iam/docs/creating-managing-service-account-keys){:target="\_blank"}.
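
For example, a test-reporting step can pass the integration name to the pipeline through the `CF_STORAGE_INTEGRATION` environment variable (a sketch; the integration name `google-storage` and the bucket name are placeholders for your own values):

```yaml
unit_test_reporting_step:
  title: Upload test reports
  image: codefresh/cf-docker-test-reporting
  environment:
    # name of the Google Cloud Storage integration created above
    - CF_STORAGE_INTEGRATION=google-storage
    # bucket you created in your Google account
    - BUCKET_NAME=my-test-reports
```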

## Connecting an Amazon S3 bucket

**Create an S3 bucket in AWS (Amazon Web Services)**

1. Create an S3 bucket in AWS.
See the [official documentation](https://docs.aws.amazon.com/quickstarts/latest/s3backup/step-1-create-bucket.html){:target="\_blank"}, or use the [AWS CLI](https://docs.aws.amazon.com/cli/latest/reference/s3api/create-bucket.html){:target="\_blank"}.
1. Define the necessary IAM (Identity and Access Management) policy settings.
Here's an example IAM policy that you can use as a reference:
```
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::cf-backup*"
]
},
{
"Effect": "Allow",
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
],
"Resource": [
"arn:aws:s3:::cf-backup*/*"
]
}
]
}
```

1. Note down the **Access** and **Secret** keys generated when you created the S3 bucket.

**Define S3 settings in Codefresh**
1. Select **Amazon Cloud Storage** as your [Cloud Storage provider](#connecting-your-storage-account-to-codefresh).
1. Define the settings:
* Enter an arbitrary name for your integration.
* Paste the **AWS Access Key ID** and **AWS Secret Access Key**.
1. Click **Save**.

{% include
image.html
lightbox="true"
file="/images/pipeline/test-reports/cloud-storage-s3.png"
url="/images/pipeline/test-reports/cloud-storage-s3.png"
alt="S3 cloud storage"
caption="S3 cloud storage"
max-width="80%"
%}

After setting up and verifying the S3 bucket integration, you can use:
* The name of the integration as an environment variable in your Codefresh pipeline.
* Any [external secrets that you have defined]({{site.baseurl}}/docs/integrations/secret-storage/) (such as Kubernetes secrets) as values, by clicking the lock icon that appears next to the field:
* If you have already specified the resource field during secret definition, just enter the name of the secret directly in the text field, for example, `my-secret-key`.
* If you didn't include a resource name during secret creation, enter the full name in the field, for example, `my-secret-resource@my-secret-key`.
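
As with the other providers, the integration name is then passed to the test-reporting step through an environment variable (a sketch; `amazon-storage`, the bucket name, and the report paths are placeholders for your own values):

```yaml
upload_test_reports:
  title: Upload test reports to S3
  image: codefresh/cf-docker-test-reporting
  environment:
    # name of the Amazon S3 integration created above
    - CF_STORAGE_INTEGRATION=amazon-storage
    # bucket you created in your AWS account
    - BUCKET_NAME=cf-backup-reports
    # directory that contains the generated report files
    - REPORT_DIR=reports
    - REPORT_INDEX_FILE=index.html
```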

## Connecting Azure Blob/File storage

**Create a storage account in Azure**

1. For Azure, create a storage account.
See the [official documentation](https://docs.microsoft.com/en-us/azure/storage/common/storage-account-create){:target="\_blank"}.
1. Find one of the [two access keys](https://docs.microsoft.com/en-us/azure/storage/common/storage-account-keys-manage){:target="\_blank"} already created.
1. Note down the **Account Name** and **Access key for the account**.

**Define Azure settings in Codefresh**
1. Select **Azure File/Blob Storage** as your [Cloud Storage provider](#connecting-your-storage-account-to-codefresh).
1. Define the settings:
* Enter an arbitrary name for your integration.
* Paste the **Azure Account Name** and **Azure Account Key**.
1. Click **Save**.


{% include
image.html
lightbox="true"
file="/images/pipeline/test-reports/cloud-storage-azure.png"
url="/images/pipeline/test-reports/cloud-storage-azure.png"
alt="Azure cloud storage"
caption="Azure cloud storage"
max-width="60%"
%}

After setting up and verifying the Azure File/Blob integration, you can use:
* The name of the integration as an environment variable in your Codefresh pipeline.
* Any [external secrets that you have defined]({{site.baseurl}}/docs/integrations/secret-storage/) (such as Kubernetes secrets) as values, by clicking the lock icon that appears next to the field:
* If you have already specified the resource field during secret definition, just enter the name of the secret directly in the text field, for example, `my-secret-key`.
* If you didn't include a resource name during secret creation, enter the full name in the field, for example, `my-secret-resource@my-secret-key`.


## Connecting MinIO storage

**Create a storage account in MinIO**
1. Configure the MinIO server.
See the [official documentation](https://docs.min.io/docs/minio-quickstart-guide.html){:target="\_blank"}.
1. Copy the Access and Secret keys.

**Set up a MinIO integration in Codefresh**

1. Select **MinIO Cloud Storage** as your [Cloud Storage provider](#connecting-your-storage-account-to-codefresh).
1. Define the settings:
* **NAME**: The name of the MinIO storage. Any name that is meaningful to you.
* **ENDPOINT**: The URL to the storage service object.
* **PORT**: Optional. The TCP/IP port number. If not defined, defaults to port `80` for HTTP, and `443` for HTTPS.
* **Minio Access Key**: The ID that uniquely identifies your account, similar to a user ID.
* **Secret Minio Key**: The password of your account.
* **Use SSL**: Select to enable secure HTTPS access. Not selected by default.
1. Click **Save**.

{% include
image.html
lightbox="true"
file="/images/pipeline/test-reports/cloud-storage-minio.png"
url="/images/pipeline/test-reports/cloud-storage-minio.png"
alt="MinIO cloud storage"
caption="MinIO cloud storage"
max-width="60%"
%}
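
The MinIO integration is then referenced by name in the pipeline in the same way (a sketch; `my-minio` and the bucket name are placeholders for your own values):

```yaml
upload_reports_to_minio:
  title: Upload test reports to MinIO
  image: codefresh/cf-docker-test-reporting
  environment:
    # name of the MinIO integration created above
    - CF_STORAGE_INTEGRATION=my-minio
    # bucket you created on your MinIO server
    - BUCKET_NAME=test-reports
```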


## Related articles
[Amazon Web Services (AWS) pipeline integration]({{site.baseurl}}/docs/integrations/amazon-web-services/)
[Microsoft Azure pipeline integration]({{site.baseurl}}/docs/integrations/microsoft-azure/)
[Google Cloud pipeline integration]({{site.baseurl}}/docs/integrations/google-cloud/)
[Creating test reports]({{site.baseurl}}/docs/testing/test-reports/)
[Codefresh YAML for pipeline definitions]({{site.baseurl}}/docs/pipelines/what-is-the-codefresh-yaml/)
[Steps in pipelines]({{site.baseurl}}/docs/pipelines/steps/)
_docs/integrations/docker-registries/amazon-ec2-container-registry.md
@@ -1,5 +1,5 @@
---
title: "Amazon EC2 Container Registry"
title: "Amazon ECR Container Registry"
description: "Use the Amazon Docker Registry for pipeline integrations"
group: integrations
sub_group: docker-registries
@@ -36,15 +36,17 @@ Codefresh makes sure to automatically refresh the AWS token for you.

For more information on how to obtain the needed tokens, read the [AWS documentation](http://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys){:target="_blank"}.

>Note:
>**NOTE**:
You must have an active registry set up in AWS.<br /><br />
Amazon ECR push/pull operations are supported with two permission options: user-based and resource-based.


* User-based permissions: The user account must apply the `AmazonEC2ContainerRegistryPowerUser` policy (or a custom policy based on it).
* Identity-based policies
The user account must apply the `AmazonEC2ContainerRegistryPowerUser` policy (or a custom policy based on it).
For more information and examples, click [here](http://docs.aws.amazon.com/AmazonECR/latest/userguide/ecr_managed_policies.html){:target="_blank"}.
* Resource-based permissions: Users with resource-based permissions must be allowed to call `ecr:GetAuthorizationToken` before they can authenticate to a registry and push or pull images from any Amazon ECR repository; you must also grant push/pull permissions for the specific registry.
For more information and examples, click [here](http://docs.aws.amazon.com/AmazonECR/latest/userguide/RepositoryPolicies.html){:target="_blank"}.
* Resource-based policy
Users with resource-based policies must be allowed to call `ecr:GetAuthorizationToken` before they can authenticate to a registry and push or pull images from any Amazon ECR repository; you must also grant push/pull permissions for the specific registry.
For more information and examples, click [here](http://docs.aws.amazon.com/AmazonECR/latest/userguide/RepositoryPolicies.html){:target="_blank"}.


## Set up ECR integration for service account
@@ -168,7 +170,8 @@ max-width="40%"
3. Click **Promote**.


>It is possible to change the image name if you want, but make sure that the new name exists as a repository in ECR.
>**NOTE**:
It is possible to change the image name if you want, but make sure that the new name exists as a repository in ECR.


## Related articles
7 changes: 5 additions & 2 deletions in _docs/integrations/google-cloud.md
@@ -52,7 +52,9 @@ You also get access to all other Kubernetes dashboards such as the [cluster dash

## Storing test reports in Google Cloud storage

Codefresh has native support for test reports. You can store the reports on Google Cloud storage.
Codefresh has native support for storing test reports in different storage buckets, including Google Cloud storage.
You can connect your Google Cloud storage account to Codefresh through the Cloud Storage options in Pipeline Integrations.


{% include
image.html
@@ -64,7 +66,8 @@ caption="Google cloud storage"
max-width="50%"
%}

See the full documentation for [test reports]({{site.baseurl}}/docs/testing/test-reports/).
For detailed instructions on setting up the integration with your Google Cloud storage account in Codefresh, see [Cloud storage integrations for pipelines]({{site.baseurl}}/docs/integrations/cloud-storage/). To create and store test reports through Codefresh pipelines, see [Creating test reports]({{site.baseurl}}/docs/testing/test-reports/).


## Using Google Storage for storing Helm charts
