_docs/kb/articles/ensure-cleanup-when-pipelines-stops.md (1 addition & 1 deletion)
@@ -21,7 +21,7 @@ The parallel step always runs at the end of the pipeline, regardless of its buil
## How to
- In a sequential pipeline, which is the default execution mode for pipelines, manually terminating the pipeline, also terminates the execution of subsequent steps, including any pipeline hooks set to run on success or failure.
+ In a sequential pipeline, which is the default execution mode for pipelines, manually terminating the pipeline also terminates the execution of subsequent steps, including any pipeline hooks set to run on success or failure.
You can circumvent this behavior by inserting a parallel step within the pipeline.
+ This article describes useful API calls to retrieve information on a pipeline build and its steps. It also describes the fields included as part of the response to those calls, to be used as-is or to infer new metrics from the values.
- This document summarizes some API calls that can be useful to get information about a build and its steps.
+ ## Calls for pipeline build information
- It also describes the fields included as part of the response to those calls to be used as-is or to infer new metrics out of those values.
+ We will explore three main calls to programmatically get information about the build:
+ * General build information
+ `codefresh get build <BUILD_ID>` (could be changed to an API call, since that one has more information)
+ This GET call also includes resource-consumption metrics. As they are not Prometheus-based, they are not accurate for specific step types, such as the build-step.
- ## Details
- We’ll explore three main calls to programmatically get information about the build:
+ ## Usage script for CLI/API calls
- * General build information: `codefresh get build <BUILD_ID>` (could be changed to an API call, since that one has more information)
- * Build and Steps information: `GET /api/workflow/<BUILD_ID/context-revision`
- * Logs: `GET https://g.codefresh.io/api/progress/<PROGRESS_ID>` (this one also has resource-consumption metrics, but they are not Prometheus-based, thus, for some steps, such as the build-step, they are not accurate)
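Taken together, the three calls above can be exercised from a short shell sketch like the one below. The `CF_API_KEY` variable, the `Authorization` header format, and the CLI's `-o json` flag are assumptions to verify against your own Codefresh CLI and API setup.

```shell
#!/bin/bash
# Sketch only: the auth header format and CLI output flag are assumptions.
BUILD_ID="<BUILD_ID>"
PROGRESS_ID="<PROGRESS_ID>"   # taken from the general build information
CF_API_KEY="<YOUR_API_KEY>"

# 1. General build information (CLI)
codefresh get build "${BUILD_ID}" -o json > "build-${BUILD_ID}.json"

# 2. Build and steps information (context revision)
curl --silent -H "Authorization: ${CF_API_KEY}" \
  "https://g.codefresh.io/api/workflow/${BUILD_ID}/context-revision" > "context-${BUILD_ID}.json"

# 3. Logs and resource-consumption metrics (progress)
curl --silent -H "Authorization: ${CF_API_KEY}" \
  "https://g.codefresh.io/api/progress/${PROGRESS_ID}" > "progress-${BUILD_ID}.json"
```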
+ The following script is a suggestion on how to use the different CLI and API calls available.
- The following script is a suggestion on how to use the different CLI and API calls available
+ The idea is to run this asynchronously. For example, using a cron-trigger in a pipeline in Codefresh to execute the pipeline daily.
- The idea is to run this in an asynchronous fashion. For example, using a cron-trigger in a pipeline in Codefresh, to execute the pipeline every day.
- The same calls can be used if you want to incorporate the process of pushing metrics into your monitoring platform, as part of the build itself (e.g., in a hook, at the end of the pipeline)
+ You can also use the same calls to incorporate the process of pushing metrics into your monitoring platform, as part of the build itself. For example, in a hook, at the end of the pipeline.
```shell
#!/bin/bash
@@ -66,7 +68,8 @@ do
done
```
- This script will generate a JSON file per build, with the following structure:
+ ### Script response
+ This script generates a JSON file per build with the following structure:
```json
{
@@ -86,32 +89,34 @@ This script will generate a JSON file per build, with the following structure:
}
```
- Description of each field:
- * `created`: timestamp indicating when the build was created (“submitted“). Example: `2021-05-17T21:31:35.779Z`
- * `started`: timestamp indicating when the build was started (the Initializing Process started). Example: `2021-05-17T21:31:47.742Z`
- * `finished`: timestamp indicating when the build was started. Example: `2021-05-17T21:32:21.435Z`
- * `totalTime`: duration (in HH:MM:SS) of the build from `created` to `finished`
- * `buildTime`: duration (in HH:MM:SS) of the build from `started` to `finished`
- * `status`: status of the build (error, success, etc)
- * `pipeline-name`: Full name of the pipeline
- * `repository`: Name of the repo associated with the build execution. This will only be different to `/` if a git-trigger is used to trigger a build.
- * `webhook`: boolean indicating if the build was triggered by a webhook (sent by a git-provider, for example) or not.
- * `pipeline-Id`: the ID of the pipeline for the build
- * `stateYaml`: Object representing the last State of the build. It contains detailed information about the build and its steps. When the build doesn’t have a State YAML (e.g., the build was terminated before it could start), the value of this field will be an empty Object: `{}`.
+ The table describes the fields in the JSON response.
+ | Field | Description |
+ |-------|-------------|
+ | `created` | The timestamp indicating when the build was created (submitted). Example: `2021-05-17T21:31:35.779Z`. |
+ | `started` | The timestamp indicating when the build started execution, that is, the start of the Initializing Process. Example: `2021-05-17T21:31:47.742Z`. |
+ | `finished` | The timestamp indicating when the build completed execution. Example: `2021-05-17T21:32:21.435Z`. |
+ | `totalTime` | The duration of the build, in HH:MM:SS, from `created` to `finished`. |
+ | `buildTime` | The duration of the build, in HH:MM:SS, from `started` to `finished`. |
+ | `status` | The status of the build. See [Viewing status for pipeline builds]({{site.baseurl}}/docs/pipelines/monitoring-pipelines/#viewing-status-for-pipeline-builds). |
+ | `pipeline-name` | The full name of the pipeline. |
+ | `repository` | The name of the repo associated with the build execution. This will only be different from `/` if a git-trigger is used to trigger a build. |
+ | `webhook` | Boolean indicating if the build was triggered by a webhook, for example, one sent by a git-provider. |
+ | `pipeline-Id` | The ID of the pipeline for the build run. |
+ | `stateYaml` | The object representing the last state of the build, containing detailed information about the build and its steps.<br> When the build doesn’t have a State YAML, as when it was terminated before it could start, the value of this field will be an empty object: `{}`. |
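As a minimal sketch of how these fields can be consumed, the snippet below prints a few of them from each per-build JSON file produced by the script; the `*.json` glob and the exact key casing are assumptions based on the structure described above.

```shell
# Print pipeline name, status, and durations for each generated build file.
for f in *.json; do
  jq -r '[."pipeline-name", .status, .totalTime, .buildTime] | @tsv' "$f"
done
```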
- The State YAML of the build is represented in the `stateYaml` field of each JSON file.
- It’s composed of several fields, but the most relevant one is the `context` element.
+ #### State YAML
- #### Context (`stateYaml.context`)
+ The State YAML of the build is represented in the `stateYaml` field of each JSON file.
+ It includes several fields, but the most relevant one is the `context` element.
+ ##### Context (`stateYaml.context`)
* `workflowMetadata`: contains general information about the workflow (the build)
* `stepsMetadata`: contains information about every step executed in the build
- #### Workflow Metadata (`workflowMetadata`)
+ ##### Workflow Metadata (`workflowMetadata`)
* `startTimestamp`: when the build process started. The next action is the execution of the pre-steps. Example: `2021-05-17T21:31:47.481Z`
* `preStepsStartTimestamp`: once the build starts, it first executes the “preSteps“ (including the “Initializing Process“ step). This field is the timestamp when those pre-steps start. The next action would be the execution of the actual build-steps. Example: `2021-05-17T21:31:48.118Z`. Relative position in time: `startTimestamp` < `preStepsStartTimestamp`
@@ -124,22 +129,26 @@ It’s composed of several fields, but the most relevant one is the `context` el
* `finishTimestamp`: timestamp indicating when the workflow is finished. Relative position in time: `preStepsFinishTimestamp` < `finishTimestamp`
* `totalTime`: Integer. Duration in milliseconds of the build. This contemplates the time from the moment the pre-steps started to the moment the last step is executed. This doesn’t include the time the build was waiting to start (pending), nor the time of the **post**-steps of the build. Example: `33708`
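A minimal jq sketch for pulling these workflow-level values out of a build's JSON, assuming the `stateYaml.context.workflowMetadata` nesting described above (`<BUILD_FILE>.json` is a placeholder for a file generated by the script):

```shell
# Extract the main workflow timestamps and the total duration in milliseconds.
jq '.stateYaml.context.workflowMetadata
    | {startTimestamp, preStepsStartTimestamp, finishTimestamp, totalTime}' "<BUILD_FILE>.json"
```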
- #### Steps Metadata (`stepsMetadata`)
+ ##### Steps Metadata (`stepsMetadata`)
- This object will have N Objects inside of it. Each of them representing a step in the build.
+ This object has N Objects within, each representing a step in the build.
+ The key of each object within `stepsMetadata` is the name of the step.
- The key of each object inside `stepsMetadata` will be the name of the step.
+ In general, each object within `stepsMetadata` includes:
- In general, each object inside the `stepsMetadata` field will have:
- * `type`: the type of the step. Possible values: `freestyle`, `build`, `push`, `parallel`, etc (anything you put in `type` when defining your step
+ * `type`: The step type. Possible values: `freestyle`, `build`, `push`, `parallel`, etc. (anything you put in `type` when defining your step).
* `result`:
- * `status`: same as `result`
+ * `status`: Same as `result`
* `totalTime`
- Some clarification: the key `Initializing Process` is the one representing the Initializing Process, and it will only have `startTimestamp`, `status`, `finishTimestamp` and `totalTime`.
- There’s a key called `Initializing`, it could be ignored.
+ {{site.data.callout.callout_tip}}
+ **TIP**
+ The key `Initializing Process`, representing the pipeline initialization stage, has these values: `startTimestamp`, `status`, `finishTimestamp` and `totalTime`.
+ The key `Initializing` can be ignored.
+ {{site.data.callout.end}}
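A minimal jq sketch that lists each step with its status and duration, assuming the `stepsMetadata` layout described above (`<BUILD_FILE>.json` is again a placeholder):

```shell
# One line per step: step name, status, totalTime.
jq -r '.stateYaml.context.stepsMetadata
       | to_entries[]
       | [.key, .value.status, (.value.totalTime | tostring)] | @tsv' "<BUILD_FILE>.json"
```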
### Example of a build JSON created by the script
@@ -226,11 +235,12 @@ There’s a key called `Initializing` , it could be ignored.
}
```
- ### Getting the build logs
+ ## Get build logs
- Most of the time the information provided by the State YAML is enough, but if you require to push the logs of builds as well, then, this call(s) may help you.
+ Generally, `State YAML` provides the information you need.
+ If you also need to push the build logs, the following calls can help.
- To get the logs of a build `BUILD_ID`, you need to:
+ **Get build logs by `BUILD_ID`**:
```shell
BUILD_ID=123xyz
@@ -256,7 +266,7 @@ printf "\tDownloading logs to ${BUILD_ID}.json \n"
curl --silent $LOGS_URL --output ${BUILD_ID}.json
```
- The `<BUILD_ID>.json` file will have the following structure:
+ This call results in a `<BUILD_ID>.json` file with the following structure:
```json
...
@@ -274,10 +284,13 @@ steps:
...
```
- `steps` is an array of Steps, and each of the elements has a `logs` array.
- The `logs` array has the content of the logs for a step in specific (a line per array element).
+ where:
+ * `steps` is an array of steps, with each element including a `logs` array.
+ * The `logs` array includes the log content for a step, a line per array element.
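For example, a minimal sketch that dumps every log line from the downloaded file, reusing the `BUILD_ID` variable from the snippet above and assuming the `steps`/`logs` structure shown:

```shell
# Print the raw log lines of all steps, in order.
jq -r '.steps[].logs[]' "${BUILD_ID}.json"
```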
- It’s important to mention that the calls above are only valid for builds that have already been finished (successful, failure, terminated)
- You can use the script suggested at the beginning of this document to iterate over all the builds on a timeframe and get the logs from them.
+ {{site.data.callout.callout_tip}}
+ **TIP**
+ The calls above are only valid for _completed_ builds, whether successful, failed, or terminated.
+ You can use the script at the beginning of this article to iterate over all the builds over a time frame and get the logs from them.
_docs/kb/articles/mount-volumes-in-composition-step.md (28 additions & 10 deletions)
@@ -11,17 +11,29 @@ categories: [Pipelines]
support-reviewed: 2023-04-18 LG
---
+ This article describes how to mount a directory within a named Docker volume, to work around a Docker limitation.
+ > **NOTE**
+ This article is relevant when using a `docker-compose.yml` file in the `composition` step.
## Overview
- Compositions steps can only mount {% raw %}`${{CF_VOLUME_NAME}}:${{CF_VOLUME_PATH}}`{% endraw %}, but you need to mount `/codefresh/volume/<REPO>/<DIR>/` to `/<DIR>`. This is a limitation of docker where you cannot specify a directory inside a named docker volume[1].
+ Composition steps can only mount {% raw %}`${{CF_VOLUME_NAME}}:${{CF_VOLUME_PATH}}`{% endraw %}.
+ If you need to mount `/codefresh/volume/<REPO>/<DIR>/` to `/<DIR>`, you cannot do so directly: due to a Docker limitation, you cannot specify a directory inside a named Docker volume. See [here](https://github.com/moby/moby/issues/32582){:target="\_blank"}.
- **Note**: this is for using a `docker-compose.yml` file in the composition step.
+ ## How to
- ## Details
+ ##### Scenario
- I have in my git rep sample data and need to mount this to the `/database` directory. My custom image is expecting data to be in `/database` to be able to run. My docker file already runs a script at startup when the container starts.
+ * Sample data in the Git repo must be mounted to the `/database` directory.
+ * The custom image needs access to `/database` for proper execution.
+ * The Dockerfile already runs a script on container startup.
- Sample `docker-compose.yml`
+ ##### Sample `docker-compose.yml`
```yaml
version: '3.0'
@@ -32,7 +44,7 @@ services:
- ./:/database
```
- `Sample codefresh.yml`
+ ##### Sample `codefresh.yml`
{% raw %}
@@ -80,11 +92,17 @@ steps:
volumes:
- ${{CF_VOLUME_NAME}}:${{CF_VOLUME_PATH}}
```
{% endraw %}
- In the conform step, we are replacing the volumes mount to be {% raw %}`${{CF_VOLUME_NAME}}:${{CF_VOLUME_PATH}}`{% endraw %}. Then add a command of {% raw %}`bash -c "ln -s ${{CF_VOLUME_PATH}}/${{CF_REPO_NAME}}/<DIR>/ /database && ./start.sh"`{% endraw %} to symlink the directory in my repo to the `/database` directory and then execute my script that's already in the container. Once this is done, my composition steps will run and have the correct mounts and directories where it is needed.
+ In the `conform` step, `.arguments.KEYVALUE_PAIRS`:
+ * The volumes mount is replaced with {% raw %}`${{CF_VOLUME_NAME}}:${{CF_VOLUME_PATH}}`{% endraw %}.
+ * The command {% raw %}`bash -c "ln -s ${{CF_VOLUME_PATH}}/${{CF_REPO_NAME}}/<DIR>/ /database && ./start.sh"`{% endraw %}:
+   * Symlinks the directory in the Git repo to the `/database` directory
+   * Executes the script already in the container
+ Once this is done, `composition` steps will run and have the correct mounts and directories where needed.
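For reference, with the default `CF_VOLUME_PATH` of `/codefresh/volume`, the overridden command effectively runs the following inside the service container (`<REPO>` and `<DIR>` are placeholders for your repository and data directory):

```shell
# Expose the repo data where the image expects it, then run the image's own script.
ln -s /codefresh/volume/<REPO>/<DIR>/ /database
./start.sh
```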
- ## Related Items
+ ## Related articles
+ [Composition steps in pipelines]({{site.baseurl}}/docs/pipelines/steps/composition/)
_docs/kb/articles/which-pod-used-for-running-build.md (11 additions & 5 deletions)
@@ -1,5 +1,5 @@
---
- title: "How To: Know which builder pod was used for running a build"
+ title: "How To: Know the builder pod used to run a build"
description:
group: kb
sub-group: articles
@@ -11,10 +11,16 @@ categories: [Pipelines]
support-reviewed: 2023-04-18 LG
---
- ## Overview
+ This article describes how to retrieve the specific builder pod used to run a build.
- By default codefresh on-premises uses codefresh builders to run your builds. In case you configured more than one builder, you might need to know on which builder pod specific build was run on for debug purposes for example in case of networking issues in your k8s cluster
+ ## Multiple builders
+ By default, Codefresh on-premises uses Codefresh builders to run your builds.
- ## Details
+ If you configured more than one builder, you might need to know which builder pod a specific build ran on, for debugging purposes, for example, to resolve networking issues in your k8s cluster.
+ To always locate the builder pod, simply output the builder pod name to the build logs.
+ ## How to
+ * Add `echo $CF_HOST_NAME` to the commands list of one of the `freestyle` steps in the pipeline for which you need to track the builder name it runs on.
- One of the ways you can use for this purpose is to output the builder pod name to the build logs so it can be easily found there in future. You just need to add `echo $CF_HOST_NAME` to the commands list of one of the freestyle steps in the pipeline for which you need to track the builder name it runs on.
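For reference, the entry added to the `commands` list is plain shell; a minimal sketch of what it could look like:

```shell
# Prints the builder pod name into the build log so it can be found later.
echo "Running on builder pod: ${CF_HOST_NAME}"
```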