Cloud Build - Scheduled Builds

It’s often useful to automatically rebuild your container images on a schedule so that they pick up the fixes and updates available in their base images.

In this tutorial you will learn how to build an automated pipeline to ensure that you include security updates and patches in your container images.

This implementation utilizes four main components:

  • A Git repository to hold your custom image definitions
  • An Artifact Registry repository to store and scan your final images
  • A Cloud Build build trigger to perform the build
  • A Cloud Scheduler job to execute the build on a set schedule

Objectives

  • Create an Artifact Registry repository to store and scan your custom image.

  • Configure GitHub with Google Cloud to store your image configurations.
  • Create a Cloud Build trigger to automate creation and deployment of custom images to Artifact Registry.
  • Configure Cloud Scheduler to initiate builds on a regular basis.
  • Review the results of the automated processes.

Before you begin

Enable APIs

Enable the Artifact Registry, Container Scanning, Cloud Build, and Cloud Scheduler services.

gcloud services enable \
	container.googleapis.com \
	cloudbuild.googleapis.com \
	containerregistry.googleapis.com \
	containerscanning.googleapis.com \
	artifactregistry.googleapis.com \
	cloudscheduler.googleapis.com
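
If you want to confirm the services are active before continuing, one optional check is to list the enabled services and filter for the APIs used in this tutorial:

gcloud services list --enabled | \
  grep -E 'artifactregistry|cloudbuild|containerscanning|cloudscheduler'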

Set environment variables

Set the project ID for the cloud project you will be using:

PROJECT_ID=[your project id]

Set the GitHub user name where your repository will be stored:

GITHUB_USER=[your github id]

Set the PROJECT_NUMBER and REGION variables to use throughout the process. For more information, see the list of available regions.

PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID \
    --format='value(projectNumber)')

REGION=us-central1
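
Optionally, confirm the variables are populated and, if your gcloud CLI is not already pointed at this project, set it as the default so that later commands that omit --project use it:

# Each value should be non-empty.
echo "PROJECT_ID=${PROJECT_ID} PROJECT_NUMBER=${PROJECT_NUMBER} REGION=${REGION}"

gcloud config set project $PROJECT_ID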

Creating the Artifact Registry repository

In this tutorial you use Artifact Registry to store and scan your images. Create the repository with the following command.

gcloud artifacts repositories create custom-images \
  --repository-format=docker \
  --location=$REGION \
  --description="Docker repository"

Configure Docker to utilize your gcloud CLI credentials when accessing Artifact Registry.

gcloud auth configure-docker $REGION-docker.pkg.dev
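
To verify that the repository was created in the expected region, you can describe it (an optional check using the names defined above):

gcloud artifacts repositories describe custom-images \
  --location=$REGION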

Note: Keep in mind that enabling the Container Scanning API means Artifact Registry automatically scans images pushed to the repository for vulnerabilities, which incurs an extra charge per scanned image (see pricing details). To disable Container Analysis, run the following command:

gcloud services disable containerscanning.googleapis.com

Preparing the Git repository

In practice, you keep the Dockerfile for your custom images in a Git repo. The automated process accesses that repo during the build process to pull the relevant configs and Dockerfile.

Fork the sample repository

For this tutorial, fork a sample repo that provides the container definitions used in this tutorial. (You can optionally clone your fork afterwards, as shown after the following steps.)

  1. Click this link to fork the repo.
  2. If prompted, sign in to GitHub.
  3. Select your GitHub username as the Owner.
    The Repository name appears as software-delivery-workshop.
  4. Click Create fork and wait a few seconds for the process to complete.
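
If you want to inspect the container definition and build configuration locally, you can optionally clone your fork. This step is not required; the URL below assumes the default repository name from the fork step:

git clone https://github.com/${GITHUB_USER}/software-delivery-workshop.git
cd software-delivery-workshop

# The Dockerfile and cloudbuild.yaml used later in this tutorial live here:
ls labs/cloudbuild-scheduled-jobs/code-oss-java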

Connect Cloud Build to GitHub

Next, connect that repository to Cloud Build using the built-in GitHub connection capability. Click the link below and follow the instructions to complete the process. You do not need to create a trigger in the last step of the wizard; skip that step for now, because you create the trigger from the command line later.

If you use a different Git repository solution, you can instead follow the instructions to connect Cloud Build to GitLab or Bitbucket.

Preparing Cloud Build

The sample repository contains a container definition and a Cloud Build configuration used to build the container image. In this step you create a Cloud Build trigger that runs the instructions in the cloudbuild.yaml file, which you can find in the labs/cloudbuild-scheduled-jobs/code-oss-java folder.

gcloud beta builds triggers create manual \
	--name=custom-image-trigger \
	--repo=${GITHUB_USER}/software-delivery-workshop \
	--repo-type=GITHUB \
	--branch=main \
	--build-config=labs/cloudbuild-scheduled-jobs/code-oss-java/cloudbuild.yaml \
	--substitutions=_REGION=us-central1,_AR_REPO_NAME=custom-images,_AR_IMAGE_NAME=code-oss-java,_IMAGE_DIR=labs/cloudbuild-scheduled-jobs/code-oss-java

TRIGGER_ID=$(gcloud beta builds triggers list \
	--filter=name="custom-image-trigger" --format="value(id)")

This example configures the following:

  • The gcloud command creates a manual trigger in Cloud Build named custom-image-trigger, as indicated by the name flag on the second line.
  • The next three lines contain flags related to the source GitHub repo: the path to the repo, the type of the repo, and the Git branch to build.
  • The build-config flag indicates the path to the Cloud Build config file in the Git repository.
  • The substitutions flag makes the job dynamic. For this job, the command passes in variables for the region, the Artifact Registry repository name, the container image name, and the location of the Dockerfile to build, as indicated by the _IMAGE_DIR variable. View the cloudbuild.yaml file to see how these variables are used in the process.
  • After the trigger is created, its unique ID is retrieved and stored in the TRIGGER_ID environment variable for later use. You can also run the trigger once manually to verify it, as shown below.
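
Before wiring the trigger to a schedule, you can optionally run it once to confirm the build succeeds. This is an extra verification step, not part of the original flow; note that it starts a real build, and if the image push fails due to permissions, the roles granted in the next section address that:

gcloud beta builds triggers run custom-image-trigger \
  --branch=main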

Configuring Cloud Scheduler

To ensure your images are up to date with the latest updates and patches, use Cloud Scheduler to execute the Cloud Build trigger on a set frequency. For this tutorial, the job runs every day. In practice, set this to a frequency aligned to your organizational needs to ensure the latest updates are always included.

  1. Grant the Cloud Build Editor role to the default compute service account so that it can invoke the Cloud Build trigger:
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
  --member="serviceAccount:$PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
  --role="roles/cloudbuild.builds.editor"
  2. Grant the Artifact Registry Administrator role to the Cloud Build service account so that it can upload images to Artifact Registry (you can verify both bindings after this list, as shown below):
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
    --member=serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com \
    --role="roles/artifactregistry.admin"
  3. Create the Cloud Scheduler job with the following command:
gcloud scheduler jobs create http run-build \
	--schedule='0 1 * * *' \
	--uri=https://cloudbuild.googleapis.com/v1/projects/${PROJECT_ID}/locations/global/triggers/${TRIGGER_ID}:run \
	--location=us-central1 \
	--oauth-service-account-email=${PROJECT_NUMBER}-compute@developer.gserviceaccount.com \
	--oauth-token-scope=https://www.googleapis.com/auth/cloud-platform
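
Optionally, confirm that the two role bindings from steps 1 and 2 are in place before testing. This is a quick check; the table format is just one way to display the roles:

# Roles bound to the default compute service account.
gcloud projects get-iam-policy ${PROJECT_ID} \
  --flatten="bindings[].members" \
  --filter="bindings.members:${PROJECT_NUMBER}-compute@developer.gserviceaccount.com" \
  --format="table(bindings.role)"

# Roles bound to the Cloud Build service account.
gcloud projects get-iam-policy ${PROJECT_ID} \
  --flatten="bindings[].members" \
  --filter="bindings.members:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com" \
  --format="table(bindings.role)"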

The job is set to execute every day; however, to test the functionality immediately, run the job manually from the console (a command-line alternative is shown after these steps):

  • On the Cloud Scheduler page find the entry you just created called run-build.
  • In the Actions column for that row, click More (the three-dot menu).
  • Click Force a job run to test the system manually.
  • After the command successfully executes, switch to the Cloud Build history page to review the progress.
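
If you prefer the command line, the following is the gcloud equivalent of clicking Force a job run (the job name and location match the values used when the job was created):

gcloud scheduler jobs run run-build \
  --location=us-central1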

Reviewing the results

Because you enabled the Container Scanning API as part of the setup process, Artifact Registry automatically scans your images for security vulnerabilities.

To review the vulnerabilities:

  1. Open the Repositories page.
  2. In the repositories list, click a repository.
  3. Click an image name.
    Vulnerability totals for each image digest appear in the Vulnerabilities column.
  4. To view the list of vulnerabilities for an image, click the link in the Vulnerabilities column.
    The vulnerability list shows the severity, availability of a fix, and the name of the package that contains the vulnerability.
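
You can also review scan results from the command line. The following is an optional sketch; the image name code-oss-java comes from the trigger substitutions earlier, and the tag may differ depending on how cloudbuild.yaml tags the image:

# List images in the repository, including Container Analysis occurrence summaries.
gcloud artifacts docker images list \
  ${REGION}-docker.pkg.dev/${PROJECT_ID}/custom-images \
  --show-occurrences

# Show package vulnerability details for a specific image
# (replace :latest with the tag or digest shown by the list command).
gcloud artifacts docker images describe \
  ${REGION}-docker.pkg.dev/${PROJECT_ID}/custom-images/code-oss-java:latest \
  --show-package-vulnerability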