Deploying Google Cloud Functions with Terraform

  Posted on March 8, 2022   ·   2 min read   ·   #today-i-learned  #programming 

Cloud Functions are an easy, performant, and potentially inexpensive way to build serverless backends. I recently went down the route of setting up continuous deployments for them, and thought I’d share what I learned along the way.

The easiest way to deploy a Cloud Function is using the gcloud CLI like so:

gcloud functions deploy YOUR_FUNCTION_NAME \
  --region=YOUR_REGION \
  --runtime=YOUR_RUNTIME \
  --source=YOUR_SOURCE_LOCATION \
  --entry-point=YOUR_CODE_ENTRYPOINT \
  TRIGGER_FLAGS

Things get a bit more complicated if you want to use Terraform for deployments, which has its own set of advantages. The main trick to getting it to work is in the google_storage_bucket_object resource below, where a checksum is appended to the archive’s filename every time it’s uploaded to the storage bucket.

This is necessary because the google_cloudfunctions_function resource won’t trigger a redeployment on subsequent code changes by itself - by generating a checksum from the source code and including it in the object name, we ensure that Terraform knows to redeploy the Cloud Function whenever the underlying code changes.
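To make the mechanism concrete, here’s a small shell sketch (with hypothetical file paths - the real setup hashes the zipped archive rather than a single file) showing that a content checksum changes whenever the source does, which is exactly what makes the generated object name change:

```shell
# Hypothetical demo: the source's MD5 changes whenever its content changes,
# so an object name built as "<path>#<md5>" changes too.
mkdir -p /tmp/demo-src
printf 'def handler(request):\n    return "v1"\n' > /tmp/demo-src/main.py
hash_v1=$(md5sum /tmp/demo-src/main.py | cut -d' ' -f1)

printf 'def handler(request):\n    return "v2"\n' > /tmp/demo-src/main.py
hash_v2=$(md5sum /tmp/demo-src/main.py | cut -d' ' -f1)

echo "object name v1: /path/to/file#$hash_v1"
echo "object name v2: /path/to/file#$hash_v2"
```

The two object names differ, so Terraform sees a new google_storage_bucket_object and cascades the change into the Cloud Function resource.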

locals {
  project_id = "my-project"
  region     = "us-west1"
  component  = "my-component"

  cloud_function = {
    name        = "${local.component}-cf"
    description = "Some description for this cloud function"
    runtime     = "python39"
    entry_point = "my_entry_point"

    source_dir       = "./src"
    archive_filepath = "/path/to/file"
  }
}

# Service account for the Cloud Function
resource "google_service_account" "cloud_function_sa" {
  project      = local.project_id
  account_id   = local.component
  display_name = local.component
}

# Bucket to store the source code archives
resource "google_storage_bucket" "function_archive" {
  name     = "${local.component}-cloud-function-archive"
  location = local.region
  project  = local.project_id

  uniform_bucket_level_access = true
}

# Archive the source code as a zip file
data "archive_file" "function_archive" {
  type        = "zip"
  source_dir  = local.cloud_function.source_dir
  output_path = "${path.root}/${local.cloud_function.archive_filepath}"
}

# Upload the source code archive to the bucket. This happens each time
# the source code changes.
resource "google_storage_bucket_object" "archive" {
  # Append a checksum so source changes trigger a redeployment
  name = format(
    "%s#%s",
    local.cloud_function.archive_filepath,
    data.archive_file.function_archive.output_md5
  )
  bucket = google_storage_bucket.function_archive.name
  source = data.archive_file.function_archive.output_path

  content_disposition = "attachment"
  content_encoding    = "gzip"
  content_type        = "application/zip"
}

# Cloud Function that uses the uploaded source code archive
resource "google_cloudfunctions_function" "some_cloud_function" {
  name        = local.cloud_function.name
  description = local.cloud_function.description

  project               = local.project_id
  region                = local.region
  source_archive_bucket = google_storage_bucket_object.archive.bucket
  source_archive_object = google_storage_bucket_object.archive.name
  service_account_email = google_service_account.cloud_function_sa.email

  runtime      = local.cloud_function.runtime
  entry_point  = local.cloud_function.entry_point
  trigger_http = true
}
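One thing the snippet above doesn’t cover is who is allowed to call the function. If you want the HTTP endpoint to be publicly invocable, you’d additionally need an IAM binding along these lines (a sketch - only do this if unauthenticated access is acceptable for your use case):

```hcl
# Allow unauthenticated invocations of the HTTP-triggered function.
# Restrict the member (or drop this resource) to keep the function private.
resource "google_cloudfunctions_function_iam_member" "invoker" {
  project        = google_cloudfunctions_function.some_cloud_function.project
  region         = google_cloudfunctions_function.some_cloud_function.region
  cloud_function = google_cloudfunctions_function.some_cloud_function.name

  role   = "roles/cloudfunctions.invoker"
  member = "allUsers"
}
```

After a terraform apply, the function’s URL is available via the resource’s https_trigger_url attribute.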

