Manage Google Cloud Service Accounts with Vault and Terraform


Before I proceed with the guide, I would like to tip my hat to all of the people around the world who continue to produce and share tools that make the work of teams across the globe easier and less laborious. You are doing an amazing job and thank you!!

Back to the guide, I could not do a better introduction of what we are going to do today than an excerpt I got from Google Documentation. Here it goes:

The Google Cloud Vault secrets engine dynamically generates Google Cloud service account keys and OAuth tokens based on IAM policies. This enables users to gain access to Google Cloud resources without needing to create or manage a dedicated service account.

The benefits of using this secrets engine to manage Google Cloud IAM service accounts are:

  • Automatic cleanup of GCP IAM service account keys: Each Service Account key is associated with a Vault lease. When the lease expires (either during normal revocation or through early revocation), the service account key is automatically revoked.
  • Quick, short-term access: Users do not need to create new GCP Service Accounts for short-term or one-off access (such as batch jobs or quick introspection).
  • Multi-cloud and hybrid cloud applications: Users authenticate to Vault using a central identity service (such as LDAP) and generate GCP credentials without the need to create or manage a new Service Account for that user.

Well, I really hope you now understand what we are going to do here.

Pre-requisites:

You will need the following for everything to work well.

  • Running Vault server
  • Terraform
  • GCP Subscription
  • Bitbucket or any other CI/CD implementation if you want pipelines

Most secrets engines must be configured in advance before they can perform their functions. These steps are usually completed by an operator or configuration management tool. Here are the steps to get us up and running.

Step 1: Generate GCP Credentials

Generate the credentials JSON file and copy it to the Vault server. We are going to use it to configure our secrets engine so that it can communicate with GCP without any qualms. These are the permissions Vault requires:

iam.serviceAccounts.create
iam.serviceAccounts.delete
iam.serviceAccounts.get
iam.serviceAccounts.list
iam.serviceAccounts.update
iam.serviceAccountKeys.create
iam.serviceAccountKeys.delete
iam.serviceAccountKeys.get
iam.serviceAccountKeys.list
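If you prefer to script this step, a custom role carrying exactly these permissions can be created and bound to a dedicated service account with gcloud. This is a sketch; the project ID, role ID, and service account name below are placeholders you should adjust:

```shell
# Create a dedicated service account for Vault (names are placeholders)
gcloud iam service-accounts create vault-secrets-admin \
  --project=geeksproject \
  --display-name="Vault GCP secrets engine"

# Create a custom role carrying only the permissions Vault needs
gcloud iam roles create vaultSecretsAdmin --project=geeksproject \
  --title="Vault Secrets Admin" \
  --permissions=iam.serviceAccounts.create,iam.serviceAccounts.delete,iam.serviceAccounts.get,iam.serviceAccounts.list,iam.serviceAccounts.update,iam.serviceAccountKeys.create,iam.serviceAccountKeys.delete,iam.serviceAccountKeys.get,iam.serviceAccountKeys.list

# Bind the custom role and download the credentials JSON file
gcloud projects add-iam-policy-binding geeksproject \
  --member="serviceAccount:vault-secrets-admin@geeksproject.iam.gserviceaccount.com" \
  --role="projects/geeksproject/roles/vaultSecretsAdmin"
gcloud iam service-accounts keys create mycredentials.json \
  --iam-account=vault-secrets-admin@geeksproject.iam.gserviceaccount.com
```

Either route works; what matters is that the resulting JSON key belongs to an identity holding the permissions listed above.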

After you have the credentials file with the correct permissions/roles, copy it to the Vault server:

scp mycredentials.json [email protected]:~

Step 2: Enable the Google Cloud secrets engine

Log in to the Vault server:

$ ssh [email protected]

Then run the following commands to configure/enable the Google Cloud secrets engine

$ vault secrets enable gcp
Success! Enabled the gcp secrets engine at: gcp/

By default, the secrets engine will mount at the name of the engine. To enable the secrets engine at a different path, use the -path argument.
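For example, to mount the engine at a custom path such as gcp-geeks/ (the path name here is purely illustrative):

```shell
$ vault secrets enable -path=gcp-geeks gcp
```

If you do this, the later commands would target gcp-geeks/config and gcp-geeks/roleset/... instead of the default gcp/ paths.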

Configure the secrets engine with account credentials, or leave blank or unwritten to use Application Default Credentials.

$ vault write gcp/config [email protected]
Success! Data written to: gcp/config
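The same config endpoint also accepts default and maximum lease TTLs (in seconds) for the credentials the engine generates; the values below are examples, not recommendations:

```shell
$ vault write gcp/config [email protected] ttl=1800 max_ttl=3600
```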

Step 3: Configure rolesets or static accounts.

Now our Vault server and gcp secrets engine have been successfully configured, as easily as that. What remains is to figure out how to add and delete service accounts with as much finesse as we can afford. We checked our coffers and Terraform notes were glaring at us with glee. We could not wait to pick them up!

So in this section, we shall leverage the power of Terraform to offload our burden of logging into Vault all the time. Let us get to it already!

First let us create the following directory structure:

├── vault-gcp-service-accounts
│   └── modules
│       └── rolesets

Simply do the following

$ mkdir -p vault-gcp-service-accounts/modules/rolesets

Then within the "vault-gcp-service-accounts" directory, create the following files with the contents illustrated below.

First file:

$ vim main.tf

provider "vault" {
  token   = var.geeks_token
  alias   = "geeks_rolesets"
  address = var.geeks_vault_address
}

## Module for Adding RoleSets

module "rolesets" {
  source = "./modules/rolesets"
  providers = {
    vault = vault.geeks_rolesets
  }
}

## This is the bucket where tfstate will be stored in GCP. Make sure you create it beforehand.
terraform {
  backend "gcs" {
    bucket = "geeks-terraform-state-bucket"
    prefix = "terraform/vault_rolesets_state"
  }
}

Second file:

$ vim vars.tf

variable "geeks_token" {
  type        = string
  default     = "s.6fwP0qPixIajVlD3b0oH6bhy"
  description = "This is a dummy token"
}

variable "geeks_vault_address" {
  type        = string
  default     = "https://vault.computingpost.com:8200"
  description = "This is the Vault address"
}

Save the files, then navigate into the "rolesets" directory and add the following two files as well.

First file:

$ cd modules/rolesets
$ vim provider.tf

terraform {
  required_providers {
    vault = {
      source  = "hashicorp/vault"
      version = "~> 3.0"
    }
  }
}

Second file: This is the file where new rolesets will be added and removed

$ vim rolesets.tf

locals {
  project = "geeksproject"
}

## Example/test service account/roleset creation

resource "vault_gcp_secret_roleset" "roleset" {
  backend      = "gcp"
  roleset      = "project_viewer"
  secret_type  = "access_token"
  project      = local.project
  token_scopes = ["https://www.googleapis.com/auth/cloud-platform"]

  binding {
    resource = "//cloudresourcemanager.googleapis.com/projects/${local.project}"

    roles = [
      "roles/viewer",
    ]
  }
}

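Rolesets are not limited to OAuth access tokens. If a workload needs an actual service account key file instead, a second roleset with secret_type set to "service_account_key" can live in the same file. This is a sketch; the roleset name is made up, and note that token_scopes only applies to access_token rolesets, so it is omitted here:

```terraform
resource "vault_gcp_secret_roleset" "key_roleset" {
  backend     = "gcp"
  roleset     = "project_viewer_key"
  secret_type = "service_account_key"
  project     = local.project

  binding {
    resource = "//cloudresourcemanager.googleapis.com/projects/${local.project}"

    roles = [
      "roles/viewer",
    ]
  }
}
```

Reading such a roleset yields a freshly minted service account key that Vault revokes when the lease expires, which is exactly the automatic cleanup described in the introduction.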
Save the two files and now we are ready to rock and roll.

Step 4: Initialize Terraform

As you already know, we shall simply navigate to the root directory and initialise Terraform so that all provider binaries are installed. Run the following command within the "vault-gcp-service-accounts" folder.

$ terraform init

Give it some seconds to install all of the binaries. In case there are any errors here, check them out and fix.

Step 5: Run a plan

After the previous step is done, we can now "terraform plan" and see what we have. First, we need a valid token to enable Terraform to authenticate against our Vault server. Simply export it as an environment variable or, if you do not care about security, you can paste it into the "vars.tf" file we created in the root directory, that is "vault-gcp-service-accounts".

$ export TF_VAR_geeks_token=

Then run the magic command:

$ terraform plan -out create_roleset -var geeks_token=$TF_VAR_geeks_token

Terraform used the selected providers to generate the following execution
plan. Resource actions are indicated with the following symbols:
  + create

Terraform will perform the following actions:

  # module.rolesets.vault_gcp_secret_roleset.roleset will be created
  + resource "vault_gcp_secret_roleset" "roleset" {
      + backend               = "gcp"
      + id                    = (known after apply)
      + project               = "geeksproject"
      + roleset               = "project_viewer"
      + secret_type           = "access_token"
      + service_account_email = (known after apply)
      + token_scopes          = [
          + "https://www.googleapis.com/auth/cloud-platform",
        ]

      + binding {
          + resource = "//cloudresourcemanager.googleapis.com/projects/geeksproject"
          + roles    = [
              + "roles/viewer",
            ]
        }
    }

Plan: 1 to add, 0 to change, 0 to destroy.
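Once the plan looks right, you can apply it and exercise the roleset straight from the Vault CLI. Reading the roleset's token endpoint should hand back a short-lived OAuth access token; the path below assumes the roleset name we defined in rolesets.tf:

```shell
$ terraform apply create_roleset
$ vault read gcp/roleset/project_viewer/token
```

Each read generates fresh credentials tied to a Vault lease, so nothing long-lived ever leaves the server.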

Step 6: Knit it all together with Bitbucket Pipelines

Now that we are okay with setting it up locally, it is clear that we can knit it all together using our good Bitbucket Pipelines. Log in to Bitbucket and create a repository. Before leaving, create workspace environment variables to hold the Vault token as well as the base64-encoded GCP API JSON we shall be using. Once that is done, clone the repository and add the directories and files we already have. After you are done, create "bitbucket-pipelines.yml" at the root of your repository.

$ vim bitbucket-pipelines.yml

image: penchant/cf-terraform:latest
pipelines:
  branches:
    main:
      - step:
          name: Deploy to Vault and GCP
          deployment: production
          script:
            - cd vault-gcp-service-accounts
            - echo $GCLOUD_API_KEYFILE | base64 -d  > ./your-gcp-cloud-api-key.json
            - gcloud auth activate-service-account --key-file your-gcp-cloud-api-key.json
            - export GOOGLE_APPLICATION_CREDENTIALS=your-gcp-cloud-api-key.json
            - terraform init
            - terraform plan -out create_roleset -var geeks_token=$TF_VAR_geeks_token
            #- terraform force-unlock -force 1636379553679123
            - terraform apply -auto-approve create_roleset
          services:
            - docker

And we are pretty much done with it. Git add, git commit, and git push. Create a pull request, then merge once your organization's review process has run its course.

Last strike

It is another successful guide and we hope it is going to be helpful to those who wish to do the same kind of thing presented here. For improvements, kindly comment so that we can all grow together. Finally, I cannot exit without sending out my gratitude and special thanks to all who continue to support us and for all of the good and encouraging comments we continue to receive. Another tip of the hat to you all. Cheers!!
