
Setting up a GCP Integration

CloudQuery Platform authenticates with GCP through Service Accounts using JSON key files.

The GCP integration supports additional authentication methods — including Workload Identity and Workload Identity Federation — when used with the CloudQuery CLI. CloudQuery Platform does not yet support these methods through the UI. See the GCP integration documentation for the full list of authentication options.

Prerequisites

  • A CloudQuery Platform account with admin access
  • A GCP project with permissions to create service accounts and manage IAM
  • Access to the GCP Console

Step 1: Set up a service account

CloudQuery uses a service account to read resources from your GCP environment. Follow these steps to set up a new service account with read-only access:

  1. Open the GCP Service Accounts page
  2. Select the project to create the service account in (you can assign access to other projects later)
  3. Click Create Service Account
  4. Enter the details:
    1. Service account display name, e.g. CloudQuery Readonly
    2. Service account ID, e.g. cloudquery-readonly
    3. A description, e.g. Service account for CloudQuery to fetch resources in GCP
    4. Click Create and Continue

Setting up a service account

  5. Under Basic, select the Viewer role for the service account.

Selecting the Viewer role

  6. Click Continue and Done.
  7. In the service accounts list, click the new service account, then go to the Keys tab. Click Add Key → Create New Key.
  8. Select JSON and click Create. This downloads a JSON key file to your computer. You need this file when configuring the integration in CloudQuery Platform.

Getting a private key

Store the JSON key file securely. It contains credentials that grant access to your GCP resources. You will upload the contents to CloudQuery Platform in Step 2, after which you can delete the local file.
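If you prefer the command line, the steps above can be sketched with the gcloud CLI. The project ID and service account name below are placeholder values, and the script only prints each command for review; remove the `echo` once you have filled in your own values:

```shell
# Placeholder values -- substitute your own before running.
PROJECT_ID="my-project"
SA_NAME="cloudquery-readonly"
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Create the service account.
echo "gcloud iam service-accounts create ${SA_NAME} \
  --project=${PROJECT_ID} --display-name='CloudQuery Readonly'"

# Grant the read-only Viewer role on the project.
echo "gcloud projects add-iam-policy-binding ${PROJECT_ID} \
  --member=serviceAccount:${SA_EMAIL} --role=roles/viewer"

# Create and download a JSON key for the account.
echo "gcloud iam service-accounts keys create service-account-key.json \
  --iam-account=${SA_EMAIL}"
```

The key file lands in the current directory as service-account-key.json, matching the file you would otherwise download from the console.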

Optional: Assign organization or folder-wide access

To sync resources across all your GCP projects, grant the Viewer role to the service account at the organization or folder level:

  1. In the GCP Console project selection screen, select your top-level organization (or folder)
  2. Go to IAM and Admin → IAM, and click Grant Access
  3. Paste the email address of the service account you created above in the New Principals text box. Assign the Viewer role.
  4. Click Save

Assign Organization or folder-wide access to the Service Account
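The same grant can be made from the command line. The organization ID and service account email below are hypothetical placeholders, and the script only prints the command (for folder-level access, use `gcloud resource-manager folders add-iam-policy-binding` with a folder ID instead):

```shell
# Hypothetical values -- replace with your numeric organization ID and
# the service account email from Step 1.
ORG_ID="123456789012"
SA_EMAIL="cloudquery-readonly@my-project.iam.gserviceaccount.com"

# Prints the grant command; remove the echo to execute it.
echo "gcloud organizations add-iam-policy-binding ${ORG_ID} \
  --member=serviceAccount:${SA_EMAIL} --role=roles/viewer"
```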

Optional: Assign access to individual projects

You can also grant access to specific projects if you don’t need organization-wide access:

  1. In the GCP Console project selection screen, select the relevant project
  2. Go to IAM and Admin → IAM, and click Grant Access
  3. Paste the email address of the service account in the New Principals text box. Assign the Viewer role.
  4. Click Save
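Per-project grants can likewise be scripted. The project IDs and email below are placeholders; the loop prints each command so you can review it before running:

```shell
SA_EMAIL="cloudquery-readonly@my-project.iam.gserviceaccount.com"  # placeholder
PROJECTS="project-a project-b"                                     # placeholder project IDs

# Grant the Viewer role on each project the integration should sync.
for PROJECT in $PROJECTS; do
  echo "gcloud projects add-iam-policy-binding ${PROJECT} \
    --member=serviceAccount:${SA_EMAIL} --role=roles/viewer"
done
```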

Step 2: Create the integration

  1. In CloudQuery Platform, go to Data Pipelines → Integrations. Click Create Integration and type GCP to find the GCP integration.

CloudQuery Platform Create Integration page with GCP selected from the integration search results

  2. Choose a name for your integration (e.g. GCP) and update the YAML configuration. Here is a complete example:

kind: source
spec:
  name: gcp
  path: cloudquery/gcp
  registry: cloudquery
  version: "v22.0.0"
  tables:
    - gcp_compute_instances
    - gcp_storage_buckets
  spec:
    service_account_key_json: |
      ${SERVICE_ACCOUNT_KEY_JSON}

The pipe symbol (|) followed by a newline is required for the service_account_key_json field. It tells YAML to treat the value as a literal block scalar, so the braces, colons, and quotes in the JSON key are not misinterpreted as YAML syntax.

The tables list above is an example. Customize it to include the tables you need. See the full GCP table list for all available tables. Use ["*"] to sync all tables.

  3. Add a new secret with Key SERVICE_ACCOUNT_KEY_JSON.
  4. In a text editor, open the JSON key file you downloaded from GCP in Step 1, and copy-paste the entire contents into the Value field:

Secrets section with the service account key JSON secret and credentials pasted into the value field

  5. Click Test Connection to verify the setup.

After a successful test connection, you can safely delete the JSON key file from your local disk. The credentials are stored securely in CloudQuery Platform.

What gets synced

The GCP integration can sync hundreds of tables across GCP services. Some of the most commonly used tables include:

Category | Tables | Description
Compute | gcp_compute_instances, gcp_compute_disks | VM instances, persistent disks
Storage | gcp_storage_buckets | Cloud Storage buckets
Networking | gcp_compute_networks, gcp_compute_firewalls, gcp_compute_subnetworks | VPCs, firewall rules, subnets
Containers | gcp_container_clusters | GKE clusters
Identity | gcp_iam_service_accounts | IAM service accounts

See the full GCP table list for all available tables.

Verify the integration

After your first sync completes, open the SQL Console and run these queries to confirm data arrived:

-- Count synced compute instances
SELECT count(*) FROM gcp_compute_instances;

-- List synced GCP projects
SELECT DISTINCT project_id FROM gcp_compute_instances;

-- View storage buckets
SELECT project_id, name, location FROM gcp_storage_buckets LIMIT 10;

You can also browse your GCP resources in the Asset Inventory under the Compute, Storage, Networking, and other categories.

Troubleshooting

Issue | Cause | Fix
Invalid JSON key | Malformed or incomplete JSON | Verify you copied the entire contents of the JSON key file, including the opening { and closing }. Re-download the key from GCP if needed.
Permission denied | Service account lacks Viewer role | Verify the service account has the Viewer role on the project, folder, or organization you want to sync.
Project not found | Service account not granted access | The service account can only access projects where it has been granted IAM permissions. Grant the Viewer role on each project, or at the organization/folder level.
API not enabled | Required GCP API is disabled | Some resources require specific APIs to be enabled in the project (e.g. Compute Engine API, Cloud Storage API). Enable them in the GCP API Library.
No data for some resources | Tables not specified | Check that the relevant tables are listed in the tables field of your YAML configuration.
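For the "Invalid JSON key" row, you can rule out a malformed key locally before re-pasting it. A minimal sketch, assuming the key was saved as service-account-key.json:

```shell
# Prints "valid" if the file parses as JSON, "invalid" otherwise.
check_key() {
  python3 -m json.tool "$1" > /dev/null 2>&1 && echo "valid" || echo "invalid"
}

check_key "service-account-key.json"   # placeholder path
```

This only checks JSON syntax; a key that parses but was truncated will still fail the Test Connection step.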

Next steps
