To configure BigQuery, create a service account with the BigQuery User role and grant the necessary permissions at the project and dataset levels. Create a staging bucket in Cloud Storage in a location matching the BigQuery dataset, and assign the Storage Admin role to the service account. Then connect Gong to BigQuery by entering the project details, dataset, bucket name, region, service account email, and JSON key in Gong settings. This enables efficient data loading and management in BigQuery.
# BigQuery Configuration Steps
Step 1: Create service account in BigQuery project
- In the GCP console, navigate to the IAM & Admin menu, click into the Service Accounts tab, and click Create service account at the top of the menu.
- In the first step, name the user and click Create and Continue.
- In the second step, grant the user the role BigQuery User.
📘 Understanding the BigQuery User role
The BigQuery User role is a predefined IAM role that allows for the creation of new datasets, with the creator granted BigQuery Data Owner on the new dataset.
If you would like to avoid using the BigQuery User role, the minimum required permissions are:
- On the Project level:
bigquery.datasets.create
bigquery.datasets.get
bigquery.jobs.create
Note: These minimum permissions assume that the dataset has not been created ahead of time. If you create the dataset ahead of time, see the following note.
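If you prefer to grant only these minimum project-level permissions, one way is a custom IAM role. The following is a sketch using the gcloud CLI; the project ID, role ID, and service account email are placeholders, so substitute your own:

```shell
# Create a custom role carrying only the minimum project-level permissions.
# "my-gcp-project", "gongMinimalBigQuery", and the service account email
# below are hypothetical names.
gcloud iam roles create gongMinimalBigQuery \
  --project=my-gcp-project \
  --title="Gong Minimal BigQuery" \
  --permissions=bigquery.datasets.create,bigquery.datasets.get,bigquery.jobs.create

# Bind the custom role to the service account at the project level.
gcloud projects add-iam-policy-binding my-gcp-project \
  --member="serviceAccount:gong-loader@my-gcp-project.iam.gserviceaccount.com" \
  --role="projects/my-gcp-project/roles/gongMinimalBigQuery"
```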
📘 Loading data into a Dataset that already exists
By default, a new dataset (with a name you provide) will be created in the BigQuery project. If instead you create the dataset ahead of time, you will need to grant the BigQuery Data Owner role to this Service Account at the dataset level.
In BigQuery, click on the existing dataset. In the dataset tab, click Sharing, then Permissions. Click Add Principals, enter the Service Account name, and add the Role: BigQuery Data Owner.
Specifically, the minimum permissions required can be granted to the principal and applied to the Dataset:
bigquery.tables.create
bigquery.tables.delete
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.list
bigquery.tables.update
bigquery.tables.updateData
bigquery.routines.get
bigquery.routines.list
On the Project level, you will still need bigquery.jobs.create, but you will not need bigquery.datasets.create or bigquery.datasets.get.
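The dataset-level grant described above can also be made from the command line with the bq tool. This is a sketch with hypothetical names; replace the project, dataset, and service account email with your own:

```shell
# Grant BigQuery Data Owner to the service account on an existing dataset.
# "my-gcp-project", "gong_dataset", and the service account email are
# placeholders for illustration.
bq add-iam-policy-binding \
  --member="serviceAccount:gong-loader@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataOwner" \
  my-gcp-project:gong_dataset
```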
- In the third step (Grant users access to this service account step), click Done.
- Back in the Service accounts menu, click the Actions dropdown next to the newly created service account and click Manage keys.
- Click Add key and then Create new key.
- Select the JSON Key type and click Create and make note of the key that is generated.
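The console steps in Step 1 can also be performed with the gcloud CLI. The project ID and account name below are hypothetical; substitute your own:

```shell
# Create the service account ("gong-loader" is a placeholder name).
gcloud iam service-accounts create gong-loader \
  --project=my-gcp-project \
  --display-name="Gong Loader"

# Grant the BigQuery User role at the project level.
gcloud projects add-iam-policy-binding my-gcp-project \
  --member="serviceAccount:gong-loader@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.user"

# Create and download a JSON key for the account; keep this file secure,
# as you will paste its contents into Gong in Step 3.
gcloud iam service-accounts keys create gong-key.json \
  --iam-account=gong-loader@my-gcp-project.iam.gserviceaccount.com
```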
Step 2: Create a staging bucket
- Log into the Google Cloud Console and navigate to Cloud Storage. Click Create to create a new bucket.
- Choose a name for the bucket. Click Continue. Select a location for the staging bucket. Make a note of both the name and the location (region).
📘 Choosing a location (region)
The location you choose for your staging bucket must match the location of your destination dataset in BigQuery. When creating your bucket, be sure to choose a region in which BigQuery is supported (see BigQuery regions).
- If the dataset does not exist yet, the dataset will be created for you in the same region where you created your bucket.
- If the dataset does exist, the dataset region must match the location you choose for your bucket.
- Click Continue and select the remaining options according to your preferences. Once the options have been filled out, click Create.
- On the Bucket details page that appears, click the Permissions tab, and then click Add.
- In the New principals dropdown, add the Service Account created in Step 1, select the Storage Admin role, and click Save.
📘 Understanding the Storage Admin role
The Storage Admin role is a predefined IAM role that allows for the creation of new storage objects and the reading of those objects when loading into BigQuery. If you would like to avoid using the Storage Admin role, the minimum required permissions are:
- storage.buckets.get
- storage.objects.create
- storage.objects.delete
- storage.objects.get
- storage.objects.list
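The bucket creation and permission grant in Step 2 can likewise be done from the command line. The bucket name, region, project, and service account email below are placeholders; remember that the bucket location must match (or determine) the dataset location:

```shell
# Create the staging bucket in a BigQuery-supported region
# ("my-gong-staging" and "us-central1" are example values).
gcloud storage buckets create gs://my-gong-staging \
  --project=my-gcp-project \
  --location=us-central1

# Grant Storage Admin on the bucket to the service account from Step 1.
gcloud storage buckets add-iam-policy-binding gs://my-gong-staging \
  --member="serviceAccount:gong-loader@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/storage.admin"
```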
Step 3: Connect Gong to BigQuery
Once you have configured BigQuery, use the fields to set up the connection in Gong:
- In Gong, go to Company settings > Data Cloud > BigQuery.
- Enter the following details:
- Project ID
- Dataset
- Bucket name
- Bucket region
- Service account email
- Service account JSON key
- Click Connect. It may take a few minutes to complete the connection.