Integrate Netcore in Google BigQuery
Learn to integrate Netcore's CE dashboard with Google BigQuery.
The Netcore and Google BigQuery integration streamlines data transfer, giving Netcore CE users a fast and efficient way to export customer data from campaigns and journeys and to gain deeper insight into ad campaign performance.
In this integration guide, you learn to integrate your Netcore CE dashboard with Google BigQuery.
Prerequisites
Ensure the following prerequisites are fulfilled to enable this integration.
- Ensure you have a Google Cloud Storage (GCS) account with permissions to create a new bucket and a service account.
- Create a service account for Netcore.
- Navigate to IAM & Admin in the Google Cloud Console, then select Service Accounts > Create Service Account. Provide a name and a description for the service account.
- Click Create and continue, select Create a new private key, and choose the JSON format.
- Click Create to download the JSON key file to your system.
- Grant permissions to your bucket using the service account's email address, which you can copy from the service account's details page.
- Navigate to the Cloud Storage Console and open the bucket where you want to export events.
- Select Add Members under the Permissions tab.
- Provide the service account's email in the New Principals field.
- Assign the role of Cloud Storage > Storage Admin and Save.
- Navigate to the Google Cloud Console to activate the Cloud Storage API.
- Select APIs & Services > Enable APIs and Services > Cloud Storage.
- Click Enable to grant your project access to the Cloud Storage API. (A quick verification sketch follows this list.)
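Once the prerequisites are in place, you can optionally sanity-check them from your own machine. The sketch below is a minimal Python example (not part of the official setup); the key file name `service-account-key.json` and bucket name `your-export-bucket` are assumed placeholders, and it requires the `google-cloud-storage` package.

```python
# Optional check: confirm the downloaded key works and the service account
# can reach the export bucket. File and bucket names are placeholders.
from google.cloud import storage

client = storage.Client.from_service_account_json("service-account-key.json")
bucket = client.bucket("your-export-bucket")

# Listing a few objects verifies both the Storage Admin grant and that the
# Cloud Storage API is enabled for the project.
print("Bucket reachable:", bucket.exists())
for blob in client.list_blobs(bucket, max_results=5):
    print(blob.name)
```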
Enable Google Cloud Storage Setup
Follow the steps below to enable Google Cloud Storage Setup on Netcore's CE dashboard.
- Log in to the Netcore CE dashboard and navigate to Settings > Integrations > Google Cloud Storage (GCS).
- Click Integrate. The Integration modal appears.
- Provide the Connection Name, Bucket Name, and Bucket Region (optional).
- Upload the previously downloaded Service Account Credential JSON file. The system automatically verifies your client ID and email.
- Select the events you want Netcore to export.
- Click Connect to complete the integration process.
Enable Netcore Integration in BigQuery
The following keys are required for the integration. Refer to the table below for details on their usage; a worked example follows the table.
| Keys | Description |
|---|---|
| `Service-Account-Key.json` | Add the location of the JSON file containing your GCP service account key credentials. |
| `Your-project-id` | Add your GCP project ID. |
| `Your-dataset-id` | Add the ID of the dataset where the data has to be uploaded. |
| `Your-table-id` | Add the ID of the table where the data needs to be uploaded. |
| `Data.csv` | Add the path to the CSV file that contains the data to be uploaded. |
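To show how these keys fit together, here is a minimal sketch, assuming the Python BigQuery client library (`google-cloud-bigquery`) is installed; every identifier is the placeholder from the table above, not a real value.

```python
# Minimal sketch: load Data.csv into a BigQuery table using the keys above.
# All identifiers (key file, project, dataset, table, CSV path) are placeholders.
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json(
    "Service-Account-Key.json", project="Your-project-id"
)

table_id = "Your-project-id.Your-dataset-id.Your-table-id"
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # assumes the CSV has a header row
    autodetect=True,       # let BigQuery infer the schema
)

with open("Data.csv", "rb") as source_file:
    load_job = client.load_table_from_file(source_file, table_id, job_config=job_config)

load_job.result()  # wait for the load job to finish
print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```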
Once data starts flowing into GCS, proceed to BigQuery.
- Ensure your Google Cloud account has the Project Owner role for BigQuery access. Refer to the BigQuery user guide for IAM role details.
- Enable GCS Data Exports for your account and include at least one attribute from device_attributes or user_attributes in the export format.
Follow these steps to schedule automatic data ingestion from your Google Cloud Storage (GCS) buckets into Google BigQuery using the BigQuery Data Transfer Service for Cloud Storage:
- Ensure your GCS setup is completed as instructed above.
- After data enters GCS, proceed with the following steps to define the schema for your imports.
Follow these steps before initiating the data transfer:
- Verify access to your assigned buckets by downloading a `.json` file to your device (see the sketch after this list).
- Uncompress the file to check its contents.
- Enable data transfer in BigQuery.
- Create a dataset to store your data.
- Retrieve the Cloud Storage Uniform Resource Identifier (URI).
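A programmatic version of these checks might look like the sketch below; the key file, bucket, project, and dataset names (`service-account-key.json`, `your-export-bucket`, `Your-project-id`, `netcore_exports`) are assumed placeholders.

```python
# Sketch of the pre-transfer checks: download one export file, create the
# destination dataset, and build the gs:// URI used later in BigQuery.
from google.cloud import bigquery, storage

storage_client = storage.Client.from_service_account_json("service-account-key.json")
bucket = storage_client.bucket("your-export-bucket")

# Verify access by downloading one exported file to the local machine
# (assumes the bucket already contains at least one export).
blob = next(iter(storage_client.list_blobs(bucket, max_results=1)))
blob.download_to_filename("sample-export.json")

# Create (or reuse) the dataset that will hold the imported data.
bq_client = bigquery.Client.from_service_account_json(
    "service-account-key.json", project="Your-project-id"
)
dataset = bq_client.create_dataset("netcore_exports", exists_ok=True)

# This is the Cloud Storage URI you will point BigQuery at.
gcs_uri = f"gs://{bucket.name}/{blob.name}"
print("Dataset:", dataset.dataset_id, "| Source URI:", gcs_uri)
```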
Set Permissions in BigQuery
- You need permissions to load data into a new or existing table or partition.
- If loading data via Cloud Storage, ensure you have access to the relevant bucket.
Make sure you have set up the necessary permissions in your BigQuery console.
- `bigquery.transfers.update`: Needed to create the transfer.
- `bigquery.datasets.get` and `bigquery.datasets.update`: Required on the target dataset.
- `bigquery.admin`: A predefined IAM role that includes the `bigquery.transfers.update`, `bigquery.datasets.update`, and `bigquery.datasets.get` permissions. Click here to know more about IAM roles in the BigQuery Data Transfer Service.
- `storage.objects.get`: Needed to access the specific Cloud Storage bucket you transfer from.
- `storage.objects.list`: Required if you use a URI wildcard.
- `storage.objects.delete`: Necessary if you want to delete source files after a successful transfer. Click here to know more.

You can check the Cloud Storage side of these permissions programmatically, as shown in the sketch below.
After you set the permissions, create a new table in BigQuery to store your data. Follow the steps given below:
Follow the schema guidelines detailed here.
- Navigate to the BigQuery console > Google Cloud Storage.
- Choose the `.jsonl` file from the GCS bucket.
- Provide the appropriate Project, Dataset, and Table details.
- Navigate to Schema > Auto detect.
- Adjust other specifications as needed, then click Create Table and verify the data load (a scripted equivalent follows this list).
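For reference, the same table can be created and loaded from the bucket with the Python client, using schema auto-detection. The table ID, dataset, and `gs://` path below are assumed placeholders.

```python
# Create/load the BigQuery table from the exported .jsonl file in GCS,
# letting BigQuery auto-detect the schema. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json(
    "service-account-key.json", project="Your-project-id"
)

table_id = "Your-project-id.netcore_exports.campaign_events"
gcs_uri = "gs://your-export-bucket/exports/events.jsonl"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # equivalent to Schema > Auto detect in the console
)

load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
load_job.result()
print("Loaded rows:", client.get_table(table_id).num_rows)
```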
Follow the steps given below to verify the data load:
- Navigate to the Explorer dashboard.
- Check the schema and preview the data to ensure it has been loaded correctly (a scripted preview follows this list).
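Outside the console, a quick preview achieves the same check. This sketch assumes the placeholder key file and the table created in the previous step.

```python
# Preview a few rows to confirm the load succeeded. Table ID is a placeholder.
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("service-account-key.json")
table = client.get_table("Your-project-id.netcore_exports.campaign_events")

print("Schema:", [field.name for field in table.schema])
for row in client.list_rows(table, max_results=5):
    print(dict(row.items()))
```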
Set Up the Transfer
Follow the instructions here to configure the BigQuery Data Transfer Service for Cloud Storage.
- Adjust the configurations as needed and save the transfer.
By following the steps above, you ensure that your table is set up correctly and that your data is loaded and verified in BigQuery. A client-library sketch of an equivalent transfer configuration is shown below.
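As an alternative to the console flow, the transfer can also be created with the BigQuery Data Transfer Service client library. This sketch is only illustrative: it assumes the `google-cloud-bigquery-datatransfer` package and the placeholder project, dataset, bucket, and table names used earlier, and the `params` values should be adjusted to your own file format and path layout.

```python
# Sketch: schedule a daily Cloud Storage -> BigQuery transfer with the
# BigQuery Data Transfer Service. Project, dataset, bucket, and table names
# and the parameter values are placeholders; adjust them to your setup.
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = client.common_project_path("Your-project-id")

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="netcore_exports",
    display_name="Netcore GCS export ingest",
    data_source_id="google_cloud_storage",
    schedule="every 24 hours",
    params={
        "data_path_template": "gs://your-export-bucket/exports/*.jsonl",
        "destination_table_name_template": "campaign_events",
        "file_format": "JSON",
        "write_disposition": "APPEND",
    },
)

created = client.create_transfer_config(parent=parent, transfer_config=transfer_config)
print("Created transfer:", created.name)
```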
Export Reports
Follow these steps to export reports for campaigns and journeys.
1. Campaign Report Export
- Log in to the Netcore CE dashboard and navigate to Campaigns > Download Reports.
- Select the specifications for the type of data to be exported.
- Choose the method of sending data from the dropdown menu, select GCS and enter the Connection Name.
2. Journey Report Export
- Log in to the Netcore CE dashboard and navigate to Journeys > Download Reports.
- Select the specifications for the type of data to be exported.
- Choose the method of sending data from the dropdown menu, select GCS and enter the Connection Name.
FAQs
Here are the most frequently asked questions about Google BigQuery and Netcore integration.
Why use Google Cloud Storage (GCS) instead of AppFlow for BigQuery integration?
Google Cloud Storage (GCS) offers more robust features and capabilities than AppFlow for BigQuery integration, because exports land in a bucket that BigQuery can load from natively, without a third-party connector.
Why do GCS and BigQuery work well together?
GCS and BigQuery are both part of Google Cloud, so they integrate natively: BigQuery loads data directly from GCS buckets, which keeps setup simple and transfers fast.
Why choose GCS for BigQuery integration over AppFlow?
Google Cloud Storage (GCS) provides superior capabilities for this use case and is the widely adopted, first-party path for loading data into BigQuery.