Pre-Onboarding for GCP Parent Billing Accounts

Introduction

This user guide explains how to perform the pre-onboarding steps required to onboard a GCP Parent Billing Account into the platform.

Pre-onboarding

GCP Projects can be onboarded as a parent billing account. Onboarding a parent billing account allows you to discover the cost information for a parent account and all linked GCP Projects.

However, before your GCP Project can be onboarded into the platform, certain prerequisites need to be set up.

GCP Project Permissions

The following permissions must be configured in your GCP Project before onboarding.

API access:

  • Enable the Cloud Resource Manager API, Cloud Billing API, and Security Command Center API in the APIs & Services – Library screen (an optional programmatic sketch is shown below).
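
The following is a hedged, optional sketch of enabling the same three services through the Service Usage API instead of the console. It assumes the google-api-python-client and google-auth libraries and application-default credentials with permission to enable services; the service names are the standard ones, but verify them in the API Library.

from googleapiclient.discovery import build
import google.auth

# Application-default credentials; project_id may be None if no default project
# is configured, in which case set it to your GCP Project ID explicitly.
credentials, project_id = google.auth.default()

serviceusage = build("serviceusage", "v1", credentials=credentials)

# The three APIs this guide asks you to enable.
apis = [
    "cloudresourcemanager.googleapis.com",
    "cloudbilling.googleapis.com",
    "securitycenter.googleapis.com",
]

for api in apis:
    operation = serviceusage.services().enable(
        name=f"projects/{project_id}/services/{api}", body={}
    ).execute()
    print(f"Enable requested for {api}: {operation.get('name', 'done')}")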

User account permissions:

A user account must be created with the following permissions:

  • For Assessment: Project Viewer (Read only).
  • For Assessment + Governance: Project Editor (View and Modify).
  • Security Command Center Access: Either the Security Center Admin or Security Center Admin Viewer role is required for security vulnerability and compliance data.
  • Operations Governance: Logging Admin & Pub/Sub Admin.

Service account permissions:

A service account must be created with the following permissions:

  • For Assessment: Project Viewer (Read only).
  • For Assessment + Governance: Project Editor (View and Modify).
  • Security Command Center Access: Either the Security Center Admin or Security Center Admin Viewer role is required for security vulnerability and compliance data.
  • Operations Governance: Logging Admin & Pub/Sub Admin.

Parent Billing Account Prerequisites:

  • Schedule queries in the GCP BigQuery console.
  • Create a Bucket for BigQuery data transfer (under the same GCP Project where BigQuery exists).

📘

NOTE: If threat detection and advanced Security Health Analytics policies are required, then the Security Command Center Premium tier needs to be enabled.

Retrieve Onboarding Information from the GCP Console

Based on the authentication protocol being used in the platform (refer to the options below for guidance), certain information must be retrieved from the GCP console.

OAuth2 Protocol:

The following values must be generated/copied from your GCP Project and configured in the platform:

Client ID & Client Secret:

  1. Login to the GCP console.
  2. Navigate to the Credentials screen.
  3. Click Create credentials and select OAuth client ID.
  4. Select Web application in the Application type field.
  5. Specify the following URI in the Authorized redirect URIs field by clicking the Add URI button:
https://corestack.io/
  6. Click the Create button. The Client ID and Client Secret values will be displayed.

Scope:

Use the following OAuth 2.0 scope for a GCP project: https://www.googleapis.com/auth/cloud-platform.

Project ID:

The Project ID is a unique identifier for your project and is one of the values provided during onboarding.

  1. Navigate to the Projects screen in the GCP console.
  2. The Project ID will be displayed next to your GCP project in the project list.

Redirect URI:

The redirect URI configured while creating the Client ID and Client Secret must be used:

https://corestack.io/

Authorization Code:

The authorization code must be generated with user consent and required permissions.

  1. Construct a URL in the following format:
https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=<Client ID>&redirect_uri=<Redirect URI>&scope=https://www.googleapis.com/auth/cloud-platform&prompt=consent&access_type=offline
  2. Open a browser window in private mode (e.g., InPrivate or Incognito) and use it to access the above URL.
  3. Login using your GCP credentials.
  4. The page will be redirected to the Redirect URI, and the address bar will contain the Authorization Code after code=.

📘

Note:

Replace <Client ID> and <Redirect URI> in the URL format with the values retrieved in the earlier steps.

Copy these details and provide them while onboarding your GCP Project into the platform when using the OAuth2 protocol option.
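
For reference, the following is a minimal sketch of the Authorization Code steps above using only the Python standard library. The Client ID and Redirect URI placeholders are the values created earlier, and the query parameters match the URL format shown in step 1.

from urllib.parse import urlencode, urlparse, parse_qs

client_id = "<Client ID>"              # value created in the Client ID & Client Secret steps
redirect_uri = "https://corestack.io/"

# Build the consent URL from step 1.
params = {
    "response_type": "code",
    "client_id": client_id,
    "redirect_uri": redirect_uri,
    "scope": "https://www.googleapis.com/auth/cloud-platform",
    "prompt": "consent",
    "access_type": "offline",
}
consent_url = "https://accounts.google.com/o/oauth2/auth?" + urlencode(params)
print("Open this URL in a private browser window and log in:")
print(consent_url)

# After login, the browser lands on the Redirect URI with ?code=<Authorization Code>.
# Paste that final URL here to extract the code.
redirected_url = input("Paste the redirected URL: ")
auth_code = parse_qs(urlparse(redirected_url).query).get("code", ["<not found>"])[0]
print("Authorization Code:", auth_code)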

Service Account Protocol:

A service account must be created in your GCP Project. Then, you need to create a service account key and download it as a JSON file. Also, the Project ID must be retrieved from your GCP Project.

How to Download the Credentials File (JSON):

  1. Navigate to the Credentials screen.
  2. Click Create credentials and select Service account. The Create service account page appears.
  3. Provide the necessary details to create a service account: Name, ID, Description.
  4. Click the Create button.
  5. Click Select a role to select the required roles.
  6. Click the Continue button.
  7. Click Create key.
  8. Select JSON as the Key type.
  9. Click the Create button. A JSON key file will be downloaded.
  10. Click Done.
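
As a quick, optional sanity check, the downloaded key file can be inspected with a few lines of Python; the filename below is illustrative. The type, project_id, and client_email fields are standard in every GCP service-account key file.

import json

# Illustrative filename; use the path of the key file you just downloaded.
key_path = "service-account-key.json"

with open(key_path) as f:
    key = json.load(f)

# These fields are present in every GCP service-account key file.
print("Type:        ", key["type"])          # should be "service_account"
print("Project ID:  ", key["project_id"])    # the same Project ID you provide at onboarding
print("Client email:", key["client_email"])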

Project ID:

Refer to the steps in the Project ID topic in the OAuth2 Protocol section above.

Provide the JSON and Project ID while onboarding the GCP Project in the platform when using the Service Account protocol option.

Additional Values from the Parent Billing Account:

In addition to the prerequisites explained earlier, there are a few additional values that must be generated/copied from your GCP Parent Billing Account and configured in the platform.

Bucket Name:

  1. Login to the GCP console.
  2. Navigate to the Storage - Browser screen.
  3. Click Create bucket. The Create bucket screen appears.
  4. Provide a unique value in the Name your bucket field along with the other details required to create the bucket.
  5. Click the Create button.
  6. Copy the value provided in the Name your bucket field.
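
If you prefer to script the bucket creation, the sketch below does the same thing as the console steps above using the google-cloud-storage Python client; it assumes that library is installed and that your credentials can create buckets in the project. The Project ID, bucket name, and location are placeholders.

from google.cloud import storage

# Placeholders: replace with your Project ID and a globally unique bucket name.
project_id = "my-gcp-project"
bucket_name = "my-billing-export-bucket"

client = storage.Client(project=project_id)

# Create the bucket in the same GCP Project that hosts the BigQuery dataset.
bucket = client.create_bucket(bucket_name, location="US")
print("Created bucket:", bucket.name)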

Billing Account ID:

  1. Login to the GCP console.
  2. Navigate to the Manage Billing Accounts screen.
  3. Click My Projects. A list of projects will be displayed.
  4. Copy the Billing Account ID for the required projects.
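
As a hedged alternative to the console steps above, the Cloud Billing API can list the billing accounts visible to your credentials. The sketch assumes the google-api-python-client library and the service-account key created earlier (the file path is illustrative), and the credentials must have permission to view billing accounts.

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Illustrative key path; the credentials must be able to view billing accounts.
credentials = service_account.Credentials.from_service_account_file(
    "service-account-key.json",
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)

billing = build("cloudbilling", "v1", credentials=credentials)

# List the billing accounts visible to these credentials.
response = billing.billingAccounts().list().execute()
for account in response.get("billingAccounts", []):
    # "name" has the form "billingAccounts/XXXXXX-XXXXXX-XXXXXX";
    # the suffix after the slash is the Billing Account ID.
    print(account["name"].split("/")[-1], "-", account.get("displayName", ""))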

Set Up Cloud Billing Data Export to BigQuery:

To enable Cloud Billing data export to BigQuery, refer to the GCP configuration guide.

Provide these details in the platform for your Parent Billing Account onboarding, along with either the OAuth2 or Service Account information explained above, based on your Authentication Protocol selection.

Schedule Queries in GCP

Navigate to the Schedule Queries page in GCP by following the steps below:

  1. Login to the GCP console.
  2. Navigate to the GCP BigQuery console.
  3. On the left menu, click Schedule Queries. The schedule queries list appears.

  4. Click Create Schedule Queries located at the top of the page.
  5. Copy each schedule query (Daily Schedule Query, Monthly Schedule Query, and On-demand Schedule Query).
  6. For the Daily Schedule Query, schedule it on an hourly basis.
  7. For the Monthly Schedule Query, schedule it on the fifth of every month.
  8. For the On-demand Schedule Query, run the query in real time.
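
The steps above create the schedules in the BigQuery console. As an optional, hedged sketch, the Daily Schedule Query could also be created programmatically with the BigQuery Data Transfer API (google-cloud-bigquery-datatransfer library); the Project ID, location, and SQL placeholder below are assumptions, and the console flow above remains the documented path.

from google.cloud import bigquery_datatransfer

# Placeholders: your Project ID and the Daily Schedule Query text copied from below.
project_id = "my-gcp-project"
daily_query_sql = "<paste the Daily Schedule Query here>"

client = bigquery_datatransfer.DataTransferServiceClient()

# EXPORT DATA scripts do not need a destination dataset or table.
transfer_config = bigquery_datatransfer.TransferConfig(
    display_name="Daily Schedule Query",
    data_source_id="scheduled_query",
    params={"query": daily_query_sql},
    schedule="every 1 hours",  # the guide asks for an hourly schedule
)

created = client.create_transfer_config(
    parent=f"projects/{project_id}/locations/us",
    transfer_config=transfer_config,
)
print("Created scheduled query:", created.name)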

🚧

Note:

You need to insert your own dataset ID and bucket name in place of the <Your Complete Dataset ID> and <your_bucket_name> placeholders in the code snippets below.

Daily Schedule Query

DECLARE
  unused STRING;
DECLARE
  current_month_date DATE DEFAULT DATE_SUB(@run_date, INTERVAL 0 MONTH);
DECLARE
  cost_data_invoice_month NUMERIC DEFAULT EXTRACT(MONTH
  FROM
    current_month_date);
DECLARE
  cost_data_invoice_year NUMERIC DEFAULT EXTRACT(YEAR
  FROM
    current_month_date);
EXPORT DATA
  OPTIONS ( uri = CONCAT('gs://<your_bucket_name>/', CAST(cost_data_invoice_year AS STRING), '-', CAST(current_month_date AS STRING FORMAT('MM')), '/*.csv'),
    format='JSON',
    overwrite=True) AS
SELECT
  *, (SELECT STRING_AGG(display_name, '/') FROM B.project.ancestors) organization_list
FROM
  `<Your Complete Dataset ID>` as B
WHERE
  B.invoice.month = CONCAT(CAST(cost_data_invoice_year AS STRING), CAST(current_month_date AS STRING FORMAT('MM')))
  AND B.cost != 0.0

Monthly Schedule Query

DECLARE
  unused STRING;
DECLARE
  current_month_date DATE DEFAULT DATE_SUB(@run_date, INTERVAL 1 MONTH);
DECLARE
  cost_data_invoice_month NUMERIC DEFAULT EXTRACT(MONTH
  FROM
    current_month_date);
DECLARE
  cost_data_invoice_year NUMERIC DEFAULT EXTRACT(YEAR
  FROM
    current_month_date);
EXPORT DATA
  OPTIONS ( uri = CONCAT('gs://<your_bucket_name>/', CAST(cost_data_invoice_year AS STRING), '-', CAST(current_month_date AS STRING FORMAT('MM')), '/*.csv'),
    format='JSON',
    overwrite=True) AS
SELECT
  *, (SELECT STRING_AGG(display_name, '/') FROM B.project.ancestors) organization_list
FROM
  `<Your Complete Dataset ID>` as B
WHERE
  B.invoice.month = CONCAT(CAST(cost_data_invoice_year AS STRING), CAST(current_month_date AS STRING FORMAT('MM')))
  AND B.cost != 0.0

On-demand Schedule Query

DECLARE
  unused STRING;
DECLARE
  current_month_date DATE DEFAULT DATE_SUB(@run_date, INTERVAL <Change the Period> MONTH);
DECLARE
  cost_data_invoice_month NUMERIC DEFAULT EXTRACT(MONTH
  FROM
    current_month_date);
DECLARE
  cost_data_invoice_year NUMERIC DEFAULT EXTRACT(YEAR
  FROM
    current_month_date);
EXPORT DATA
  OPTIONS ( uri = CONCAT('gs://<your_bucket_name>/', CAST(cost_data_invoice_year AS STRING), '-', CAST(current_month_date AS STRING FORMAT('MM')), '/*.csv'),
    format='JSON',
    overwrite=True) AS
SELECT
  *, (SELECT STRING_AGG(display_name, '/') FROM B.project.ancestors) organization_list
FROM
  `<Your Complete Dataset ID>` as B
WHERE
  B.invoice.month = CONCAT(CAST(cost_data_invoice_year AS STRING), CAST(current_month_date AS STRING FORMAT('MM')))
  AND B.cost != 0.0

🚧

Note:

In the On-demand Schedule Query, replace the <Change the Period> placeholder with interval values from 1 to 3 and create the schedule three times (once for each value).

GCP Billing Impact Due to Onboarding

Refer to the GCP pricing details explained in the table below.

| Area | Module | Resources Created Through the Platform | Pricing | References |
| --- | --- | --- | --- | --- |
| CloudOps | Activity Configuration | Cloud Pub/Sub: Topic, Subscription, Sink Router | Throughput cost: free up to 10 GB for a billing account and $40 per TB beyond that. Storage cost: $0.27 per GB per month. Egress cost: depends on the egress internet traffic; costs are consistent with VPC network rates. | https://cloud.google.com/vpc/network-pricing#vpc-pricing, https://cloud.google.com/pubsub/pricing#pubsub |
| CloudOps | Alerts Configuration | Notification Channel, Alert Policies | Alert Policies: free of charge. Monitoring: $0.2580/MB for the first 150–100,000 MB; $0.1510/MB for the next 100,000–250,000 MB; $0.0610/MB above 250,000 MB. | https://cloud.google.com/stackdriver/pricing#monitoring-pricing-summary |
| SecOps | Vulnerability | Need to enable GCP Security Command Center (Standard Tier) | GCP Security Command Center (Standard Tier): free of charge. | https://cloud.google.com/security-command-center/pricing#tier-pricing |
| SecOps | Threat Management (Optional) | Need to enable GCP Security Command Center (Premium) | GCP Security Command Center (Premium): if annual cloud spend or commit is less than $15 million, the price is 5% of committed annual cloud spend or actual annualized cloud spend, whichever is higher. | https://cloud.google.com/security-command-center/pricing#tier-pricing |
| FinOps | BigQuery Billing Export | Need to enable standard export and create a dataset | Storage cost for the dataset: $0.02 per GB (first 10 GB per month is free). | https://cloud.google.com/bigquery/pricing#storage |
| FinOps | Scheduled Queries | Daily Scheduled Query, Monthly Scheduled Query, On-demand Scheduled Query | $5 per TB (first 1 TB per month is free). | https://cloud.google.com/bigquery/pricing#on_demand_pricing |
| FinOps | Transferred Billing Data | Storage Bucket | $0.02–$0.023 per GB per month. | https://cloud.google.com/storage/pricing#price-tables |
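
As a rough, illustrative example of how the FinOps list prices above add up, the calculation below uses made-up usage volumes (not measurements); actual usage will differ.

# Rough, illustrative estimate using the FinOps list prices in the table above;
# the usage volumes are made-up examples, not measurements.

# Scheduled queries: $5 per TB scanned, first 1 TB per month free.
tb_scanned = 2.5
query_cost = max(tb_scanned - 1.0, 0) * 5.0      # $7.50

# BigQuery dataset storage: $0.02 per GB, first 10 GB per month free.
dataset_gb = 60
storage_cost = max(dataset_gb - 10, 0) * 0.02    # $1.00

# Transferred billing data in the bucket: $0.02 per GB per month.
bucket_gb = 15
bucket_cost = bucket_gb * 0.02                   # $0.30

print(f"Estimated monthly FinOps cost: ${query_cost + storage_cost + bucket_cost:.2f}")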

📘

Note:

These charges are based on the GCP pricing references. Actual cost may vary based on volume and consumption.