
Sinking logs to BigQuery

GCP, BigQuery · 1 min read

I wanted to know how many times the Cloud Functions in my project are getting triggered.

  1. Go to the Google Cloud Console and open the Logs Explorer.

  2. Click Show query to see the logging query the console is currently using.


Since I wanted to transfer all the logs related to Cloud Functions, I kept only the condition that matches Cloud Functions logs.
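For reference, the standard Logs Explorer condition for Cloud Functions logs is the `resource.type` filter below; the exact default query in your project may differ, so treat this as a sketch of the condition I kept:

```
resource.type="cloud_function"
```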

  3. Click More actions (or Actions) and select Create sink.


  4. This takes you to Logging -> Logs Router. Here I filled in the sink name and, under Sink destination, selected BigQuery dataset, using Create BigQuery Dataset to make a new dataset to hold the logs.

  5. Click Create Sink.
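The same sink can also be created from the command line with `bq` and `gcloud logging sinks create`. This is only a sketch: the sink name `cloud-functions-sink` and the dataset name `gcp_logging` are assumptions based on this walkthrough, and `<<project-name>>` is a placeholder for your project ID.

```
# Create the BigQuery dataset that will receive the logs (dataset name assumed)
bq mk --dataset <<project-name>>:gcp_logging

# Route Cloud Functions logs into that dataset (sink name assumed)
gcloud logging sinks create cloud-functions-sink \
  bigquery.googleapis.com/projects/<<project-name>>/datasets/gcp_logging \
  --log-filter='resource.type="cloud_function"'
```

After creating the sink, gcloud prints the sink's writer service account; that identity needs the BigQuery Data Editor role on the dataset before logs will flow.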

  6. Go to BigQuery and run a sample query like the one below to see how many times each Cloud Function was triggered in a day.

```sql
select count(resource.labels.function_name) as invocation_total,
  resource.labels.region,
  resource.labels.function_name
from `<<project-name>>.gcp_logging.cloudfunctions_googleapis_com_cloud_functions_20220423`
where textPayload = 'Function execution started'
group by resource.labels.region, resource.labels.function_name
```

Note: the sink creates one new table per day, suffixed with the date (e.g. `_20220423`).
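Because the tables are date-sharded, you can aggregate across several days with a table wildcard and BigQuery's `_TABLE_SUFFIX` pseudo-column. A sketch, assuming the same dataset and table prefix as the query above:

```sql
select _TABLE_SUFFIX as day,
  resource.labels.function_name,
  count(*) as invocation_total
from `<<project-name>>.gcp_logging.cloudfunctions_googleapis_com_cloud_functions_*`
where textPayload = 'Function execution started'
  and _TABLE_SUFFIX between '20220401' and '20220430'
group by day, resource.labels.function_name
```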