Analyse audit logs using BigQuery

Saeed Ahmed
10 min read · Nov 25, 2024


Overview

Google Cloud services write audit logs that record administrative activity and access within your Google Cloud resources. Audit log entries help admins answer the questions “who did what, where, and when” within their Google Cloud projects. Enabling audit logs helps security, auditing, and compliance teams monitor Google Cloud data and systems for possible vulnerabilities or external data misuse.

Activate Cloud Shell

Cloud Shell offers a browser-based development and operations environment that can be accessed from anywhere. It provides command-line access to your Google Cloud resources. To activate Cloud Shell, click the “Activate Cloud Shell” icon in the upper right corner of the Google Cloud console. You might be prompted to click “Continue”. Once Cloud Shell has initialised, a message will appear indicating the Google Cloud Project ID associated with your current session.

Google Cloud’s command-line interface, `gcloud`, comes pre-installed in Cloud Shell and features tab-completion for efficient use. Before accessing Google Cloud resources, you must authorise `gcloud`. To do this, execute the command `gcloud auth list`, which will prompt a pop-up window requesting authorisation. Click “Authorize” to grant Cloud Shell the necessary permissions. Following successful authorisation, the output of the command will display your active account details.

To determine the project ID for your current Cloud Shell session, execute the command `gcloud config list project`. The output shows the active configuration details, including the project field, which displays the ID; in this walkthrough the project ID is qwiklabs-gcp-02-e5da0faa075a.
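
For reference, here is a minimal sketch of these checks in the Cloud Shell terminal; the project ID shown is the one from this walkthrough, and yours will differ:

```bash
# Confirm which account is currently authorised in this Cloud Shell session
gcloud auth list

# Confirm the project ID for the current session
gcloud config list project
# Example output:
# [core]
# project = qwiklabs-gcp-02-e5da0faa075a
```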

Task 1. Generate account activity

This task involves generating account activity by creating and deleting cloud resources, which will be recorded in Cloud Audit Logs. To achieve this, execute the following commands in the Cloud Shell terminal:
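
The exact commands from the lab are not reproduced in this article; the sketch below performs the same sequence using the Cloud Shell `$DEVSHELL_PROJECT_ID` variable and illustrative resource names and zone:

```bash
# Create a Cloud Storage bucket named after the project and upload a sample file
gcloud storage buckets create gs://$DEVSHELL_PROJECT_ID
echo "this is a sample file" > sample.txt
gcloud storage cp sample.txt gs://$DEVSHELL_PROJECT_ID

# Create a VPC network and a small VM instance attached to it
gcloud compute networks create mynetwork --subnet-mode=auto
gcloud compute instances create default-us-vm \
    --machine-type=e2-micro --zone=us-central1-a --network=mynetwork

# Delete the bucket and its contents
gcloud storage rm --recursive gs://$DEVSHELL_PROJECT_ID
```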

These commands create a storage bucket, upload a file, create a network and a virtual machine instance, and finally delete the storage bucket and its contents.

Task 2. Export the audit logs

The actions performed in the previous task generated audit logs which can be further analysed. To facilitate this analysis, these logs will be exported to a BigQuery dataset. Begin by navigating to the Logs Explorer page in the Google Cloud console. This can be accessed by clicking the Navigation menu, followed by “Logging” and then “Logs Explorer.” (Note: you may need to expand the Navigation menu options by clicking “More Products” to locate “Logging” under “Operations”.) It is important to remember that any active filters in the Logs Explorer will be applied to the exported data. To proceed, copy the following query into the Query builder:
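
The query itself is not reproduced here; based on the filter that appears later in this walkthrough, it restricts the view to the project’s Admin Activity audit log (substitute your own project ID):

```
logName = ("projects/qwiklabs-gcp-02-e5da0faa075a/logs/cloudaudit.googleapis.com%2Factivity")
```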

After running the query and observing the results in the Query results pane, the next step is to create a sink to export these logs. Below the Query editor field, click “More actions” and then “Create sink.” (Note that in smaller browser windows, the UI may display “More” instead of “More actions.”) This will open the “Create logs routing sink” dialogue. In this dialogue, configure the following settings while retaining the default values for all other options:

Within the “Create logs routing sink” dialogue, specify “AuditLogsExport” as the sink name under “Sink details” and click “Next.” For the “Sink destination,” select “BigQuery dataset” as the sink service and create a new BigQuery dataset with the ID “auditlogs_dataset.” After creating the dataset, proceed to the next step in the “Sink destination” dialogue. Uncheck the “Use Partitioned Tables” checkbox if selected and continue to the next step. Observe the pre-filled “Build inclusion filter” in the “Choose logs to include in sink” section and proceed. Finally, click “Create Sink” to create the sink and return to the Logs Explorer page.

To confirm the successful creation of the export sink, navigate to the “Log Router” section within the “Logging” navigation pane. Locate the “AuditLogsExport” sink in the “Log Router Sinks” list. For detailed information about this sink, click the “More actions” menu (three vertical dots) inline with the sink name and select “View sink details.” This will open the “Sink details” dialogue, providing an overview of the sink’s configuration. Once reviewed, click “Cancel” to close the dialogue. Going forward, all newly generated logs will be exported to BigQuery, enabling analysis using BigQuery tools. However, it’s important to note that this export process does not include existing log entries.
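
If you prefer the command line, the same sink can also be inspected from Cloud Shell; a quick check might look like this (using the sink name created above):

```bash
# List the project's log sinks and show the export sink's configuration
gcloud logging sinks list
gcloud logging sinks describe AuditLogsExport
```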

Task 3. Generate more account activity

This task focuses on generating additional audit log entries by creating and then deleting cloud resources. These new logs will then be accessed and analysed within BigQuery to glean further insights. This hands-on approach reinforces the importance of audit logs in understanding and monitoring activity within a cloud environment.

To generate additional activity for analysis within the exported audit logs, execute the provided commands. During the execution, you will be prompted to confirm the deletion of resources. Enter ‘Y’ and press ENTER to proceed. This process will specifically create two new storage buckets and delete a Compute Engine instance, generating corresponding audit log entries.
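
The lab’s commands aren’t shown here; as a sketch under the same naming assumptions used earlier, the equivalent actions would be:

```bash
# Create two new Cloud Storage buckets
gcloud storage buckets create gs://$DEVSHELL_PROJECT_ID-test1
gcloud storage buckets create gs://$DEVSHELL_PROJECT_ID-test2

# Delete the VM created earlier (answer Y when prompted)
gcloud compute instances delete default-us-vm --zone=us-central1-a
```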

After a short delay, proceed to enter the provided commands into the Cloud Shell terminal and press ENTER. This action will delete the two storage buckets that were previously created, generating corresponding audit log entries. The prompt and subsequent deletion of these resources simulate real-world cloud resource management activities, providing valuable data for analysis in the exported logs.
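
Again as a sketch, assuming the bucket names above, the clean-up commands would be:

```bash
# Remove the two buckets (and any contents) created in the previous step
gcloud storage rm --recursive gs://$DEVSHELL_PROJECT_ID-test1
gcloud storage rm --recursive gs://$DEVSHELL_PROJECT_ID-test2
```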

Task 4. Sign in as the second user

To proceed with the analysis, you’ll need to switch to a different Google Cloud account. This second account, provided in the Lab Details panel, will be used specifically for analysing the generated logs.

In the Google Cloud console, click on the user icon located in the top-right corner of the screen. From the dropdown menu, select “Add account.” Now, navigate back to the Lab Details panel and copy the credentials for “Google Cloud username 2,” which include both the username (student-00-89f09f721ad4@qwiklabs.net) and the corresponding password. Return to the Google Cloud console and paste these credentials into the “Sign in” dialogue box to access the second account.

Task 5. Analyse the Admin Activity logs

This task involves reviewing Admin Activity logs to identify and filter potentially suspicious activities. Admin Activity logs provide crucial records of API calls and administrative actions that modify resource configurations or metadata. Examples include the creation of VM instances, changes to permissions, and App Engine application deployments. These logs can be accessed through various channels like the Logs Viewer, Cloud Logging, and the Cloud SDK. Furthermore, they can be exported to platforms like Pub/Sub, BigQuery, and Cloud Storage for further analysis.

To begin, navigate to the Logs Explorer in the Google Cloud console. Click on the Navigation menu, then select “Logging” followed by “Logs Explorer.” You might need to expand the “More Products” drop-down menu within the Navigation menu to locate “Logging” under “Operations.”

Ensure the “Show query” toggle button is activated to display the Query builder field. Now, copy and paste the provided command into this field, making sure to replace the placeholder project ID in the command with your actual Google Cloud project ID. This query will filter and display relevant Admin Activity logs, enabling the identification of potentially suspicious entries.
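
The command referenced above is the same Admin Activity log filter used when the export sink was created; with this walkthrough’s project ID substituted in, it reads:

```
logName = ("projects/qwiklabs-gcp-02-e5da0faa075a/logs/cloudaudit.googleapis.com%2Factivity")
```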

Within the Query results, locate the log entry that indicates a Cloud Storage bucket deletion. Its summary field highlights the key information: storage.googleapis.com calling the storage.buckets.delete method. The deleted bucket’s name will match your project ID: qwiklabs-gcp-02-e5da0faa075a.

To filter the results further, click on the storage.googleapis.com text within this entry and select “Show matching entries.” This action refines the Query results, displaying only six entries related to Cloud Storage bucket creation and deletion. Observe that the line protoPayload.serviceName="storage.googleapis.com" is automatically added to the Query builder, filtering the results to entries specifically associated with storage.googleapis.com.

Next, click on storage.buckets.delete in one of the displayed entries and again select “Show matching entries.” This adds another line to the Query builder:

logName = (“projects/qwiklabs-gcp-02-e5da0faa075a/logs/cloudaudit.googleapis.com%2Factivity”)

protoPayload.serviceName="storage.googleapis.com"

protoPayload.methodName="storage.buckets.delete"

The Query results now show all entries related to deleted Cloud Storage buckets, demonstrating how this technique can pinpoint specific events.

To examine a specific deletion event in detail, expand a storage.buckets.delete entry by clicking the expand arrow (>) next to it. Further expand the authenticationInfo field using the expand arrow. This reveals the principalEmail field, which displays the email address of the user account responsible for the deletion. In this case, it should be the user 1 account used to generate the initial activity.

Task 6. Use BigQuery to analyse the audit logs

After generating and exporting audit logs to a BigQuery dataset, the subsequent step involves analysing these logs using BigQuery’s powerful Query editor. It is crucial to remember that Cloud Logging employs a structured approach to organise exported log entries. These entries are stored in dated tables within the designated dataset, with table names derived from the corresponding log names.

To initiate the analysis, ensure you are logged into the Google Cloud console using the designated account (`username 2: student-00-89f09f721ad4@qwiklabs.net`). Navigate to the BigQuery interface by clicking the Navigation menu and selecting the “BigQuery” option. Upon entering BigQuery, you may encounter a “Welcome to BigQuery in the Cloud Console” message box; click “Done” to proceed.

Within the Explorer pane, locate your project, identified as `qwiklabs-gcp-02-e5da0faa075a`, and expand it by clicking the adjacent arrow. This action will reveal the `auditlogs_dataset`, which houses the exported log entries ready for analysis using the Query editor.

Before diving into log analysis, it’s essential to verify that the BigQuery dataset has the necessary permissions for the export writer to store log entries correctly.

Start by clicking on the `auditlogs_dataset` dataset. In the dataset’s toolbar, click the “Sharing” dropdown menu and select “Permissions.”

This will open the “Share permission for ‘auditlogs_dataset’” page. Expand the “BigQuery Data Editor” section to confirm that the service account used for log exports is listed with appropriate permissions. The service account typically follows the format `service-xxxxxxxx@gcp-sa-logging.iam.gserviceaccount.com`. This permission is automatically assigned during log export configuration, serving as a useful confirmation.

Once verified, click “Close” to exit the “Share Dataset” window. Back in the Explorer pane, click the expander arrow next to the `auditlogs_dataset` to reveal the `cloudaudit_googleapis_com_activity` table containing the exported logs. Select this table to display its schema and details, taking a moment to review its structure and contents. This preliminary assessment provides context for the subsequent analysis within the Query editor.
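
As an optional cross-check from Cloud Shell, the exported tables and their schema can also be inspected with the `bq` command-line tool (dataset and table names as created above; the date suffix depends on when the logs were written):

```bash
# List tables in the export dataset; exported tables are date-sharded,
# e.g. cloudaudit_googleapis_com_activity_YYYYMMDD
bq ls auditlogs_dataset

# Inspect the schema of one day's table (substitute the actual date suffix)
bq show --schema --format=prettyjson \
    auditlogs_dataset.cloudaudit_googleapis_com_activity_YYYYMMDD
```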

To analyse the audit logs in BigQuery, begin by clearing any existing text within the “Untitled” tab of the query builder. Then, copy and paste the following command:

```sql
SELECT
  timestamp,
  resource.labels.instance_id,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.resourceName,
  protopayload_auditlog.methodName
FROM
  `auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
  PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) BETWEEN
    DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND CURRENT_DATE()
  AND resource.type = "gce_instance"
  AND operation.first IS TRUE
  AND protopayload_auditlog.methodName = "v1.compute.instances.delete"
ORDER BY
  timestamp,
  resource.labels.instance_id
LIMIT
  1000;
```

This query identifies users who have deleted virtual machines within the past seven days. Execute the query by clicking “Run.”

After a short processing time, BigQuery will return results showing instances of user-initiated Compute Engine virtual machine deletions within the specified timeframe. You should observe one entry corresponding to the activity performed as “user 1” in the previous tasks. Remember that BigQuery only displays activity recorded after the export was created.

Next, replace the existing query in the “Untitled” tab with the following:

```sql
SELECT
  timestamp,
  resource.labels.bucket_name,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.resourceName,
  protopayload_auditlog.methodName
FROM
  `auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
  PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) BETWEEN
    DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND CURRENT_DATE()
  AND resource.type = "gcs_bucket"
  AND protopayload_auditlog.methodName = "storage.buckets.delete"
ORDER BY
  timestamp,
  resource.labels.instance_id
LIMIT
  1000;
```

This modified query retrieves users who have deleted Cloud Storage buckets within the last seven days. Click “Run” to execute it.

After execution, you should observe two entries corresponding to the “user 1” activity from the previous tasks.

The capability to analyse audit logs within BigQuery offers significant advantages. This exercise demonstrates just two examples of how audit logs can be queried to extract valuable insights for security monitoring, auditing, and compliance purposes.

Conclusion

This lab highlighted the power and versatility of Google Cloud’s audit logging capabilities. By generating events and exporting logs to BigQuery, we effectively explored how to monitor and analyse administrative activities. Key takeaways include the comprehensive logging provided by Google Cloud for various services and actions, the flexibility of exporting these logs to BigQuery for in-depth analysis, and the efficiency of using BigQuery’s Query editor to filter and identify critical events. Mastering these tools empowers administrators to proactively ensure the security and compliance of their Google Cloud environments. This hands-on experience provides a solid foundation for leveraging audit logs effectively in real-world scenarios.
