Storage-Based Google Cloud Audit Log Integration - Manual Configuration

Important

Starting from September 25, 2023, you cannot create a new Storage-based audit log integration. Lacework recommends that you create a Pub/Sub-based audit log integration instead.

This topic describes how to manually create a Lacework Google Cloud Storage-based audit log integration to gather, process, report, and alert on Google Cloud audit log data.

note

You must create a separate Google Cloud integration for reporting and alerting on Google Cloud configuration compliance. For more information, see Google Cloud Configuration Integration - Manual Configuration. You can, however, use the same credentials for both integrations.

When a Google Cloud Audit Log integration is created, Lacework gathers the logs of administrative operations from Google Cloud audit logs. Google Cloud captures detailed audit data and stores it in audit logs; Lacework aggregates and organizes that data into useful maps and dashboards that illustrate the following:

  • conceptual relationships
  • causes and effects
  • interactions between Google Cloud entities

Lacework also automatically generates alerts whenever an audit log event represents a security risk.

Lacework uses Cloud Logging (formerly Stackdriver) to get the Google Cloud audit logs.
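
If you want to preview the kind of entries the integration consumes, you can query Cloud Logging directly with a filter similar to the one used for the sink later in this procedure. This command is illustrative only and not part of the setup; replace my-project-id with one of your project IDs:

    gcloud logging read 'protoPayload.@type="type.googleapis.com/google.cloud.audit.AuditLog"' --project my-project-id --limit 5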

Requirements

  • Decide if you want to monitor the audit logs at the organization, folder, or project level. If you specify the option to include children when creating the sink at the organization or folder level, all the projects in the organization or the folder export the audit logs to the bucket.
  • If you are already exporting your audit logs to a storage bucket, you can reuse that storage bucket, but the project that contains the storage bucket must be granted role access as described in the procedure below.
  • The procedure below requires that the jq utility is installed. The jq utility is a flexible command-line JSON processor. For more information, see https://stedolan.github.io/jq/.
  • Ensure that you are deploying the integration to a supported Google Cloud region.

Create a Google Cloud Service Account and Grant Access

Follow the procedure in Create a Google Cloud Service Account and Grant Access. You can skip this procedure if you have already created a service account and granted role access when creating a new Google Cloud configuration integration. You can also reuse the same service account.

Enable the Required Google Cloud APIs

When manually creating a Google Cloud Audit Log integration, you must enable APIs for the Google Cloud projects you want to integrate. Follow the procedure in Enable the Required Google Cloud APIs.

Configure Google Cloud for Audit Log Integration

This procedure configures the Google Cloud Audit Log integration to ingest audit log data.

  1. Verify that the jq (command-line JSON processor) utility is available from your command-line shell. Open a command-line window and enter the following command. If the jq utility is found, skip to step 3.

    jq
  2. If the jq utility is not installed or not listed in your PATH, install it (https://stedolan.github.io/jq/) and verify that the path to the utility is listed in your PATH environment variable. The jq utility is required for some of the steps in the following procedure.
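
    To confirm that jq works end to end, you can run a quick sanity check; the JSON here is just a throwaway example, and the command should print ok:

    echo '{"status": "ok"}' | jq -r '.status'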

  3. If you do not have the Google Cloud CLI installed on your system, install it. For more information, see Cloud SDK. As part of the installation process, ensure that you initialize gcloud by running gcloud init. Leave the CLI window open.
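
    If you are unsure whether gcloud is already initialized, the following standard gcloud commands show the active account and configuration (an optional check):

    gcloud auth list
    gcloud config list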

  4. Log in to the Google Cloud Console.

  5. Decide if you want to export the audit logs at the organization, folder, or project level, and pick an entity (an organization, folder, or project) to export audit logs from. The service account must be granted the role access on this entity as described in Create a Google Cloud Service Account and Grant Access; a project can have access either because it was granted access directly or because its parent organization was granted access. Billing must also be enabled for any project that you want to export audit logs from. Note that if you select the option to include children when creating the sink at the organization or folder level, all projects in the organization or folder export their audit logs to the bucket.

  6. Locate and get the ID of the Google Cloud project where the bucket and sink will be created, as described in steps 7-13. In addition, if you decide to export at the project level, this is the project that will export its audit logs.
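
    Steps 7-11 locate the project ID in the Google Cloud Console. Alternatively, if you already know the project's display name, you can look up its ID from the CLI; the project name shown here is a placeholder:

    gcloud projects list --filter='name:"My First Project"' --format='value(projectId)'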

  7. Click the down arrow in the top menu bar.

    gcp_top_menubar.png
    The Select from dialog appears.

  8. From the Select from drop-down, select an organization that contains the Google Cloud project to monitor audit logs.

    gcp_select_org.png

  9. Select the All tab.

    gcp_all_tab.png

  10. Locate the Google Cloud project.

  11. From the ID column, copy the Project ID.

  12. In the CLI window, enter the following text:

    projectName=
  13. Paste the ID of the project that will monitor audit logs and press Enter. In the following example, replace my-first-project-654321 with your project ID.

    projectName=my-first-project-654321
  14. Set the current project by entering the following command. If Updated property [core/project] is not returned, make sure the quotes are straight double-quotes. Use straight quotes for all the CLI commands in this procedure.

    gcloud config set project "$projectName"
  15. Create a storage bucket in the project. Replace mybucketname with your bucket name.

    bucketName=mybucketname
    gsutil mb "gs://$bucketName"
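
    To verify that the bucket was created, you can list it with a standard gsutil command (an optional check):

    gsutil ls -b "gs://$bucketName"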
  16. Set the name of the sink. Replace mysink with your sink name.

    sinkName=mysink
  17. Create the sink at the organization, folder, or project level, depending on where you want to export the audit logs from. If creating the sink at the organization level, append --include-children --organization=myorganizationid to the end of the following command. If creating the sink at the folder level, append --include-children --folder=myfolderid instead. When you specify the --include-children option at the organization or folder level, all the projects in the organization or folder export their audit logs to the bucket.

    gcloud logging sinks create "$sinkName" "storage.googleapis.com/$bucketName" --log-filter '(protoPayload.@type=type.googleapis.com/google.cloud.audit.AuditLog) AND NOT (protoPayload.serviceName="k8s.io") AND NOT (protoPayload.serviceName:"login.googleapis.com") AND NOT (protoPayload.methodName:"storage.objects")'
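
    If you later need to retrieve the sink's service account again, you can describe the sink; add the --organization or --folder flag if you created the sink at that level:

    gcloud logging sinks describe "$sinkName" --format='value(writerIdentity)'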
  18. Copy the service account returned by the previous command. Copy the text between the backticks, as shown in the following example. In the next two steps, you grant this service account the Storage Object Viewer and Storage Object Creator roles on the bucket.

    Please remember to grant `serviceAccount:123567@gcp-sa-logging.iam.gserviceaccount.com` the Storage Object Creator role on the bucket.
  19. Enter sinkServiceAccount=, paste the copied text, and press Enter.

    sinkServiceAccount=serviceAccount:123567@gcp-sa-logging.iam.gserviceaccount.com
  20. Grant bucket read and write permissions to the service account returned by the preceding sink create command.

    gsutil iam ch "$sinkServiceAccount:objectViewer,objectCreator" "gs://$bucketName"
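
    Optionally, confirm that the binding took effect by inspecting the bucket's IAM policy; jq is used here only to trim the output:

    gsutil iam get "gs://$bucketName" | jq '.bindings'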
  21. Create the pubsub topic under the project for audit log file notifications. Replace mytopic with your topic name.

    topicName=mytopic
    gcloud config set project "$projectName"
    gcloud pubsub topics create "$topicName"
  22. If this is a new project, you must grant the Google Cloud-managed service account sufficient privileges to publish to the log integration pubsub topic. This service account is created when a new project is created and has the format service-YourProjectNumber@gs-project-accounts.iam.gserviceaccount.com. The following command derives the account name and grants it the Pub/Sub Publisher role.

    gcloud pubsub topics add-iam-policy-binding "projects/$projectName/topics/$topicName" --member=serviceAccount:service-$(gcloud projects describe $projectName --format='value(projectNumber)')@gs-project-accounts.iam.gserviceaccount.com --role=roles/pubsub.publisher
  23. Configure the storage bucket to send notifications to pubsub. This also configures the topic policy to allow publishing from the bucket.

    gsutil notification create -e OBJECT_FINALIZE -f json -t "projects/$projectName/topics/$topicName" "gs://$bucketName"
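
    To confirm the notification configuration, you can list the notifications on the bucket (an optional check):

    gsutil notification list "gs://$bucketName"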
  24. Create the pubsub subscription (queue) for the audit log file notification topic created in step 21. Replace mytopicsub with your topic subscription name.

    topicSubscriptionName=mytopicsub
    gcloud pubsub subscriptions create "$topicSubscriptionName" --topic "$topicName" --ack-deadline=300
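
    Optionally, verify the subscription and its acknowledgement deadline:

    gcloud pubsub subscriptions describe "$topicSubscriptionName"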
  25. Grant bucket read permissions to the service account that you granted role access in Create a Google Cloud Service Account and Grant Access. This is not the service account returned by the sink create command above. Replace the service account name in the following command with your fully qualified service account name. Note that the string serviceAccount: must precede the service account name.

    serviceAccount=serviceAccount:myserviceaccount@myloggingproject.iam.gserviceaccount.com
    gsutil iam ch "$serviceAccount:objectViewer" "gs://$bucketName"
  26. Grant the integration service account the subscriber role on the subscription. If prompted to install a command group, answer y.

    gcloud pubsub subscriptions add-iam-policy-binding "projects/$projectName/subscriptions/$topicSubscriptionName" --member="$serviceAccount" --role=roles/pubsub.subscriber
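
    To double-check the role binding, you can print the subscription's IAM policy:

    gcloud pubsub subscriptions get-iam-policy "projects/$projectName/subscriptions/$topicSubscriptionName"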
  27. Enter the following command to get the subscription path that is used when creating the integration in the Lacework Console.

    echo "projects/$projectName/subscriptions/$topicSubscriptionName"
  28. Copy the resulting subscription path returned by the echo command for use in the next procedure that creates the integration.

note

If you are setting up new Google audit logging (instead of leveraging existing Google audit logging), Lacework recommends that you set a retention policy with a minimum of 7 days.
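
One way to enforce such a minimum, assuming the bucketName variable from the earlier procedure is still set, is the gsutil retention command. Note that a retention policy prevents objects from being deleted or overwritten before they reach the specified age:

    gsutil retention set 7d "gs://$bucketName"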

Create the Google Cloud Audit Log Integration on the Lacework Console

  1. Log in to the Lacework Console.
  2. Go to Settings > Integrations > Cloud accounts.
  3. Click + Add New.
  4. Click Google Cloud Platform and select Manual configuration.
  5. Click Next.
  6. Select Audit Log (Storage) and follow the steps in Create an Audit Log Integration.

Create an Audit Log Integration

  1. Create a Google Cloud service account and grant access.
  2. Enable the required Google Cloud APIs.
  3. Configure Google Cloud for Audit Log integration.
  4. For Name, enter a unique name that displays in the Lacework Console.
  5. Follow the steps to either upload GCP credentials or enter information manually.

When creating the Google Cloud integration, you can either upload Google Cloud credentials or enter all information manually. Finish creating the integration in the Lacework Console by following the steps described in one of the following sections.

Upload Google Cloud Credentials

To upload Google Cloud credentials, follow these steps:

  1. For Upload GCP Credential, click Choose File and navigate to the JSON key file downloaded when you created the Google Cloud service account.
    This populates the credential fields.

  2. For Integration Level, select Organization if integrating at the organization level or Project if integrating at the project level.

  3. For Org/Project ID, paste in the appropriate ID value for your integration type:

    • If integrating at the project level, copy and paste the value of the project_id property from the JSON file.

    • If integrating at the organization level, log in to the Google Cloud console. Click the down arrow in the top menu bar. From the Select from drop-down, select an organization that contains the Google Cloud project(s) that you want the integration to monitor. Select IAM & admin > Settings and copy and paste the number from the Organization ID field.

      gcp_select_org.png

  4. For Subscription Name, paste the subscription path that was copied in an earlier step. The subscription path is in the following format:

    projects/$projectName/subscriptions/$topicSubscriptionName
  5. Click Save to finish the Google Cloud integration and save your onboarding progress. The integration appears in the list of cloud accounts under Cloud accounts.

If the integration shows an Integration Pending status, hover over the status text and click the refresh icon to fetch the status again. This refreshes the displayed status only; it does not retest the integration.

Enter Information Manually

To manually enter Google Cloud credentials when adding a Google Cloud Audit Log, follow these steps:

  1. Verify that the jq (command-line JSON processor) utility is available from your command-line shell. Open a command-line window and enter the following command. Leave this command-line window open.

    jq
  2. If the jq utility is found, skip to the next step. If the jq utility is not installed or not listed in your PATH, install it (https://stedolan.github.io/jq/) and verify that the path to the utility is listed in your PATH environment variable. The jq utility is required for some steps in the following procedure.

  3. Locate the JSON file you downloaded when you created the Google Cloud service account key.

  4. Open the file in an editor and leave it open.

  5. Copy the value of the client_id property from the JSON file and paste the value into the Client ID field of the Lacework Console.

  6. Copy the value of the private_key_id property from the JSON file and paste the value into the Private Key ID field of the Lacework Console.

  7. Copy the value of the client_email property from the JSON file and paste the value into the Client Email field of the Lacework Console.

  8. Exit the editor.

  9. You cannot simply copy the private key from the editor because of an issue with copying the newline characters. You must copy a raw version of the key using the jq utility, as described in the next steps.

  10. To view the private key raw text, enter the following command, where YourFileName.json is the name of the JSON file you downloaded when you created the Google Cloud service account key.

    cat YourFileName.json | jq -r '.private_key'
  11. Copy all text displayed in the output, including the BEGIN and END lines.

    -----BEGIN PRIVATE KEY-----
    YourKeyInfo
    -----END PRIVATE KEY-----
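
    Alternatively, on systems with a clipboard utility, you can pipe the key directly to the clipboard instead of copying it manually; pbcopy is the macOS utility and xclip (if installed) is a common Linux equivalent:

    jq -r '.private_key' YourFileName.json | pbcopy
    jq -r '.private_key' YourFileName.json | xclip -selection clipboard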
  12. Paste the text into the Private Key field of the Lacework Console.

  13. From the Integration Level drop-down, select Organization if integrating at the organization level or Project if integrating at the project level.

  14. Copy the appropriate value for your integration type as described in the next two steps.

  15. If integrating at the project level, copy the value of the project_id property from the JSON file into the Org/Project ID field of the Lacework Console.

  16. If integrating at the organization level, log in to the Google Cloud console. Click the down arrow in the top menu bar. From the Select from drop-down, select an organization that contains the Google Cloud project(s) that you want the integration to monitor. Select IAM & admin > Settings and copy the number from the Organization ID field and paste the value into the Org/Project ID field of the Lacework Console.

    gcp_select_org.png

  17. Paste the subscription path that was copied in an earlier step into the Subscription Name field. The subscription path is in the following format:

    projects/$projectName/subscriptions/$topicSubscriptionName
  18. Click Save to finish the Google Cloud integration and save your onboarding progress. The integration appears in the list of cloud accounts under Cloud accounts.

If the integration shows an Integration Pending status, hover over the status text and click the refresh icon to fetch the status again. This refreshes the displayed status only; it does not retest the integration.