Lacework supports the following alert channels for forwarding alerts to your Elastic stack.
Amazon SQS
Send Lacework alerts to Amazon SQS through Amazon CloudWatch, then retrieve the alerts from SQS using a plugin from Elastic.
Create a Lacework Alert Channel for Amazon CloudWatch
Follow the steps described in Amazon CloudWatch to forward alerts from Lacework to CloudWatch.
Follow these steps to add an Amazon SQS queue as an event source (trigger) for your Lambda function. See the AWS documentation for details.
- Open the Functions page of the Lambda console.
- Choose a function.
- Under Add triggers, choose SQS.
- Under Configure triggers, configure the event source.
- In the SQS queue field, specify the source queue.
- In the Batch size field, specify the maximum number of items to read from the queue and send to your function in a single invocation.
- In the Enabled field, leave the checkbox selected to activate the trigger, or clear it to create the event source in a disabled state.
- Choose Add.
- Choose Save.
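The console steps above can also be scripted with the AWS CLI. The following is a minimal sketch; the function name, account ID, queue name, and region are placeholder assumptions:

```shell
# Create an SQS event source mapping for the Lambda function
# (equivalent to adding the SQS trigger in the console).
aws lambda create-event-source-mapping \
  --function-name my-lacework-forwarder \
  --event-source-arn arn:aws:sqs:us-east-1:123456789012:lacework-alerts \
  --batch-size 10 \
  --enabled
```

Use `--no-enabled` instead of `--enabled` to create the event source in a disabled state.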
Configure the Elastic Stack
See the Elastic documentation to configure your Elastic stack to retrieve events from SQS with the sqs input plugin.
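As a sketch, a Logstash pipeline using the sqs input plugin could look like the following; the queue name, AWS region, Elasticsearch host, and index name are placeholder assumptions:

```
input {
  sqs {
    # SQS queue that receives the forwarded Lacework alerts
    queue  => "lacework-alerts"
    region => "us-east-1"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "lacework-alerts"
  }
}
```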
Google Cloud Pub/Sub
Send Lacework alerts to Google Cloud Pub/Sub and then use Google Dataflow to send data to your Elastic stack.
Create a Lacework Alert Channel for Google Cloud Pub/Sub
Follow the steps described in Google Cloud Pub/Sub to send alerts from Lacework to Google Cloud Pub/Sub.
Follow these steps to configure a Dataflow template that sends alerts from Pub/Sub to your Elastic stack.
- Install the Elastic GCP integration from the Kibana web interface.
- In the Google Cloud Console, go to the Dataflow product.
- Click Create job from template.
- Select Pub/Sub to Elasticsearch from the Dataflow template dropdown menu.
- Provide the required parameters:
- Your Cloud ID - Find the Cloud ID in the Elastic Cloud interface.
- Base64-encoded API key for Elasticsearch endpoint - Use the Create API key API to create the API key.
- Type of logs sent via Pub/Sub ... - Add audit.
- Click Run Job and wait for Dataflow to execute the template.
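The same job can be launched from the command line with gcloud. The following is a sketch assuming the public Pub/Sub to Elasticsearch flex template; the project, subscription, topic names, region, and the placeholder Cloud ID and API key values are assumptions you must replace:

```shell
# Run the Pub/Sub to Elasticsearch Dataflow flex template
# (equivalent to "Create job from template" in the console).
gcloud dataflow flex-template run lacework-alerts-to-elastic \
  --template-file-gcs-location gs://dataflow-templates-us-central1/latest/flex/PubSub_to_Elasticsearch \
  --region us-central1 \
  --parameters inputSubscription=projects/my-project/subscriptions/lacework-alerts,\
connectionUrl=<your-elastic-cloud-id>,\
apiKey=<base64-encoded-api-key>,\
dataset=audit,\
errorOutputTopic=projects/my-project/topics/lacework-alerts-errors
```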