S3 Log Destination
Common Fate can send audit log events to an Amazon S3 bucket. You can use this to keep an archive of actions taken within Common Fate, or to build your own reports based on access activity within Common Fate.
Logs are written in batches in JSONL format and can optionally be compressed.
Prerequisites
To configure an S3 log destination you'll need to use version 2.18 or later of our Terraform Provider.
If you're running a BYOC ("Bring-Your-Own-Cloud") deployment of Common Fate in your own AWS account, you'll need to be on v1.43.1 or later of the `common-fate/common-fate-deployment/aws` Terraform module.
Additionally, you’ll need to deploy an S3 bucket and an AWS IAM role for Common Fate to assume in order to write events to the S3 bucket. The role must meet the following requirements:
- The role should have a policy allowing the `s3:GetBucketLocation` action on the bucket, and `s3:PutObject` on the bucket contents.
- The role must have the tag `common-fate-allow-assume-role` with value `true`.
- The role must have a trust relationship allowing the Common Fate AWS account to assume the role. We recommend setting an External ID condition on the trust relationship to match the `assume_role_external_id` parameter in your Common Fate deployment.
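The requirements above can be sketched in Terraform roughly as follows. This is an illustrative sketch, not our published module: the resource names, bucket name, account ID, and external ID are all placeholders you'd substitute with your own values.

```hcl
# Illustrative sketch of a bucket and role meeting the requirements above.
# The bucket name, Common Fate AWS account ID, and external ID are
# placeholders -- substitute the values for your own deployment.

resource "aws_s3_bucket" "audit_logs" {
  bucket = "example-common-fate-audit-logs"
}

data "aws_iam_policy_document" "assume" {
  statement {
    actions = ["sts:AssumeRole"]
    principals {
      type        = "AWS"
      identifiers = ["arn:aws:iam::111111111111:root"] # placeholder: the Common Fate AWS account
    }
    # External ID condition, matching the assume_role_external_id
    # parameter in your Common Fate deployment.
    condition {
      test     = "StringEquals"
      variable = "sts:ExternalId"
      values   = ["your-assume-role-external-id"]
    }
  }
}

resource "aws_iam_role" "audit_log_writer" {
  name               = "common-fate-audit-log-writer"
  assume_role_policy = data.aws_iam_policy_document.assume.json

  # Required tag allowing Common Fate to assume the role.
  tags = {
    common-fate-allow-assume-role = "true"
  }
}

data "aws_iam_policy_document" "write_logs" {
  # GetBucketLocation on the bucket itself.
  statement {
    actions   = ["s3:GetBucketLocation"]
    resources = [aws_s3_bucket.audit_logs.arn]
  }
  # PutObject on the bucket contents.
  statement {
    actions   = ["s3:PutObject"]
    resources = ["${aws_s3_bucket.audit_logs.arn}/*"]
  }
}

resource "aws_iam_role_policy" "write_logs" {
  name   = "write-audit-logs"
  role   = aws_iam_role.audit_log_writer.id
  policy = data.aws_iam_policy_document.write_logs.json
}
```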
You can use our `audit-log-bucket` Terraform module as a starting point to provision the required S3 bucket and IAM role:
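As a sketch, consuming the module might look like the following. The module source and input names shown here are assumptions for illustration; refer to the module's own documentation for the published source, version, and variables.

```hcl
module "audit_log_bucket" {
  # Placeholder source: check the Common Fate documentation for the
  # published source and current version of the audit-log-bucket module.
  source = "./modules/audit-log-bucket"

  # Hypothetical inputs: the external ID the role's trust policy should
  # require, matching your deployment's assume_role_external_id.
  # assume_role_external_id = var.assume_role_external_id
}
```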
Setting up
You can configure an S3 log destination by adding a Terraform resource similar to the below to your Common Fate application configuration:
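A minimal sketch of such a resource is shown below. The attribute names here are illustrative assumptions rather than the confirmed provider schema, so check the provider documentation for the exact arguments.

```hcl
# Illustrative only: attribute names are assumptions, not the confirmed
# schema -- refer to the commonfate provider documentation.
resource "commonfate_s3_log_destination" "example" {
  # The bucket the role is permitted to write to.
  bucket_name = "example-common-fate-audit-logs"

  # The IAM role Common Fate assumes to write log batches.
  role_arn = "arn:aws:iam::222222222222:role/common-fate-audit-log-writer"
}
```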
For details on the available variables in the `commonfate_s3_log_destination` resource, refer to our Terraform provider documentation.