Streaming HCP Vault audit logs to Amazon CloudWatch for secure, real-time visibility
Learn how to automatically stream HCP Vault Dedicated audit logs into Amazon CloudWatch for real-time monitoring and compliance.
You can use HCP Vault Dedicated audit log streaming to monitor and audit secure access to sensitive services and systems within your infrastructure. In production environments, every interaction with Vault, from reading secrets to generating dynamic credentials, should be tracked to meet compliance and security requirements.
This post shows how HCP Vault Dedicated audit log streaming can be integrated with Amazon CloudWatch to forward detailed audit events in real time. With this setup, teams gain centralized visibility into Vault operations, can detect unauthorized access patterns, and can meet regulatory and operational audit needs, all without the overhead of managing custom log forwarding pipelines.
» Why stream audit logs to Amazon CloudWatch?
Audit logs in Vault capture every request and response, including the identity making the request, the path being accessed, and the operation performed. Streaming these logs into Amazon CloudWatch helps you:
- Search and filter logs for specific operations (e.g. read, update)
- Detect suspicious behavior or access patterns
- Meet compliance requirements with long-term log storage
- Integrate with CloudWatch alarms for proactive monitoring
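Each audit event is a single JSON document. A trimmed, illustrative example of a response entry (the field values here are hypothetical, and in real audit logs Vault HMAC-hashes sensitive strings) looks like this:

```json
{
  "time": "2025-01-15T10:42:07Z",
  "type": "response",
  "auth": {
    "display_name": "token-admin",
    "policies": ["admin"]
  },
  "request": {
    "operation": "update",
    "path": "secret/data/app/config",
    "remote_address": "10.0.1.25"
  }
}
```

Fields such as `request.operation`, `request.path`, and `auth.display_name` are what you will filter on once the events land in CloudWatch.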
» Setup requirements
To set up audit log streaming from HCP Vault Dedicated to Amazon CloudWatch, you'll need:
- An AWS account with permission to create IAM users and policies
- Access to HashiCorp Cloud Platform (HCP) with the Admin or Contributor role
- An HCP Vault Dedicated cluster (production tier)
» Step 1: Create IAM policy
To allow HCP Vault Dedicated to stream audit logs to Amazon CloudWatch, you need to create an IAM policy that grants access to manage and write to CloudWatch log groups and streams.
This policy allows HCP Vault Dedicated to:
- Create log groups and streams
- Write log events
- Describe existing log resources
- Tag log groups for organization or billing
Log in to the AWS Management Console and navigate to the IAM dashboard.

Click Policies, then click Create policy.

Policy JSON:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "HCPLogStreaming",
      "Effect": "Allow",
      "Action": [
        "logs:PutLogEvents",
        "logs:DescribeLogStreams",
        "logs:DescribeLogGroups",
        "logs:CreateLogStream",
        "logs:CreateLogGroup",
        "logs:TagLogGroup"
      ],
      "Resource": "*"
    }
  ]
}
Choose the JSON tab, paste the policy above, and click Next.

Enter hcp-vault-log-streaming-demo in the Policy name textbox and click Create policy.

You can now see that the policy named hcp-vault-log-streaming-demo has been successfully created.
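If you prefer to script this step, the same policy can be created with the AWS SDK for Python (boto3). This is a minimal sketch, assuming the policy name from this walkthrough and AWS credentials with IAM permissions already configured:

```python
import json

# Policy document from Step 1: lets HCP Vault Dedicated create and
# write to CloudWatch log groups and streams.
POLICY_DOCUMENT = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "HCPLogStreaming",
            "Effect": "Allow",
            "Action": [
                "logs:PutLogEvents",
                "logs:DescribeLogStreams",
                "logs:DescribeLogGroups",
                "logs:CreateLogStream",
                "logs:CreateLogGroup",
                "logs:TagLogGroup",
            ],
            "Resource": "*",
        }
    ],
}

def create_streaming_policy(policy_name="hcp-vault-log-streaming-demo"):
    """Create the IAM policy and return its ARN (requires AWS credentials)."""
    import boto3  # imported here so the sketch loads without the SDK installed
    iam = boto3.client("iam")
    resp = iam.create_policy(
        PolicyName=policy_name,
        PolicyDocument=json.dumps(POLICY_DOCUMENT),
    )
    return resp["Policy"]["Arn"]
```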

» Step 2: Create IAM user
After creating the IAM policy, the next step is to create a dedicated IAM user for HCP Vault Dedicated to use when streaming logs to Amazon CloudWatch.
This user will have programmatic access and be assigned the policy you created.
Recommended practice:
Instead of attaching the policy directly to the user, it's best to create an IAM group, attach the policy to the group, and then add the user to that group. This makes it easier to manage permissions for multiple users in the future.
In the IAM dashboard, go to Users and click Create user.

Enter hcp-vault-log-streaming-user in the User name textbox and click Next.

Under Permissions options, select "Attach policies directly", locate the hcp-vault-log-streaming-demo policy, check the box next to it, and click Next.

Review and click Create user.

You can now see that the user named hcp-vault-log-streaming-user has been successfully created.

Select the hcp-vault-log-streaming-user user from the Users list then go to the Security credentials tab and click Create access key.

Select "Application running outside AWS" as the use case and click Next.

Click Create access key.

Now you can see that the access key has been created. Make sure to securely store both the access key ID and secret access key, as this is the only time they will be shown.
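Step 2 can also be scripted with boto3. The sketch below is illustrative: the user name comes from this walkthrough, while the account ID in the policy ARN is a placeholder you must replace with your own.

```python
def create_streaming_user(
    user_name="hcp-vault-log-streaming-user",
    policy_arn="arn:aws:iam::123456789012:policy/hcp-vault-log-streaming-demo",
):
    """Create the IAM user, attach the Step 1 policy, and return an access key.

    The account ID in policy_arn is a placeholder. Requires AWS credentials
    with IAM permissions.
    """
    import boto3  # imported here so the sketch loads without the SDK installed
    iam = boto3.client("iam")
    iam.create_user(UserName=user_name)
    iam.attach_user_policy(UserName=user_name, PolicyArn=policy_arn)
    key = iam.create_access_key(UserName=user_name)["AccessKey"]
    # Store both values securely -- the secret is shown only once.
    return key["AccessKeyId"], key["SecretAccessKey"]
```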

» Step 3: Create HCP Vault Dedicated cluster
To begin using audit log streaming, you’ll need an active HCP Vault Dedicated cluster with a production-grade tier.
Log in to the HashiCorp Cloud Platform (HCP), navigate to Vault Dedicated, and click on Create cluster.

Choose AWS as the provider, then select the Vault tier and cluster size based on your needs. For this demo, the Vault tier is set to Standard and the cluster size to Small.

Choose the network region where you want to deploy the cluster. This region should match the region where your application services are running to minimize latency.

Enter the Cluster ID as hcp-vault-log-streaming-demo then choose "Start from scratch" and click Create cluster.

The cluster creation process will take approximately 5 to 10 minutes to complete.

In the Overview section you can view the cluster details along with quick actions to connect to the cluster.

» Step 4: Enable audit log streaming
Click on the Vault cluster, then navigate to the Audit logs section under Data streaming.

Click Enable log streaming.

Select Amazon CloudWatch as the provider and click Next to continue.

Enter the access key ID and secret access key that you created earlier, then choose the region where you want to store the CloudWatch log data and click Save to enable audit log streaming.

It may take around 10 to 20 minutes for the configuration to complete and for logs to start appearing in CloudWatch.

Now the audit log stream has been created and you can see the log group and stream name.

» Step 5: Verify logs in CloudWatch
Log in to the AWS Management Console, then go to CloudWatch and select Log groups from the navigation pane.

Click on the log group to view its details, then click the log stream named hcp-vault-hcp-vault-log-streaming-demo to see the audit logs.

Now audit log events will begin streaming from the HCP Vault Dedicated cluster to CloudWatch in real time.
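You can also confirm that events are arriving without leaving the terminal. A minimal boto3 sketch, assuming the log group name shown in the HCP portal (the name below is just the demo value, so substitute your own):

```python
def tail_audit_logs(log_group, minutes=15):
    """Print recent audit events from the given CloudWatch log group."""
    import time
    import boto3  # imported here so the sketch loads without the SDK installed
    logs = boto3.client("logs")
    start = int((time.time() - minutes * 60) * 1000)  # ms since epoch
    resp = logs.filter_log_events(
        logGroupName=log_group,
        startTime=start,
        limit=20,
    )
    for event in resp["events"]:
        print(event["message"])

# Example (use the log group name HCP shows you):
# tail_audit_logs("hcp-vault-hcp-vault-log-streaming-demo")
```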

» Querying and monitoring logs
Navigate to Logs Insights in CloudWatch, select the log group, paste the query below to filter the events, and click Run query. In this example, we're filtering for update operations:
fields @timestamp, request.operation, request.path, auth.display_name
| filter request.operation = "update"
| sort @timestamp desc
| limit 20
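The same query can be run programmatically through the CloudWatch Logs Insights API, which is handy for scheduled checks. A sketch with boto3 (the log group name is again whatever HCP assigned to your cluster):

```python
# The Logs Insights query from above, filtering for update operations.
QUERY = """\
fields @timestamp, request.operation, request.path, auth.display_name
| filter request.operation = "update"
| sort @timestamp desc
| limit 20
"""

def run_insights_query(log_group, hours=1):
    """Run the Logs Insights query and return the result rows."""
    import time
    import boto3  # imported here so the sketch loads without the SDK installed
    logs = boto3.client("logs")
    qid = logs.start_query(
        logGroupName=log_group,
        startTime=int(time.time() - hours * 3600),  # epoch seconds
        endTime=int(time.time()),
        queryString=QUERY,
    )["queryId"]
    # Insights queries run asynchronously; poll until a terminal status.
    while True:
        resp = logs.get_query_results(queryId=qid)
        if resp["status"] in ("Complete", "Failed", "Cancelled"):
            return resp["results"]
        time.sleep(1)
```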

» Final notes
Integrating HCP Vault Dedicated with Amazon CloudWatch for audit log streaming is a powerful way to improve visibility, security, and compliance in your infrastructure. By following a few simple steps, you can stream real-time audit logs directly into CloudWatch and monitor Vault activity using native AWS tools.
This setup allows you to:
- Track and analyze Vault operations
- Detect unusual or unauthorized access
- Simplify audit and compliance reporting
Whether you're running production workloads or experimenting in a demo environment, enabling audit log streaming ensures that every critical Vault action is recorded and accessible when needed.
Need more help building a broader compliance and auditing strategy? Learn more about HashiCorp’s audit and compliance solutions on this page.