Integrate Your AWS S3 Data Source with the Security Data Lake
- Updated on 21 Mar 2025
Any external data source that can be configured to write logs to an AWS S3 bucket can forward data to the Red Canary Security Data Lake. Data forwarded in this way can be stored in and exported from the Security Data Lake, but it cannot be queried via the Search page.
By integrating your security logs with the Red Canary Security Data Lake, you can meet data retention requirements, export logs when needed for investigation or reporting, and ensure greater visibility into your security infrastructure for your team and Red Canary. To integrate an external data source with Red Canary through AWS S3, follow the procedure below from beginning to end.
Step 1: Red Canary – Add a new data lake integration
From your Red Canary dashboard, navigate to Integrations, click the split button to the right of Add Integration, and then click Add Data Lake Integration.
Next to Add Integration, enter a name for your integration.
Choose how Red Canary will receive this data:
Under Ingest Format / Method, select Data Source via S3 (Security Data Lake).
Click the Next button.
Configure Red Canary to retrieve data from this integration:
Click the Provision button.
This saves and activates your integration. If provisioning succeeds, a “User provisioned successfully” notification appears.
Under Set up log forwarding in your external log source, you will find an S3 Folder URL, AWS Access ID, and AWS Secret Key for sending data to Red Canary. Copy and save these values; you will use them in a later step.
The AWS Access ID and AWS Secret Key can take time to generate. If either value displays “(pending…)”, wait about 10 minutes and reload the page.
Click the Next button.
Customize how data from this integration is handled:
Specify your desired data retention period in days.
Click Save in the bottom right corner.
Step 2: External Data Source – Configure log forwarding
From your external data source, set up log forwarding using the S3 Folder URL, AWS Access ID, and AWS Secret Key values noted in the previous section.
Ensure that the data source is configured to emit logs in a line-delimited format, such as newline-delimited JSON, CSV, TSV, CEF, or CLF.
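To illustrate what “line-delimited” means in practice, here is a minimal Python sketch that serializes events as newline-delimited JSON, one complete JSON object per line with no enclosing array. The event fields are hypothetical placeholders; your external log source will produce its own schema and handle the actual upload to the provisioned S3 folder.

```python
import json

# Hypothetical example events from an external log source.
events = [
    {"timestamp": "2025-03-21T12:00:00Z", "source": "firewall", "action": "deny"},
    {"timestamp": "2025-03-21T12:00:05Z", "source": "firewall", "action": "allow"},
]

# Newline-delimited JSON: one JSON object per line, no wrapping array or commas
# between records. Each line must parse as a standalone JSON document.
ndjson_body = "\n".join(json.dumps(event) for event in events) + "\n"

# A log forwarder would upload this body as an object under the S3 Folder URL
# from Step 1, authenticating with the AWS Access ID and AWS Secret Key.
print(ndjson_body)
```

A payload like this (or any of the other line-delimited formats above) can be ingested record by record; a single multi-line JSON document wrapping all events in one array cannot.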