Collecting CloudTrail Logs
This page contains the instructions required to enable the BluSapphire Log-Agent to collect log data from Amazon S3 (Amazon Simple Storage Service).
This document describes how to use the BluSapphire Log-Agent and its Amazon S3 input to collect log files from S3 buckets via SQS notifications, covering AWS services that store logs in Amazon S3, including but not limited to VPC Flow Logs, AWS CloudTrail logs, and Elastic Load Balancer access logs.
The BluSapphire Log-Agent can retrieve events from files stored in an S3 bucket and ship them to the BluSapphire data lake. This relies on Amazon Simple Queue Service (SQS) to receive an Amazon S3 notification whenever a new S3 object is created. The S3 input within the agent polls SQS for new messages about objects created in S3 and uses the information in those messages to retrieve the log files from the S3 bucket. This approach provides near real-time data collection from S3 buckets with both speed and reliability.
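For context, the notification that Amazon S3 delivers to the SQS queue is a JSON message; the agent reads the bucket name and object key from this message to fetch the corresponding log file. The following is an abbreviated, illustrative example of such a message (all values are placeholders):

```json
{
  "Records": [
    {
      "eventVersion": "2.1",
      "eventSource": "aws:s3",
      "awsRegion": "us-east-1",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "example-cloudtrail-bucket" },
        "object": { "key": "AWSLogs/123456789012/CloudTrail/us-east-1/2024/01/01/example.json.gz" }
      }
    }
  ]
}
```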
An AWS account with sufficient access to create resources in Amazon S3, Amazon SQS, and AWS IAM.
The Amazon SQS queue must be in the same region as the Amazon S3 bucket that the queue is collecting from.
AWS services must already be configured to send logs to an S3 bucket, and the logs must be written to their respective buckets. If not, please follow the documentation below to configure Amazon S3 as a logging destination:
AWS CloudTrail records activities that occur in your AWS account as events, including actions taken in the AWS Management Console, AWS Command Line Interface, and AWS SDKs and APIs by a user, role, or an AWS service. AWS CloudTrail uses Amazon S3 to store log files by default; see the instructions on creating a trail to enable CloudTrail to deliver log files to a new or existing Amazon S3 bucket.
VPC Flow Logs is a feature that enables you to capture information about the IP traffic going to and from network interfaces in your VPC. Flow log data can be published to Amazon CloudWatch Logs or Amazon S3; see the instructions for publishing VPC flow log data to Amazon S3.
Amazon S3 buckets are created automatically while configuring the respective AWS services mentioned in the prerequisites section of this page (AWS CloudTrail, VPC Flow Logs, other services).
If you want to ship log data from other custom S3 buckets, you may have to create them manually as per your requirements.
1) From the AWS Management Console, go to the Simple Queue Service (SQS) management console via the Services menu, or open the Amazon SQS console at https://console.aws.amazon.com/sqs/.
2) Select the correct region from the upper right of the window. Note: the SQS queue must be created in the same region as the Amazon S3 bucket.
3) Choose Create queue. On the Create queue page, select the "Standard" queue type (the queue type can't be changed after the queue is created) and enter a name for the queue.
4) Change the Visibility timeout queue parameter to 60 seconds and leave the remaining parameters at their defaults (an API-equivalent sketch of this setting follows these steps).
5) Scroll to the bottom and choose Create queue. Amazon SQS creates the queue and displays the queue's Details page. Note: It may take a while before the queue is displayed on the Queues page.
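For reference, if the queue is created through the SQS CreateQueue API rather than the console, the visibility timeout maps to the VisibilityTimeout attribute (value in seconds). The sketch below is only illustrative; the queue name is a placeholder:

```json
{
  "QueueName": "MySQSQueueName",
  "Attributes": {
    "VisibilityTimeout": "60"
  }
}
```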
1) Navigate to the SQS Management Console and select the queue you created earlier for notifications.
2) From the queue's properties window, copy the ARN field value, for example: arn:aws:sqs:us-east-1:123456789012:MySQSQueueName
3) Move to the "Access policy" tab inside the queue's properties window, then click Edit to configure the SQS queue access policy permissions.
4) Scroll down to the "Access policy" section, paste the JSON policy content into the Edit Policy Document window (a representative sketch is shown after these steps), and replace the "SQS-Queue-ARN" and "Bucket-Name" placeholders with the appropriate values.
5) Continue to review the policy, ensure that the data is correct, and then click Save Changes.
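The exact policy content may differ in your environment; the following is a minimal sketch of an SQS access policy that allows Amazon S3 to send event notifications to the queue. Replace <SQS-Queue-ARN> and <Bucket-Name> with your queue ARN and bucket name:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowS3ToSendEventNotifications",
      "Effect": "Allow",
      "Principal": { "Service": "s3.amazonaws.com" },
      "Action": "sqs:SendMessage",
      "Resource": "<SQS-Queue-ARN>",
      "Condition": {
        "ArnLike": { "aws:SourceArn": "arn:aws:s3:::<Bucket-Name>" }
      }
    }
  ]
}
```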
1) Navigate to S3 Management Console, select the S3 bucket created/generated for the AWS CloudTrail service and move to the properties tab.
2) Scroll down to the “Event Notifications” section, and click on "Create event notification" to configure parameters for the new event.
3) Provide an event name and add a prefix/suffix for the paths you want to collect data from.
4) Choose “All object create events” to add a notification requesting Amazon S3 to publish events of the s3:ObjectCreated:* type to the Amazon SQS queue.
5) Select the destination as SQS queue and choose a specific SQS queue from the list, or provide the respective SQS queue ARN (available in the details of the queue); a JSON sketch of the resulting notification configuration follows these steps.
6) To view available messages for a specific Queue, select the queue and click on “send and receive messages”.
7) Scroll down to the "Receive messages" section and use "Poll for messages" to get the available message records.
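For reference, if the event notification were applied through the S3 PutBucketNotificationConfiguration API instead of the console, the configuration would resemble the following sketch; the ID, queue ARN, and prefix value are placeholders:

```json
{
  "QueueConfigurations": [
    {
      "Id": "cloudtrail-object-created",
      "QueueArn": "<SQS-Queue-ARN>",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {
        "Key": {
          "FilterRules": [
            { "Name": "prefix", "Value": "AWSLogs/" }
          ]
        }
      }
    }
  ]
}
```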
Note: The configuration steps documented here are for the AWS CloudTrail service. For other AWS services that use Amazon S3 as storage for log data, such as VPC Flow Logs, complete "Step 01" and then repeat "Step 02: Create Amazon SQS Queue", "Step 03: Configure permissions for SQS queue", and "Step 04: Creating AWS S3 Event Notification" for the respective service.
An AWS IAM policy is an entity that defines permissions for objects within your AWS environment. A customized IAM policy with the required set of permissions is needed so that the BluSapphire Log Shipper can make the API calls that access and process log data from AWS S3.
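The exact permissions should be tailored to your environment; the following is a representative sketch of an IAM policy granting the SQS and S3 permissions typically needed to receive queue notifications and read the log objects. <AWS_SQS-ARN> and <Bucket-Name> are placeholders for the SQS queue ARN created earlier and the S3 bucket holding the logs:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadNotificationsFromSQS",
      "Effect": "Allow",
      "Action": [
        "sqs:ReceiveMessage",
        "sqs:DeleteMessage",
        "sqs:ChangeMessageVisibility"
      ],
      "Resource": "<AWS_SQS-ARN>"
    },
    {
      "Sid": "ReadLogObjectsFromS3",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::<Bucket-Name>",
        "arn:aws:s3:::<Bucket-Name>/*"
      ]
    }
  ]
}
```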
1) To create a new IAM policy, from the AWS Management Console go to the IAM management console via the Services menu -> navigate to "Policies" under "Access Management" in the left side menu and click on "Create policy".
2) Switch to the JSON tab of the policy visual editor, paste the policy JSON shown above, replace "<AWS_SQS-ARN>" with the ARN of the respective SQS queue created earlier, and continue to the next step.
3) Add any required tags and continue. On the review screen, provide the policy name and description, and click on Create policy.
It's recommended to create a new user with the required permissions and policies for collecting logs from AWS.
1) To create a new IAM user, from AWS Management Console - go to the IAM management console via the services menu -> navigate to “Users” under “Access Management” from the left side menu, and click on “Add users".
2) Provide the appropriate username, and select “Access key” for AWS credential type.
3) Choose to attach existing policies directly, search and select the IAM policy that was created earlier, and continue.
4) Review the details and permissions for the new user and click Create user. Download the CSV file and make a note of the "Access key ID" and "Secret access key", as these are needed while configuring the BluSapphire Log-Agent.
1) To create a new IAM role, from AWS Management Console - go to the IAM management console via the services menu -> navigate to “Roles” under “Access Management” from the left side menu and click on “Create role".
2) Choose "Custom trust policy" as the trusted entity type in step 1, paste the trust policy JSON (a sketch is provided after these steps), and continue.
3) Next, add permissions: search for the previously created policy, select it, and continue.
4) Provide a role name and an appropriate description, review the permissions for the new role, and continue.
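The trust relationship depends on how the role will be assumed; a minimal sketch of a trust policy allowing the IAM user created earlier to assume this role is shown below, where <IAM-User-ARN> is a placeholder for that user's ARN:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "<IAM-User-ARN>" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```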
The following information is required to configure the BluSapphire agent for collecting and processing log data from AWS S3:
| Required AWS Information | Description |
| --- | --- |
| Access key ID | AWS access key of the IAM user created earlier |
| Secret access key | AWS secret access key of the IAM user created earlier |
| Role ARN (role_arn) | ARN of the role to assume; available in the summary section of the role created earlier |
| Queue URL (queue_url) | URL of the AWS SQS queue that messages will be received from. Note: the SQS queue differs per service; the URL can be found in the details section of the respective queue. |
Note: Once all the above configuration steps are performed and the BluSapphire agent is configured, it may take some time for the logs to populate in the BluSapphire portal.