Automated export of CloudWatch logs to S3 / Blogs / Perficient (2023)

Written by Gerald Frilot. Posted by Tony Harper.

AWS CloudWatch is a unified monitoring service for AWS services and your cloud applications. Using AWS CloudWatch, you can:

  • monitor your AWS account and resources
  • generate a stream of events
  • trigger alarms and actions for specific conditions
  • manually export CloudWatch log groups to an Amazon S3 bucket

Exporting data to an S3 bucket is an important process if your organization needs to retain CloudWatch data beyond the configured retention period. When the retention period expires, log events are permanently deleted. Manual export reduces the risk of data loss, but a major drawback, as noted in the AWS documentation, is that each AWS account can only run one export task at a time. This is workable if you only want to export a few log groups, but it becomes very time-consuming and error-prone if you frequently need to export more than 100 log groups by hand.
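To make the manual process concrete, a single export task can be started with the boto3 SDK; this is a minimal sketch, and the 24-hour window and `export_log_group` helper name are illustrative assumptions, not part of the article:

```python
from datetime import datetime, timezone

def to_epoch_millis(dt):
    """CloudWatch export tasks expect 'from'/'to' as milliseconds since the epoch."""
    return int(dt.timestamp() * 1000)

def export_log_group(log_group, bucket, prefix):
    """Start one export task; CloudWatch allows only one active task per account."""
    import boto3  # imported lazily so to_epoch_millis() works without the AWS SDK
    logs = boto3.client("logs")
    now = datetime.now(timezone.utc)
    return logs.create_export_task(
        logGroupName=log_group,
        fromTime=to_epoch_millis(now) - 24 * 60 * 60 * 1000,  # last 24 hours
        to=to_epoch_millis(now),
        destination=bucket,
        destinationPrefix=prefix,
    )
```

Because only one task may run at a time, calling this in a loop over 100+ log groups would fail unless each task is polled to completion first, which is exactly the pain point the automation below removes.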

Let's walk through a step-by-step solution that automates exporting large numbers of log groups to an S3 bucket, using a Lambda function triggered by CloudWatch events. You can use an existing S3 bucket or create a new one.

Amazon Simple Storage Service (S3)

Log in to your AWS account, search for the Amazon S3 service, and follow these steps to create the bucket:

  1. Choose a meaningful name
  2. Select an AWS region
  3. Keep all defaults
    1. ACLs disabled (recommended)
    2. Block all public access (enabled)
    3. Bucket Versioning (disabled)
    4. Default encryption (disabled)
  4. Select Create bucket (this creates the new S3 bucket)


Once the bucket is created, navigate to the Permissions tab:


Update the bucket policy to allow CloudWatch to store objects in the S3 bucket. Use the following policy to complete this process:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "logs.YOUR-REGION.amazonaws.com"
      },
      "Action": "s3:GetBucketAcl",
      "Resource": "arn:aws:s3:::BUCKET_NAME_HERE"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "logs.YOUR-REGION.amazonaws.com"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::BUCKET_NAME_HERE/*",
      "Condition": {
        "StringEquals": {
          "s3:x-amz-acl": "bucket-owner-full-control"
        }
      }
    }
  ]
}

AWS Lambda

The S3 bucket is now configured to accept objects written by our CloudWatch service. Our next step is to create a Lambda function that hosts the source code to receive events from CloudWatch and store them in our S3 bucket.

Search for the Lambda service in your AWS account, navigate to Functions, and select Create function.


Follow these steps:

  1. Select Author from scratch


  2. Under Basic information, provide:
    1. Function name
    2. Runtime (Python 3.9)
    3. Instruction set architecture (x86_64, the default)


  3. Keep the default settings in the execution role and advanced settings drop-down menus and select Create function


Python script (pseudocode)

The Python script imports the boto3 AWS SDK module to create, configure, and manage AWS services, along with the os and time modules. We instantiate a CloudWatch Logs client and an AWS Systems Manager Parameter Store client. Inside the Lambda handler, we initialize an empty object and two empty arrays. The object can be used to target a specific log group name prefix.

The first array holds all log groups, and the second determines which log groups to export. Next, we check that the S3 bucket environment variable exists and return an error if it does not. Then we enter a series of loops. The first loop calls the DescribeLogGroups API and appends the results to the first array. Once all log groups have been collected, the second loop looks for the ExportToS3 tag on each entry in the initial array; when the tag exists, we add that log group to the export array.

The last loop iterates over the export array and uses the log group name as a prefix to search the Parameter Store. If a match is found, we compare the stored timestamp with the current time. When more than 15 minutes have passed, we export the data to the S3 bucket and update the Parameter Store value with the current time.
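The pseudocode above can be sketched in Python roughly as follows. This is a hedged sketch, not the article's exact script: the helper name `should_export` and the parameter prefix `/log-exporter-last-export` are assumptions, and the single-export-task-per-account limit is not handled here:

```python
import os
import time

EXPORT_INTERVAL = 15 * 60  # only export a given log group every 15 minutes

def should_export(last_export, now, interval=EXPORT_INTERVAL):
    """Pure helper: True when at least `interval` seconds have passed."""
    return (now - last_export) >= interval

def lambda_handler(event, context):
    import boto3  # imported lazily so should_export() is testable without the AWS SDK
    logs = boto3.client("logs")
    ssm = boto3.client("ssm")

    bucket = os.environ.get("S3_BUCKET")
    if not bucket:
        raise ValueError("S3_BUCKET environment variable is not set")

    extra_args = {}        # e.g. {"logGroupNamePrefix": "/aws/lambda"} to narrow the scan
    all_groups = []        # first array: every log group in the account
    groups_to_export = []  # second array: only groups tagged ExportToS3=true

    # Loop 1: page through DescribeLogGroups and collect every log group.
    while True:
        resp = logs.describe_log_groups(**extra_args)
        all_groups.extend(resp["logGroups"])
        if "nextToken" not in resp:
            break
        extra_args["nextToken"] = resp["nextToken"]

    # Loop 2: keep only the log groups tagged ExportToS3=true.
    for group in all_groups:
        tags = logs.list_tags_log_group(logGroupName=group["logGroupName"])["tags"]
        if tags.get("ExportToS3") == "true":
            groups_to_export.append(group["logGroupName"])

    # Loop 3: export each tagged group at most once per 15 minutes.
    for name in groups_to_export:
        param = "/log-exporter-last-export" + name  # assumed naming convention
        try:
            last = int(ssm.get_parameter(Name=param)["Parameter"]["Value"])
        except ssm.exceptions.ParameterNotFound:
            last = 0  # first run: the parameter does not exist yet
        now = int(time.time())
        if not should_export(last, now):
            continue
        logs.create_export_task(
            logGroupName=name,
            fromTime=last * 1000,  # export tasks take epoch milliseconds
            to=now * 1000,
            destination=bucket,
            destinationPrefix=name.strip("/"),
        )
        ssm.put_parameter(Name=param, Value=str(now), Type="String", Overwrite=True)
```

Since CloudWatch only allows one active export task per account, a production version would poll DescribeExportTasks between iterations of the last loop; that is omitted here for brevity.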

  1. Select Deploy to save our code changes and then navigate to the Configuration tab


  2. Now we need to create an environment variable that points to the S3 bucket where our CloudWatch events are stored


Note: the key must be set to S3_BUCKET and the value must be set to your S3 bucket name. This is referenced in the Lambda code and must be set before enabling the trigger.

  3. Our next step is to update the Lambda execution role's policy. This allows our Lambda to perform read/update operations on the individual AWS services involved. Use the following policy to complete the process:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "logs:ListTagsLogGroup",
        "logs:DescribeLogGroups",
        "logs:CreateLogGroup",
        "logs:CreateExportTask",
        "ssm:GetParameter",
        "ssm:PutParameter"
      ],
      "Resource": "arn:aws:logs:{your region}:{your aws account number}:*"
    },
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": [
        "logs:ListTagsLogGroup",
        "logs:CreateLogStream",
        "logs:DescribeLogGroups",
        "logs:PutLogEvents",
        "logs:CreateExportTask",
        "ssm:GetParameter",
        "ssm:PutParameter",
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": "arn:aws:logs:{your region}:{your aws account number}:log-group:/aws/lambda/{function name}:*"
    },
    {
      "Sid": "VisualEditor2",
      "Effect": "Allow",
      "Action": "ssm:DescribeParameters",
      "Resource": "*"
    },
    {
      "Sid": "VisualEditor3",
      "Effect": "Allow",
      "Action": [
        "ssm:GetParameter",
        "ssm:PutParameter"
      ],
      "Resource": "arn:aws:ssm:{your region}:{your aws account number}:parameter/log-exporter-*"
    },
    {
      "Sid": "VisualEditor4",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:GetObject",
        "s3:GetObjectAcl",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::{bucket name}",
        "arn:aws:s3:::{bucket name}/*"
      ]
    }
  ]
}

AWS Parameter Store

Now that the S3 bucket and Lambda are fully configured, we can turn to the AWS service called Parameter Store, which provides secure, hierarchical storage for configuration data and secrets. This section is for reference only, as our Lambda code takes care of the initial setup and naming conventions for this service. When a CloudWatch event is triggered, our code checks the Parameter Store to determine whether 15 minutes have passed since we last stored data in our S3 bucket. The first call sets the parameter value to 0, and each subsequent event checks and updates this value against the recurring 15-minute threshold. Data is never overwritten, and the initial setup runs with no user intervention.
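The read-check-write cycle against Parameter Store can be sketched as below; the helper names and the treatment of a missing parameter as 0 (first run) follow the behavior described above, but the exact function names are assumptions:

```python
def minutes_since(last_epoch, now_epoch):
    """Pure helper: whole minutes elapsed between two epoch timestamps (seconds)."""
    return int((now_epoch - last_epoch) // 60)

def last_export_time(ssm, param_name):
    """Read the stored timestamp, treating a missing parameter as 0 (first run)."""
    try:
        return int(ssm.get_parameter(Name=param_name)["Parameter"]["Value"])
    except ssm.exceptions.ParameterNotFound:
        return 0

def record_export_time(ssm, param_name, now_epoch):
    """Persist the current timestamp so the next invocation can compare against it."""
    ssm.put_parameter(Name=param_name, Value=str(now_epoch),
                      Type="String", Overwrite=True)
```

The `ssm` argument is a boto3 `ssm` client; passing it in keeps the helpers easy to exercise in isolation.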

Lambda Triggers

Let's go back to our Lambda function and make a final update in the Configuration > Triggers tab.


  1. Select Add trigger
  2. Fill in the fields below and select Add
    1. CloudWatch Logs (click the drop-down menu and select the appropriate service)
    2. Log group
    3. Filter name


  3. Repeat steps 1 and 2 for each log group that should be exported to S3.

Note: the previous step and the next step are performed in this order to avoid writing data to the S3 bucket in a live environment.

Tagging CloudWatch Log Groups

Our code only exports log groups that carry a specific tag, and tagging can only be performed from a terminal. Refer to the AWS CLI documentation for more information on configuring command-line access to your AWS environment. Once command-line access is set up, we can tag each log group that should be exported. Use the following command to complete this process:

aws --region us-west-2 logs tag-log-group --log-group-name /api/aws/connect --tags ExportToS3=true
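Running that command once per log group gets tedious past a handful of groups. A hedged sketch of bulk-tagging every log group under a prefix with boto3 (the prefix `/api/aws` mirrors the CLI example above; `tag_groups_for_export` is an illustrative name, not from the article):

```python
def matching_groups(names, prefix):
    """Pure helper: keep only log group names under the given prefix."""
    return [n for n in names if n.startswith(prefix)]

def tag_groups_for_export(prefix):
    import boto3  # imported lazily so matching_groups() is testable without the AWS SDK
    logs = boto3.client("logs")
    names = []
    # Page through every log group in the account.
    for page in logs.get_paginator("describe_log_groups").paginate():
        names.extend(g["logGroupName"] for g in page["logGroups"])
    for name in matching_groups(names, prefix):
        # Equivalent to: aws logs tag-log-group --log-group-name <name> --tags ExportToS3=true
        logs.tag_log_group(logGroupName=name, tags={"ExportToS3": "true"})
```

Calling `tag_groups_for_export("/api/aws")` would tag every log group under that prefix in one pass.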

We are now configured to automatically export CloudWatch log groups to our S3 bucket!

AWS solution delivered

We choose AWS services for their flexibility and ability to deliver results to market in a timely manner. By focusing our attention on the AWS Cloud, we were able to efficiently export data from CloudWatch to an S3 bucket in an event-driven way.

Contact us

Perficient is an APN Advanced Consulting Partner for Amazon Connect, giving us a unique set of skills to accelerate your cloud, agent, and customer experience.

Perficient prides itself on our personalized approach to the customer journey, helping enterprise customers transform and modernize their contact center and CRM experience with platforms like Amazon Connect. For more information on how Perficient can help you get the most out of Amazon Lex, contact us here.

