
Preserve application logs to external storage or AWS S3


If you autoscale your servers at regular intervals, shutting down or terminating old servers and creating new ones so that fresh servers are always up and running, then you may want to preserve files from those servers, such as application logs and server logs.

There are many ways to do this by integrating Loggly, Datadog, AWS CloudWatch, or other third-party tools, where an agent installed on the server keeps pushing the specified data to the tool's storage at regular intervals so you can manage it there.

There is another simple and easy way: just back up the files/logs you want to keep to S3, which is what I describe below.

Create a shell script file named preserve_logs.sh with the code below. (You will need the AWS CLI set up on the server, and an S3 policy/role attached, for the command to work.)

#!/bin/bash

# Instance ID of the EC2 instance the script runs on (ec2metadata comes from
# the cloud-utils package on Ubuntu). You can set this variable to anything
# that uniquely identifies the server.
instance_id="$(ec2metadata --instance-id)"

# Copy today's application log to S3, assuming log files are created daily
# with the current date in their name under /var/www/html/logs/.
aws s3 cp /var/www/html/logs/log-$(date +%Y-%m-%d).log s3://application-log/log-$(date +%Y-%m-%d)-$instance_id.log
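Before scheduling the script, it is worth confirming the prerequisites mentioned above. A quick check, assuming the bucket is named application-log as in the command and the instance has an IAM role (or configured credentials) with access to it:

# Confirm the AWS CLI is installed and that credentials/role are being picked up.
aws --version
aws sts get-caller-identity

# List the target bucket to verify the instance can reach it.
aws s3 ls s3://application-log/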

Now, add the script to crontab/cron so it runs at a regular interval, for example at the 50th minute of every hour.

# Runs preserve_logs.sh at minute 50 of every hour; use the full path to the script so cron can find it.
50 * * * * preserve_logs.sh
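To install the entry, edit the crontab of a user that has access to the script and the AWS credentials. A minimal sketch (the script path and log file below are placeholder examples, not from the original setup):

# Open the current user's crontab for editing and paste the entry:
crontab -e

# Example entry with an explicit script path and output redirected to a log file for easier debugging:
50 * * * * /bin/bash /home/ubuntu/preserve_logs.sh >> /var/log/preserve_logs_cron.log 2>&1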

Also, if you want to move the files/logs to S3 when the server gets shut down or terminated, create the file shutdown-hook.conf in /etc/init (i.e. /etc/init/shutdown-hook.conf) if it does not exist. If it is already present, edit it, add the lines below, and save.

description "run at shutdown"
start on starting rc
task
# Below is the path for the cron/shell script that you created above.
exec /bin/bash preserve_logs.sh
description "run at shutdown" start on starting rc task # Below is the path for the cron/shell script that you created above. exec /bin/bash preserve_logs.sh
description "run at shutdown"
start on starting rc
task
# Below is the path for the cron/shell script that you created above.
exec /bin/bash preserve_logs.sh
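After saving the file, you can ask Upstart to pick up the new job and confirm it is registered. This assumes an Upstart-based system such as Ubuntu 14.04 or earlier:

# Reload Upstart's job configuration and check that the shutdown hook is listed.
sudo initctl reload-configuration
initctl list | grep shutdown-hook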

So now, with the above in place, preserve_logs.sh will run and move the specified files/logs to S3 before the server terminates.
I hope this helps you back up files to AWS S3 or any other external path you define in the script.
