If your autoscaling setup periodically terminates old servers and launches fresh ones, you may want to preserve files from each server, such as application logs and server logs, before it goes away.
There are many ways to do this by integrating Loggly, Datadog, AWS CloudWatch, or other third-party tools: an agent installed on the server keeps pushing the specified data to the tool's storage at some interval so that you can manage it there.
There is another simple and easy way, described below: just back up the files/logs you want to keep to S3.
Create a shell script file named preserve_logs.sh with the code below. (You will need the AWS CLI set up on the server, and an S3 policy/role attached, for this to work.)
#!/bin/bash
# This is the instance id of the AWS instance on which the script runs.
# You can set this variable to anything that uniquely identifies the server.
instance_id="$(ec2metadata --instance-id)"
# Below, the application log for the current date gets copied to an S3 path for that date,
# assuming log files are created daily based on the current date under "/var/www/html/logs/".
aws s3 cp /var/www/html/logs/log-$(date +%Y-%m-%d).log s3://application-log/log-$(date +%Y-%m-%d)-$instance_id.log
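As a quick sanity check of the naming scheme, the dated source path and S3 key that the script builds can be sketched without touching AWS at all. This is only an illustration: the instance id below is a hypothetical hardcoded value (on a real EC2 instance it would come from `ec2metadata --instance-id`), and the bucket name `application-log` is the one from the example above.

```shell
#!/bin/bash
# Hypothetical instance id; on EC2 this would be "$(ec2metadata --instance-id)".
instance_id="i-0123456789abcdef0"
today="$(date +%Y-%m-%d)"
src="/var/www/html/logs/log-${today}.log"
dest="s3://application-log/log-${today}-${instance_id}.log"
# Print the command preserve_logs.sh would run, without executing it.
echo "aws s3 cp ${src} ${dest}"
```

Running this prints the exact `aws s3 cp` command, which makes it easy to verify the key format (date plus instance id) before wiring the script into cron.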
Now, add the script to crontab/cron to run at some interval, let's say at minute 50 of every hour.
# Use the script's full path here if it is not in cron's working directory.
50 * * * * /bin/bash preserve_logs.sh
Also, if you want to move files/logs to S3 when the server shuts down or terminates, create the file shutdown-hook.conf in /etc/init if it is not present (i.e. /etc/init/shutdown-hook.conf). If it is already present, edit the file, add the code below, and save.
description "run at shutdown"
start on starting rc
task
# Run the cron/shell script that you created above (use its full path).
exec /bin/bash preserve_logs.sh
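Note that the /etc/init approach above relies on Upstart, which only exists on older distributions (e.g. Ubuntu 14.04 and earlier). On systemd-based systems, a rough equivalent is a oneshot unit whose ExecStop runs during shutdown. This is only a sketch: the unit name preserve-logs.service and the script path /opt/preserve_logs.sh are assumptions, so adjust them to wherever you saved the script.

```
# /etc/systemd/system/preserve-logs.service  (hypothetical unit name and script path)
[Unit]
Description=Copy logs to S3 at shutdown
# Keep the network up until ExecStop has finished.
After=network-online.target
Wants=network-online.target

[Service]
Type=oneshot
RemainAfterExit=yes
ExecStart=/bin/true
ExecStop=/bin/bash /opt/preserve_logs.sh

[Install]
WantedBy=multi-user.target
```

After enabling it with `systemctl enable --now preserve-logs.service`, the unit is considered active from boot, so systemd invokes ExecStop when the machine shuts down.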
So now, with the above in place, preserve_logs.sh will execute and move the specified files/logs to S3 before the server terminates.
I hope this helps you back up files to AWS S3, or to any external path you define in the script.