I have a script that takes a Postgres backup at regular intervals and uploads it to S3. The script was written more than two years ago and had been working perfectly, deployed on a Linux server (Ubuntu 14.04 LTS, GNU/Linux 3.13.0-24-generic x86_64). But for the past few days the backup has not been uploaded to S3; it fails with errors. The log shows the following:
uploading test_db.backup.tar.gz to Amazon S3...............
Traceback (most recent call last):
  File "/user/projects/test_folder/test/manage.py", line 9, in <module>
    execute_from_command_line(sys.argv)
  File "/user/projects_envs/testlocal/lib/python2.7/site-packages/django/core/management/__init__.py", line 399, in execute_from_command_line
    utility.execute()
  File "/user/projects_envs/testlocal/lib/python2.7/site-packages/django/core/management/__init__.py", line 392, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/user/projects_envs/testlocal/lib/python2.7/site-packages/django/core/management/base.py", line 242, in run_from_argv
    self.execute(*args, **options.__dict__)
  File "/user/projects_envs/testlocal/lib/python2.7/site-packages/django/core/management/base.py", line 285, in execute
    output = self.handle(*args, **options)
  File "/user/projects_envs/testlocal/lib/python2.7/site-packages/django/core/management/base.py", line 415, in handle
    return self.handle_noargs(**options)
  File "/user/projects/test_folder/test/core/management/commands/db_backup.py", line 41, in handle_noargs
    self.upload_to_s3(file_name, file_path)
  File "/user/projects/test_folder/test/core/management/commands/db_backup.py", line 64, in upload_to_s3
    response = conn.put(settings.BACKUP_BUCKET_NAME, file_name, S3.S3Object(tardata))
  File "/user/projects/test_folder/test/storage/S3.py", line 192, in put
    object.metadata))
  File "/user/projects/test_folder/test/storage/S3.py", line 276, in _make_request
    connection.request(method, path, data, final_headers)
  File "/usr/lib/python2.7/httplib.py", line 973, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python2.7/httplib.py", line 1007, in _send_request
    self.endheaders(body)
  File "/usr/lib/python2.7/httplib.py", line 969, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python2.7/httplib.py", line 829, in _send_output
    self.send(msg)
  File "/usr/lib/python2.7/httplib.py", line 805, in send
    self.sock.sendall(data)
  File "/usr/lib/python2.7/ssl.py", line 329, in sendall
    v = self.send(data[count:])
  File "/usr/lib/python2.7/ssl.py", line 298, in send
    v = self._sslobj.write(data)
socket.error: [Errno 104] Connection reset by peer
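From what I understand, Errno 104 means the remote end (here, S3) closed the TCP connection while the client was still writing. On an old stack like this (Ubuntu 14.04 shipped Python 2.7.6), one plausible cause is the server no longer accepting the TLS protocol versions that the local Python/OpenSSL can offer. A small probe I ran to see what the local TLS stack supports (nothing in this snippet comes from the original script):

```python
# Probe the local TLS stack. If the interpreter cannot offer a protocol
# version the server still accepts, the server may simply reset the
# connection mid-request, which surfaces exactly as Errno 104.
import ssl

# Which OpenSSL this Python is linked against.
print("OpenSSL:", ssl.OPENSSL_VERSION)

# ssl.PROTOCOL_TLSv1_2 only exists on Python 2.7.9+ / 3.4+; if it is
# missing, this interpreter cannot speak TLS 1.2 at all.
print("TLS 1.2 available:", hasattr(ssl, "PROTOCOL_TLSv1_2"))
```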
Code
from storage import S3
from django.conf import settings

def upload_to_s3(self, file_name, file_path):
    print "uploading " + file_name + '.tar.gz' + " to Amazon S3..............."
    conn = S3.AWSAuthConnection(settings.AWS_ACCESS_KEY_ID, settings.AWS_SECRET_ACCESS_KEY)
    # get all buckets from Amazon S3
    response = conn.list_all_my_buckets()
    buckets = response.entries
    # check whether the specified bucket already exists
    flag = False
    for bucket in buckets:
        if bucket.name == settings.BACKUP_BUCKET_NAME:
            flag = True
    # if there is no bucket with that name
    if flag == False:
        print "There is no bucket named " + settings.BACKUP_BUCKET_NAME + " in your Amazon S3 account"
        print "Error: please enter an appropriate bucket name and re-run the script"
        return
    # upload the file to Amazon S3
    tardata = open(file_path + '.tar.gz', "rb").read()
    response = conn.put(settings.BACKUP_BUCKET_NAME, file_name, S3.S3Object(tardata))
    ...............
    ...............
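One thing I tried while investigating: wrapping the upload call in a retry with backoff, in case the resets were transient. This did not fix the root problem, but for reference here is the sketch; the `retry_on_reset` helper is my own, and the commented usage line assumes the `conn`, `settings`, and `tardata` names from the snippet above:

```python
import socket
import time

def retry_on_reset(fn, attempts=3, delay=2):
    """Call fn(); on a connection reset, retry with a doubling delay."""
    for attempt in range(attempts):
        try:
            return fn()
        except socket.error:  # alias of OSError on Python 3
            if attempt == attempts - 1:
                raise  # out of attempts: re-raise the last reset
            time.sleep(delay)
            delay *= 2

# Usage inside upload_to_s3 (hypothetical wrapper around the existing call):
# response = retry_on_reset(
#     lambda: conn.put(settings.BACKUP_BUCKET_NAME, file_name, S3.S3Object(tardata)))
```

With the script failing on every run, the retries were exhausted each time, which is what made me suspect the failure is deterministic rather than a flaky network.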
So what is wrong with the S3 package? Why did it suddenly stop working? Is this really related to S3, or is it something about the Linux packages on the server?