How to Backup your QNAP NAS to AWS S3 using HBS3
Introduction:
Using the cloud for storage is nothing new; cloud services have been around for a while, and since the AT&T PersonalLink service launched in 1994, cloud storage has come a long way.
With that said, if you own a NAS device such as a QNAP NAS, backing it up to online cloud storage is worth considering, given that it saves you a lot of the overhead of maintaining and operating your own backup storage.
Why would you want to Backup your NAS to AWS S3?
Amazon S3 (Simple Storage Service) is a highly durable, scalable, and available cloud object storage service, which makes it a great cloud backup target.
Additionally, QNAP’s HBS3 (Hybrid Backup Sync) simplifies and schedules backups to Amazon S3.
Setting Up your Backup:
Step – 1 – Log in to your AWS console and create a new S3 Bucket:
The S3 bucket name must be globally unique; for this example it’s “mynas-backup-1996”.
Scrolling down, there are more options to configure while creating the bucket, such as Versioning, Object Lock, and public access settings.
In this example I will leave everything at its defaults and choose “Create Bucket”:
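Bucket names must follow S3’s naming rules as well as being globally unique: 3–63 characters; lowercase letters, digits, hyphens, and periods; starting and ending with a letter or digit. As a quick local sanity check before heading to the console, here is a minimal Python sketch (it covers only these basic rules, not every S3 restriction):

```python
import re

def is_valid_bucket_name(name):
    """Basic S3 bucket-name check: 3-63 chars, lowercase letters,
    digits, hyphens, or periods; must start and end with a letter or digit."""
    return bool(re.fullmatch(r"[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]", name))

print(is_valid_bucket_name("mynas-backup-1996"))  # True
print(is_valid_bucket_name("MyNAS_Backup"))       # False (uppercase and underscore)
```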
Now that the bucket is created, we need to create an IAM user that HBS3 will use to write files to the bucket.
Step – 2 – Create an IAM User:
Head to the IAM console, then Users, then “Create user”:
This user doesn’t need console access, so you can leave the option “Provide user access to the AWS Management Console” unchecked, then click “Next”:
For permissions, I will create a custom policy, since it’s better to stick to the Principle of Least Privilege: grant the user access only to the specific resources and actions required to complete the task.
Now, on the permissions page, choose “Attach policies directly”, then “Create policy”:
We will allow this user to perform the actions (ListBucket, GetObject, DeleteObject, PutObject, ListAllMyBuckets, GetBucketLocation) on this bucket and its objects; the policy should look like this:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "BucketLevel",
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::{Bucket Name}"]
    },
    {
      "Sid": "ObjectLevel",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": ["arn:aws:s3:::{Bucket Name}/*"]
    },
    {
      "Sid": "ListAndGetBucketLocation",
      "Effect": "Allow",
      "Action": ["s3:ListAllMyBuckets", "s3:GetBucketLocation"],
      "Resource": ["*"]
    }
  ]
}
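If you prefer to generate this document programmatically rather than hand-editing the {Bucket Name} placeholder, here is a small Python sketch that builds the same policy for a given bucket name (the bucket name passed in below is just this article’s example):

```python
import json

def build_hbs3_policy(bucket_name):
    """Build the least-privilege IAM policy from this article
    for the given S3 bucket name."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "BucketLevel",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": [f"arn:aws:s3:::{bucket_name}"],
            },
            {
                "Sid": "ObjectLevel",
                "Effect": "Allow",
                "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
                "Resource": [f"arn:aws:s3:::{bucket_name}/*"],
            },
            {
                "Sid": "ListAndGetBucketLocation",
                "Effect": "Allow",
                "Action": ["s3:ListAllMyBuckets", "s3:GetBucketLocation"],
                "Resource": ["*"],
            },
        ],
    }

# Print the JSON ready to paste into the IAM policy editor.
print(json.dumps(build_hbs3_policy("mynas-backup-1996"), indent=2))
```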
For more information about IAM policies, check the AWS documentation.
To enter this policy, click “Create policy”; this opens the policy editor. Choose the JSON tab and paste the above policy into the editor.
After that, click “Next”, give the policy a name, then scroll down and click “Create Policy”:
Once the policy is created, we can attach it to the user:
Then click “Next”, review, and finally “Create User”:
Step – 3 – Create Access Key for the IAM User:
In the IAM console, go to the Users section and click the username you created:
Then go to the “Security credentials” tab, scroll down to “Access keys”, and click the “Create access key” button:
The options should be as follows:
Then click “Next”.
Optionally, you can add tags to describe the access key, then click “Create access key”.
Once created, you can download the access key ID and secret access key as a CSV file; make sure to save it.
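If you script anything with these credentials later, you can read them from the downloaded file instead of copy-pasting. A short Python sketch (the values are fake, and the header names are an assumption based on a typical AWS “Download .csv” export; adjust them if your file differs):

```python
import csv
import io

# Example contents of the downloaded credentials file (fake values;
# header names assumed from a typical AWS "Download .csv" export).
sample_csv = "Access key ID,Secret access key\nAKIAEXAMPLEKEYID,wJalrXUtnFEXAMPLESECRET\n"

def read_access_key(csv_text):
    """Return (access_key_id, secret_access_key) from the credentials CSV."""
    row = next(csv.DictReader(io.StringIO(csv_text)))
    return row["Access key ID"], row["Secret access key"]

key_id, secret = read_access_key(sample_csv)
print(key_id)  # AKIAEXAMPLEKEYID
```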
Step – 4 – Log in to your QNAP NAS web management and Create an HBS3 Sync Job:
Open the HBS3 program:
Then head to the Sync section and create a “One-Way Sync Job”:
Now choose “Amazon S3 & S3 Compatible” as the destination:
Then copy and paste the access key ID and secret access key from the CSV file downloaded earlier into their respective fields, and click “Create”:
If the permissions are assigned correctly, creation should succeed and you can choose the bucket name from the drop-down menu:
Now choose the source folder on your local NAS and the destination folder in the S3 bucket:
* You can map up to 16 folders to S3 per job.
Now, you can schedule the Sync Job however you see fit:
For the rules, I will leave everything at its defaults, but feel free to explore the available options:
Review and Create:
Step – 5 – Start the Sync and verify:
Click “Sync Now” to test the sync job.
I have a test.txt file inside the mapped folder:
Now we can see that the file has been copied to the S3 Bucket:
Conclusion:
Backing up your data to Amazon S3 gives you scalability and durability, and with S3 lifecycle policies you can move your data to more cost-effective storage tiers. For smaller businesses or personal data, however, it might be complex and costly, so it can also be a good idea to back up your data to an external hard drive.
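As one concrete example of the lifecycle policies mentioned above, a rule like the following (saved to a file and applied with aws s3api put-bucket-lifecycle-configuration) would transition backup objects to the lower-cost Glacier Flexible Retrieval tier after 30 days; the rule ID and the 30-day threshold here are arbitrary choices for illustration:

```json
{
  "Rules": [
    {
      "ID": "archive-old-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ]
    }
  ]
}
```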
Check out this guide on how to back up your QNAP NAS to an external hard drive.