Backup is one of the fundamental security elements that helps meet the requirements defined in the Business Continuity and Disaster Recovery processes.
When implementing backup policies, you should keep several key aspects in mind, among them two time-based parameters:
- RTO (recovery time objective) – the time needed to recover data after a failure (usually including the restoration of full system functionality).
- RPO (recovery point objective) – the maximum amount of data that may be lost, which is directly related to the backup mechanism and its retention.
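The link between backup frequency and RPO can be sketched in a few lines of Python. All the figures below are illustrative assumptions, not requirements from any real system:

```python
# Hypothetical example: checking a backup schedule against an RPO target.
# All numbers here are illustrative assumptions.

def worst_case_data_loss_hours(backup_interval_hours: float) -> float:
    """With periodic backups, the worst case is losing everything
    written since the last completed backup."""
    return backup_interval_hours

def meets_rpo(backup_interval_hours: float, rpo_hours: float) -> bool:
    """A schedule satisfies the RPO only if the worst-case loss
    window is no larger than the objective."""
    return worst_case_data_loss_hours(backup_interval_hours) <= rpo_hours

# A daily backup cannot satisfy a 6-hour RPO ...
print(meets_rpo(backup_interval_hours=24, rpo_hours=6))  # False
# ... but a backup every 4 hours can.
print(meets_rpo(backup_interval_hours=4, rpo_hours=6))   # True
```

The RTO, by contrast, is driven by how fast the restore itself runs, not by how often backups are taken.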
When a sudden failure causes data loss, a previously made backup lets us quickly restore the data to its state before the fault.
Why is backup an absolute “must have”?
Firstly, even the latest technologies can have a “bad day” – no solution is 100% reliable. A recent example is the Myspace portal, which did not back up its data during a migration, and 12 years of data disappeared in the blink of an eye.
Secondly, it is worth being “always insured”. Most files and documentation are now digitized. If we permanently lose a client’s or institution’s data and have no backup, we can expect little understanding on their part.
Thirdly, a backup can ease work on large projects. Each of us has experienced a software crash or power cut. Losing even a small fragment of work is irritating: reproducing it is tedious and blocks further progress on the project. And when larger amounts of files are lost, it can drive us to despair.
Fourthly, for us (Lcloud), as an authorized partner of cloud services, it is important that every solution we create is the best possible one for our customer. Therefore, in addition to the highest quality of security in accordance with best practices, we always develop a Disaster Recovery plan, which obviously includes backup.
Fifthly, as the golden rule says, “time is money”. When hardware or software fails, we do not want to spend hours reconfiguring it – and the same applies to accounts, emails, disks, and their access credentials. Another risk is data loss caused by deliberate action or employee error. It is worth keeping copies of the data in a safe place.
AWS Backup is a great solution to all these worries. It centralizes and automates the creation of backups in the AWS cloud as well as on-premises, eliminating the need to build your own Lambda-based scripts or Data Lifecycle Manager policies. With just a few clicks in the AWS console, you can build backup policies that automate backup schedules and manage storage. Interestingly, AWS Backup lets you define common policies for different types of resources: you do not need separate mechanisms for Amazon RDS, Amazon EFS, and Amazon EBS. One policy is enough – for example, a daily backup with 30 days of retention – and by adding tags you get a single mechanism that backs up data across the various storage types provided by AWS.
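The daily-backup-with-30-days-retention policy described above can be sketched with boto3. The plan name, vault name, tag key/value, and IAM role ARN below are hypothetical placeholders:

```python
# Sketch of the "one policy for many resource types" idea using AWS Backup.
# Names, tag values, and the role ARN are illustrative assumptions.

def build_daily_plan(plan_name: str = "daily-30d-retention") -> dict:
    """Request body for backup.create_backup_plan: one rule that runs
    daily and deletes recovery points after 30 days."""
    return {
        "BackupPlanName": plan_name,
        "Rules": [
            {
                "RuleName": "daily",
                "TargetBackupVaultName": "Default",
                # Run every day at 05:00 UTC.
                "ScheduleExpression": "cron(0 5 * * ? *)",
                "Lifecycle": {"DeleteAfterDays": 30},
            }
        ],
    }

def build_tag_selection(role_arn: str) -> dict:
    """Request body for backup.create_backup_selection: pick every
    supported resource (RDS, EFS, EBS, ...) tagged backup=daily."""
    return {
        "SelectionName": "tagged-resources",
        "IamRoleArn": role_arn,
        "ListOfTags": [
            {
                "ConditionType": "STRINGEQUALS",
                "ConditionKey": "backup",
                "ConditionValue": "daily",
            }
        ],
    }

# With real credentials this would be wired up roughly like:
#   import boto3
#   client = boto3.client("backup")
#   plan = client.create_backup_plan(BackupPlan=build_daily_plan())
#   client.create_backup_selection(
#       BackupPlanId=plan["BackupPlanId"],
#       BackupSelection=build_tag_selection(
#           "arn:aws:iam::123456789012:role/BackupRole"),
#   )
```

Tagging is what makes the policy generic: any resource carrying the chosen tag is picked up by the same plan, regardless of its storage type.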
The basic components of the service are:
- Recovery points – represent the content of a resource at a given time, together with metadata such as resource information, restore parameters, and tags.
- Backup vault – a logical container for recovery points that allows them to be organized and catalogued.
AWS Backup is fully automated and ensures a high level of security. It saves you time, and thanks to on-demand backups and recovery points you can restore a backup whenever you need to.
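An on-demand backup boils down to one API call that drops a new recovery point into a vault. A minimal sketch, assuming a hypothetical volume ARN, role ARN, and the default vault:

```python
# Sketch of an on-demand backup with AWS Backup. The ARNs and vault
# name below are hypothetical placeholders, not real resources.

def on_demand_backup_params(resource_arn: str, role_arn: str,
                            vault: str = "Default") -> dict:
    """Arguments for backup.start_backup_job: which resource to copy,
    which vault receives the recovery point, and which role to assume."""
    return {
        "BackupVaultName": vault,
        "ResourceArn": resource_arn,
        "IamRoleArn": role_arn,
    }

# With credentials configured, the flow would look roughly like:
#   import boto3
#   client = boto3.client("backup")
#   client.start_backup_job(**on_demand_backup_params(
#       "arn:aws:ec2:eu-west-1:123456789012:volume/vol-0abc",
#       "arn:aws:iam::123456789012:role/BackupRole"))
#   # Later, list the vault's recovery points and restore one of them:
#   points = client.list_recovery_points_by_backup_vault(
#       BackupVaultName="Default")["RecoveryPoints"]
```

Each completed job produces a recovery point in the chosen vault, which is exactly what you later browse when deciding which state to restore.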
How to create the first backup?
You start the configuration by logging in to the AWS console, where AWS Backup can be found like any other service. The entire sequence of actions boils down to the four main steps outlined in the diagram below. A detailed configuration walkthrough can be found in Jeff Barr’s blog post.
NOTE! In line with best practices, AWS guarantees the removal of expired copies in the AWS Backup service, which helps optimize the cost of using it.
Security and compliance
The service provides access control and encryption features that help protect data and meet compliance requirements. Using AWS Identity and Access Management (IAM), you can manage backup permissions, such as:
- restoring backups,
- managing backup plans,
- assigning resources to backup plans.
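A minimal IAM policy covering the three permission areas above might be sketched as follows. The action names come from the `backup:` namespace; which exact set you need (and how tightly you scope `Resource`) depends on your setup, so treat this as an illustrative starting point:

```python
import json

# Illustrative IAM policy for the three permission areas above.
# Real deployments should scope Resource and review the action list.
backup_operator_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "RestoreBackups",
            "Effect": "Allow",
            "Action": ["backup:StartRestoreJob"],
            "Resource": "*",
        },
        {
            "Sid": "ManageBackupPlans",
            "Effect": "Allow",
            "Action": [
                "backup:CreateBackupPlan",
                "backup:UpdateBackupPlan",
                "backup:DeleteBackupPlan",
                "backup:GetBackupPlan",
            ],
            "Resource": "*",
        },
        {
            "Sid": "AssignResources",
            "Effect": "Allow",
            "Action": [
                "backup:CreateBackupSelection",
                "backup:DeleteBackupSelection",
            ],
            "Resource": "*",
        },
    ],
}

# The JSON form is what you would paste into an IAM policy document.
print(json.dumps(backup_operator_policy, indent=2))
```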
When discussing security, it is worth mentioning AWS Backup’s compliance. Amazon Web Services runs a long-standing cloud compliance program and actively supports its clients in compliance activities. The AWS Backup service complies with PCI DSS, ISO 9001, 27001, 27017 and 27018, as well as HIPAA. This makes it easier to verify AWS cloud security and to adapt your own security posture to stringent requirements.
AWS Backup is available in the USA (in the North Virginia, Ohio and Oregon regions) and in Europe (in the Ireland region). Before starting work, we encourage you to check service availability in the Region Table, which is constantly updated.
How much does it cost?
Thanks to the pay-as-you-go model, you pay only for the gigabytes actually used (billed per GB-month of storage and per GB restored). There is no minimum fee and there are no configuration fees. AWS Backup pricing depends on the amount of space your backup data occupies. The first backup of an AWS resource stores a full copy; each subsequent backup stores only the changed part of the resource.
In terms of billing, AWS Backup is close to Amazon S3, one of the cheapest AWS storage services. It is also worth mentioning the option of cold storage for Amazon EFS backups. In practice, this is Amazon Glacier, a service even cheaper than the aforementioned S3. Remember, however, that restoring data from Glacier takes longer than from S3.
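The incremental model above can be turned into a back-of-the-envelope cost estimate. The per-GB-month rates below are made-up placeholders, not actual AWS prices; check the current price list before budgeting:

```python
# Back-of-the-envelope cost model for incremental backups.
# The rates below are made-up placeholders, NOT actual AWS prices.

WARM_RATE = 0.05  # assumed $/GB-month for warm storage
COLD_RATE = 0.01  # assumed $/GB-month for cold storage (e.g. EFS/Glacier)

def stored_gb(full_backup_gb: float, daily_change_gb: float,
              retention_days: int) -> float:
    """Only the first backup is full; each later daily backup stores
    just the changed data, so storage grows by the daily delta."""
    return full_backup_gb + daily_change_gb * (retention_days - 1)

def monthly_cost(gb: float, rate_per_gb_month: float) -> float:
    return gb * rate_per_gb_month

# A 100 GB resource changing 2 GB/day under 30-day retention:
gb = stored_gb(full_backup_gb=100, daily_change_gb=2, retention_days=30)
print(gb)  # 158.0 GB retained, far less than 30 full copies (3000 GB)
print(monthly_cost(gb, WARM_RATE))  # warm-storage cost at the assumed rate
print(monthly_cost(gb, COLD_RATE))  # cold storage is cheaper, but restores
                                    # from it take longer
```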
Why is it worth it?
To summarize, implementing AWS Backup is a response to the worries mentioned at the beginning. First of all, it lets us work on projects without the stress of irretrievable data loss. It reduces system/software configuration time after a sudden failure or a required update, and it saves the customer’s time during an unexpected power cut or malfunction. Importantly, when using AWS Backup with Amazon EFS, the backup has no effect on performance and Burst Credits are not consumed.
The list of advantages could go on and on.
In addition, we encourage you to download our infographic, where you will find AWS Backup in a nutshell.