09-28-2011 12:19 PM
Hello guys,
I'm upgrading BE 12.5 to 2010 and want to ask you about backup strategy.
Currently I have full backup jobs set on Fridays and incremental backups Monday through Thursday. So, if I need a full restore on a Wednesday, for example, I have to restore last Friday's full backup and then the Monday, Tuesday, and Wednesday incrementals in sequence.
Now, with BE 2010 I'm planning to use deduplication. Can you explain to me how my backup plan would change according to the following statement:
"Because continuous data protection backs up data residing on file servers as it changes, open files are protected, and INCREMENTAL or DIFFERENTIAL backups ARE NOT NEEDED".
I believe to start I would need to do a full backup. Then I guess I would need to set up daily deduplicated backups... Is that correct?
If so, and I need a full restore a week after my original full backup, would I need to start restoring from the full backup and then apply each deduplicated backup up to the restore date? Or do I just restore from the last deduplicated backup to get the full dataset as of that date?
If the last statement is correct, do I ever need to do a full backup again?
I know I asked a lot of "dummy" questions, but keep in mind I have no experience with BE 2010 (only BE 12.5), and I don't have test servers to play with, so I have to set everything up right in the production environment.
Thank you,
Mike
09-28-2011 12:30 PM
With the Deduplication Option you may run full backups every day. The first backup to the dedup folder will back up all the files and folders in the selection list; subsequent full backups of the same selection list will store only the modified data. So at restore time you only have to select the latest backup set.
Note: Dedup is used to save space. A lot of processing happens in the background, so the backup will be time consuming.
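To illustrate why the latest backup set is enough on its own, here is a toy sketch of block-level dedup (my own illustration of the general technique, not how BE 2010 actually implements it; the class and method names are made up):

```python
import hashlib

class DedupStore:
    """Toy content-addressed block store: only unseen blocks consume space."""
    def __init__(self, block_size=4):
        self.block_size = block_size
        self.blocks = {}          # hash -> block bytes, each stored once
        self.backup_sets = []     # each backup set = ordered list of block hashes

    def backup(self, data: bytes):
        """A 'full' backup: every block is hashed, but only new blocks are stored."""
        recipe = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            h = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(h, block)   # stored only if not already present
            recipe.append(h)
        self.backup_sets.append(recipe)
        return recipe

    def restore(self, set_index=-1) -> bytes:
        """Restoring needs only ONE backup set: its recipe points at all the blocks."""
        return b"".join(self.blocks[h] for h in self.backup_sets[set_index])

store = DedupStore()
store.backup(b"AAAABBBBCCCC")              # first full: 3 unique blocks stored
store.backup(b"AAAAXXXXCCCC")              # second full: only 1 new block stored
assert store.restore() == b"AAAAXXXXCCCC"  # latest set alone rebuilds the data
assert len(store.blocks) == 4              # 3 + 1, not 6
```

The key point: every backup set is a complete "recipe" for the data, so a restore never has to replay a chain of sets the way a full-plus-incrementals restore does.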
09-28-2011 12:40 PM
"Continuous data protection" is a function of the Continuous Protection Server software and is separate from the Deduplication Option.
Full and incremental backup schedules should remain the same; simply change your target to the deduplication storage folder.
If full backups are processed every day, then the entire selection is scanned for changed data. By replacing the everyday full with an incremental, the Remote Agent for Windows Servers first processes the selection list for changed files, either by archive bit or by modified time. After the changed files are identified, the backup starts and deduplication happens.
Also, adding client-side deduplication will help with the processing of data by offloading it to the remote machines.
Please review the deduplication and client-side requirements before installing.
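A minimal sketch of the pre-filtering idea described above (helper names are my own, not BE's API; mtime is used here because the NTFS archive bit isn't portable):

```python
import os

def changed_since(path: str, last_backup_time: float) -> bool:
    """Incremental pre-filter: keep only files modified since the last backup.
    (BE can also use the archive bit; mtime is used here for portability.)"""
    return os.path.getmtime(path) >= last_backup_time

def select_for_backup(root: str, last_backup_time: float):
    """Walk the selection list and keep only changed files.
    Only these survivors go on to block-level dedup processing."""
    selected = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            if changed_since(full, last_backup_time):
                selected.append(full)
    return selected
```

With this filter in front, the expensive hashing and dedup work only runs on files that actually changed, which is the time saving the post describes.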
09-28-2011 07:32 PM
I think it would be easier if you think of the dedup folder as a special type of storage media. What you learned about restoring full and incremental backups in BE 12.5 still applies. As a storage media, it only affects your backup strategy in a small way. Because dedup stores only the changed data blocks, you can just do full backups. Subsequent full backups are like incremental backups in storage terms, because only the changed blocks are stored. However, a full backup means that all the files are processed, whereas an incremental backup only processes the modified files.
09-29-2011 07:45 PM
I disagree with you on this point: "Subsequent full backups are like incremental backups"
A full backup is still a full backup. The entire selection list has to be scanned for changes via the deduplication process.
However, incorporating incremental backups will pre-filter only the changed files for the deduplication process to scan, reducing both time and storage costs.
09-29-2011 08:06 PM
If you read my post carefully, you would see that I am referring to the back-end storage standpoint, i.e. only the modified blocks from the subsequent full backup are stored. I did caution the user that the front-end processing remains the same.
10-04-2011 02:44 AM
Hello guys,
I'm also confused about the deduplication folder and backup method. So, if I understand correctly:
Is that right?
10-10-2011 08:54 AM
"Subsequent full backups are like incremental backups"
No, they are not like incremental backups.
An incremental backup of a file is triggered by a change to the file: either the archive bit is set or the modified time has changed.
A full backup disregards the archive bit/modified time and backs the file up regardless.
This whole-file processing happens before, and on top of, block-level change checking and hash processing.
Saying they are the same is incorrect.
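To make the processing-cost difference concrete, here is a toy model (my own illustration, not BE internals): a full backup hashes every block of every file, while an incremental never even opens unchanged files:

```python
BLOCK = 4  # toy block size in bytes

def blocks_hashed_full(files):
    """Full backup: every block of every file is read and hashed."""
    return sum((size + BLOCK - 1) // BLOCK for _name, size, _mtime in files)

def blocks_hashed_incremental(files, last_backup_time):
    """Incremental: only files changed since the last backup are processed at all."""
    return sum((size + BLOCK - 1) // BLOCK
               for _name, size, mtime in files if mtime >= last_backup_time)

# Each file is (name, size_in_bytes, mtime); only one of three changed since t=100.
files = [("a.doc", 16, 50), ("b.doc", 16, 50), ("c.doc", 16, 150)]
assert blocks_hashed_full(files) == 12          # 4 + 4 + 4 blocks hashed
assert blocks_hashed_incremental(files, 100) == 4
```

Storage consumed may end up similar either way, but the front-end work (reads and hashes) scales with the whole selection for a full and only with the changed files for an incremental.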