Appliance WebUI does not display information
Hello. We are using a BE3600 R2, and whenever we use the WebUI we are unable to find details related to Disk, RAID, or Battery; it says No Disk, No RAID, and No Battery. Please find the attached screenshot for more information. Thank you in advance.

Dedupe doesn't seem to be working properly on 3600 R3 (was R2)
The appliance was an R2 and was upgraded to R3 twice, once in September and once again at the end of October; OpsCenter crashed on the first appliance. Three months ago, directly before the first upgrade, I was sitting at 2.3-2.5TB of dedupe storage at a 16.1:1 ratio, so I had about 37.03-40.25TB of hydrated data. I upgraded the appliance around Sept 16, and looking in adamm.log is where you can really see it start to grow from that point until now: within 2 weeks it climbed a TB. Now I am at 4.88TB and about to fill up the storage, and I am manually deleting the oldest backup sets to try to reclaim some space. My dedupe ratio is now at 10:1. Since I have only added 7-8TB of hydrated data, yet the space taken up by the dedupe store has doubled, something seems wrong. I have only added 3 servers to the mix since Sept 16, each of which houses about 50GB of data.

All jobs follow a basic template, with a few exceptions: 1 full a month with differentials in between, the fulls kept for 95 days and the differentials kept for 32 days. I have a ticket open on this, but the tech doesn't believe anything is wrong. We looked in the audit log and it shows media being reclaimed and deleted, but I don't think this happens consistently. Right now I have a lot of expired backup sets that are not being deleted. Some, I think, are affected by the known bug where backup sets taken with 2012 are not deleted; others I expired yesterday morning and they are still there, while others are deleted right away.

My environment is all Windows: almost half 2012 R2, almost half 2003, a couple of 2008, and 2 Windows 7 machines. Only 2 or 3 of the 20 servers being backed up have more than 100GB on them, and only 10 of them have more than 50GB.
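As a sanity check on the figures so far: hydrated (front-end) data is roughly the dedupe store size multiplied by the dedupe ratio. Plain arithmetic, nothing Backup Exec-specific:

```python
# Hydrated (front-end) data ~= dedupe store size * dedupe ratio.
def hydrated_tb(deduped_tb, ratio):
    """Approximate front-end data protected by a dedupe store of a given size."""
    return deduped_tb * ratio

# Before the upgrade: 2.3-2.5 TB stored at 16.1:1
print(hydrated_tb(2.3, 16.1), hydrated_tb(2.5, 16.1))  # ~37.03 and ~40.25 TB
# Now: 4.88 TB stored at 10:1
print(hydrated_tb(4.88, 10))                           # ~48.8 TB
```

So the store roughly doubled while the front-end data grew by only 8-11TB, which is exactly the mismatch described above.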
One of the servers with more than 100GB has 800GB of data on it, a quarter of it pictures, which I know do not dedupe well; but this server was being backed up pre-upgrade with 2012 with no issues. The other server with more than 100GB is a file server with 200-300GB on it, which resides on a Windows dedupe volume that takes it down to 150-200GB. To me there is no way, in my environment, that going from 37-40TB of hydrated data (2.3-2.4TB deduped at a 16.1:1 ratio, where it sat for the 4 months leading up to the 2014 upgrade) to 47-48TB of hydrated data (4.88TB deduped at a 10:1 ratio) within 2 months of the upgrade is correct. The only reason I haven't filled up my dedupe storage yet is that I am manually expiring backup sets, but it is still climbing, the dedupe ratio keeps going down, and I won't be able to fight it off much longer without more drastic expirations. Is anyone else experiencing something similar? Is there any more information I can give that would help?

V-79-57344-759 - Unable to complete the operation for the following reason: VFF Open Failure
A GRT backup of Exchange to deduplication disk fails with the error: V-79-57344-759 - Unable to complete the operation for the following reason: VFF Open Failure. This can be caused by low memory or disk resources. The odd point is that, for the second time, this error happened for only one of the three databases being protected. The backup strategy is Full weekly + Differential daily; when the job has failed, it has been the same database and the differential (logs) backup.

Environment: Backup Exec 2012 SP3 / BE3600 Appliance R2; Exchange 2010 SP1, no rollup (DAG, 2 servers); x64 Windows servers; set to back up from the passive copy and, if not available, try the active copy (recommended). I found this technote, but in our environment the backup is not stored on tape: http://www.symantec.com/docs/TECH127758. I would really appreciate any ideas about this issue.

3600 appliance opinions
Looking for some real-world opinions on the 3600 appliance. Thinking of using them at several branch offices throughout the country. Do they work well? How easy is setup? What would you do differently? All that stuff! I am familiar with using the BE software. Thanks, matt

Deduplication ratio with Exchange 2010 DAG
Hi. First, just to say that I am a newbie to Backup Exec deduplication. I used deduplication for the first time this weekend with my Exchange 2010 DAG. The backup completed successfully, but the dedupe ratio was 1%. Here is the info from the job log: "Deduplication Stats::scanned: 1770063047KB, sent: 1764883390KB, dedup: 1 percent". My question is: is this normal, and will I see a better dedupe ratio in the next full backups, or do I have something misconfigured?

Some info on my environment: Exchange 2010 SP3 servers configured in a DAG; Backup Exec 3600 R2 appliance, fully patched; the Exchange server OS is Windows 2008 R2 SP1. The job is configured to go to the appliance's local dedupe storage, the backup is taken from the active copy (the passive copy is not on the same site as the Backup Exec appliance), and I'm using GRT for the backup. Do you have any advice or hints for me? Thanks in advance.
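The "Deduplication Stats" line in the job log can be pulled apart to see what the percentage means: it is the share of scanned data that was not sent to the dedupe store. A minimal sketch in Python; the regex is an assumption based only on the exact format of the line quoted above:

```python
import re

# Parse the "Deduplication Stats" line quoted from the job log and
# recompute the savings from the scanned vs. sent counters.
line = ("Deduplication Stats::scanned: 1770063047KB, "
        "sent: 1764883390KB, dedup: 1 percent")
scanned_kb, sent_kb = (int(v) for v in re.findall(r"(\d+)KB", line))

saved_pct = (1 - sent_kb / scanned_kb) * 100
print(f"scanned {scanned_kb / 2**30:.2f} TB, sent {sent_kb / 2**30:.2f} TB")  # 2**30 KB = 1 TiB
print(f"actually saved: {saved_pct:.2f}% of scanned data")
```

A first full of a dataset the dedupe engine has never seen often shows savings this low; the ratio normally improves on later fulls, once most blocks already exist in the store.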
Backup Encryption Overhead
Hi. We are testing backup encryption using a BUE 2012 3600 appliance and are noticing some significant encryption overhead. Can anyone advise if there are any published statistics on what overhead is realistic?

Appliance: Windows 2008 Server; BUE 2012 with SP2 installed; 4 CPUs; 16GB RAM; 4 x 1Gb NICs but with only 1 in use; attached HP MSL2024 tape library (SAS attached).

Our test results to local appliance disk are as below (%Overhead is relative to the non-encrypted, non-compressed job):

Job type - D2D (25GB, small files, disk1)    Time (min)  Throughput (mps)  %Overhead
Encrypted, software compression              28.21       1,225.00          +45.8
Encrypted, no compression                    21.51       1,677.00          +6.16
Not encrypted, software compression          19.35       1,733.00          +3
Not encrypted, no compression                18.23       1,786.00          (baseline)

Job type - Duplicate to tape                 Time (min)  Throughput (mps)  %Overhead
Encrypted, software compression              57.47       894               +255
Encrypted, no compression                    38          916               +248
Not encrypted, software compression          44          1,495.00          +35
Not encrypted, no compression                18.55       2,283.00          (baseline)

Any thoughts or documentation (other than the admin guide) would be appreciated.
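The %Overhead figures in the disk-to-disk rows are consistent with a simple throughput comparison against the unencrypted, uncompressed baseline. This is a reconstruction of how the column appears to have been derived, not something Backup Exec reports itself:

```python
# Overhead = how much slower a job is than the baseline, derived from
# throughput: baseline_throughput / job_throughput - 1.
def overhead_pct(baseline_tp, job_tp):
    return (baseline_tp / job_tp - 1) * 100

baseline = 1786.00  # D2D: not encrypted, no compression
for label, tp in [("encrypted + software compression", 1225.00),
                  ("encrypted, no compression", 1677.00),
                  ("software compression only", 1733.00)]:
    print(f"{label}: +{overhead_pct(baseline, tp):.1f}%")
```

This reproduces +45.8 for the first disk row and roughly matches the other two; the tape rows do not match this formula as cleanly, so treat that column with care.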
Dedupe performance
A few questions on dedupe:

1) What is the client requirement to use client-side dedupe? E.g., can a client with 2GB RAM do client-side dedupe? Do I need to set anything in the Remote Agent on the client?
2) I ran the same full backup-to-dedupe-storage job twice in a row. It is a file backup job. The 1st job took 11 min and the 2nd job took 18 min. Isn't dedupe supposed to copy the changed blocks only? I would expect the 2nd job to complete very quickly, but it didn't. Am I misunderstanding the concept here? (Differential jobs do seem very quick sometimes.)
3) Should I enable Advanced Open File for dedupe backup jobs? For File/SQL/Exchange? Will it affect performance?

I'm new to this dedupe thing and trying to understand more. Thanks all. Joseph.

Remote Backup
Hello. We currently have two BE 3600 R2 appliances, both backing up all our servers. For critical servers: full backups weekly, then daily incrementals. For the rest: full backups bi-weekly, then daily incrementals. We also have another location, connected with a 20M VPN connection (our main site has 20M up, and the secondary site has 25M down).

I have been trying to figure out the best way to back up our "backups" to that site. Because I can add network storage as a destination on the BE appliance, I was thinking I could have a NAS device at the remote site and start backing up to it. But I have about 4.5TB of data in total. I could back all that up locally first and then take the device to the remote location. But then, because my local backups are essentially changing every week, how can I modify my existing backup strategy so I have local backups along with remote backups? Any suggestions would be greatly appreciated. Thanks, Gurpreet
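For weighing local seeding against pushing everything over the wire, the deciding factor is raw transfer time at 20 Mbit/s. A rough estimate, assuming the link is fully available and ignoring protocol overhead (the 100GB weekly delta is a made-up illustration, not a figure from the post):

```python
# Rough time to move a given amount of data over a WAN link.
def transfer_days(data_tb, link_mbps):
    bits = data_tb * 1e12 * 8            # decimal TB -> bits
    seconds = bits / (link_mbps * 1e6)   # Mbit/s -> bits/s
    return seconds / 86400

print(f"seeding 4.5 TB over 20 Mbit/s: ~{transfer_days(4.5, 20):.0f} days")
print(f"hypothetical 100 GB weekly delta: ~{transfer_days(0.1, 20) * 24:.0f} hours")
```

Roughly three weeks for a full seed over the link is why pre-loading the NAS locally and shipping it, then sending only the weekly changes, is usually the workable approach.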