04-12-2013 06:51 PM
04-13-2013 04:57 AM
Hi,
Actually there is no such limit as far as I know.
Taking a backup of huge data without issues always depends on how strong your NetBackup environment is and how it is built:
- hardware capacity of the media server and the client
- network speed and capacity
- disk capacity and I/O performance
- load on the master, media servers, and clients
- the way the backup is taken: from a dump on the filesystem, or through a DB agent
- the data travel path, etc.
When you have huge data it is always better to make the client a media server and send the data directly to storage (a SAN media server); that removes one hop from the data travel path and gives you better performance. The usual tuning knobs for the disk and network factors above are sketched below.
Hope this helps.
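Not from the post itself, but the standard first knobs for those disk and network factors are NetBackup's buffer touch files on the media server. A minimal sketch, assuming UNIX paths; the values are common starting points to test in your own environment, not universal recommendations:

    # SIZE_DATA_BUFFERS must match on all media servers sharing the same drives.
    echo 262144 > /usr/openv/netbackup/db/config/SIZE_DATA_BUFFERS    # 256 KB tape buffers
    echo 64 > /usr/openv/netbackup/db/config/NUMBER_DATA_BUFFERS      # deeper buffer pipeline
    echo 262144 > /usr/openv/netbackup/NET_BUFFER_SZ                  # larger TCP buffer toward clients

New values take effect on the next backup job, so test a change on one stream before rolling it out everywhere.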
04-13-2013 07:54 AM
For Oracle I would recommend going for deduplication; suggest to them that it's the best option, along with an FT media server and SAN client. For the stream count, see the RMAN sketch below.
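A minimal RMAN sketch of running the backup through the NetBackup for Oracle agent with parallel channels; the policy and master server names (oracle_prod, nbmaster1) are placeholders, not names from this thread:

    # Run on the Oracle client; all names are placeholders.
    rman target / <<'EOF'
    RUN {
      # Each channel becomes a separate NetBackup stream; add channels until
      # the drives or the network, not the database, become the bottleneck.
      ALLOCATE CHANNEL ch1 TYPE 'SBT_TAPE'
        PARMS 'ENV=(NB_ORA_POLICY=oracle_prod,NB_ORA_SERV=nbmaster1)';
      ALLOCATE CHANNEL ch2 TYPE 'SBT_TAPE'
        PARMS 'ENV=(NB_ORA_POLICY=oracle_prod,NB_ORA_SERV=nbmaster1)';
      BACKUP DATABASE FILESPERSET 4;
      RELEASE CHANNEL ch1;
      RELEASE CHANNEL ch2;
    }
    EOF

Each channel shows up as its own job on the NetBackup side, so the policy and storage unit have to allow that many concurrent jobs.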
For large SQL backups we can split the database backup into multiple streams; try that (see the batch-file sketch below).
It all comes down to multiplexing and multistreaming, with SQL Server and with Oracle too... Try it to run the database backups on multiple drives concurrently with a smaller window ;)
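To illustrate the stream split, a sketch in the NetBackup MS-SQL agent's batch-file format; the database, host, and master names are placeholders, and the keyword values are examples to tune, not recommendations. The heredoc is only to show the file contents; on a Windows client you would create the .bch file with an editor:

    # Placeholder file name and contents for the MS-SQL agent batch file.
    cat > bigdb_full.bch <<'EOF'
    OPERATION BACKUP
    DATABASE "BigDB"
    SQLHOST "sqlhost1"
    NBSERVER "nbmaster1"
    STRIPES 4
    MAXTRANSFERSIZE 6
    BLOCKSIZE 7
    ENDOPER TRUE
    EOF

STRIPES 4 sends the database as four concurrent streams; point the MS-SQL-Server policy's backup selection at the .bch file, and make sure the storage unit allows at least that many concurrent jobs.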
04-14-2013 01:40 AM
I don't think NetBackup will be the limiting factor in the years to come as far as size limits are concerned.
04-15-2013 12:13 AM
Nicolai pretty much said it for the size limit.
If you have performance issues, there are settings you can tune. If you are doing a BCV backup of the Oracle database you should use the checkpoint option in the policy; it works in 7.1 (a command-line sketch is below).
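For reference, a sketch of setting that from the command line. The -chkpt and -chkpt_intrvl options are my assumption about bpplinfo's flags, and ORA_BCV_POLICY is a placeholder policy name, so check bpplinfo's usage on your master first:

    # Assumed flags, placeholder policy name: enable checkpoint restart,
    # taking a checkpoint every 15 minutes.
    /usr/openv/netbackup/bin/admincmd/bpplinfo ORA_BCV_POLICY -modify -chkpt 1 -chkpt_intrvl 15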
04-16-2013 11:19 AM
Lots of good info, everyone.
I guess they have a performance issue with these large DB backups at this site... I have a feeling some work is going to be dumped on me.