NetBackup backup to cloud - workflow

Tom_Qin
Level 3
Hi team, we just configured NetBackup (with a local cache) to back up to S3 cloud storage. The backup size is around 10TB. However, I found that on the second day, during the backup process, roughly one third of the data (2-3TB) was read (GET) from the cloud. Is this normal behaviour? Downloading data from the cloud costs a lot, far more than we expected. Thanks.

4 REPLIES

davidmoline
Level 6
Employee

Hi @Tom_Qin 

It doesn't sound normal, but there are several questions about the media server configuration that will affect what happens:

  • What is the size of the local cache?
  • Is this MSDP-Cloud, Cloud Catalyst or something else?
  • What is the data type you are sending?

Cheers
David

Hi David,

We have 3 clients backing up to the cloud; the backup sizes shown in the NBU console are around 3TB, 3TB and 4TB, using MS-Windows backups. It is a master + media server with a local cache directory, attached to S3 storage from Alibaba Cloud.

For the 3 questions, 

  • What is the size of the local cache? - 4TB (during the backup, the cache disk is around 50%-60% used)
  • Is this MSDP-Cloud, Cloud Catalyst or something else? - I think we fall under CC (Cloud Catalyst)?
  • What is the data type you are sending? - MS-Windows

I guess that if the cache can't find the related block, it will download the data from the cloud?

Thanks.

Hi @Tom_Qin 

4TB cache should be okay. 

You really should be able to answer what type of storage you have configured. 

If (and I say if) the storage is MSDP-Cloud or Cloud Catalyst, each data segment is fingerprinted. The fingerprint is compared against a cache on the local media server, and the media server will only send the data segment to the cloud if it is new. There should be no reason for any (significant) data reads from the cloud during a normal backup operation.
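
For illustration, here is a minimal sketch in Python of that fingerprint-and-compare idea (my own simplified example with made-up names, not NetBackup's actual code): each segment is hashed, looked up in the local cache, and only uploaded when the fingerprint is new.

import hashlib

local_fingerprint_cache = set()   # stands in for the media server's fingerprint cache
cloud_store = {}                  # stands in for the S3 bucket

def backup_segment(segment: bytes) -> str:
    # Fingerprint the segment, then upload it only if the fingerprint is new.
    fingerprint = hashlib.sha256(segment).hexdigest()
    if fingerprint not in local_fingerprint_cache:
        cloud_store[fingerprint] = segment        # PUT to the cloud
        local_fingerprint_cache.add(fingerprint)
        return "uploaded"
    return "deduplicated"                         # no cloud read or write needed

print(backup_segment(b"block-A"))   # uploaded
print(backup_segment(b"block-A"))   # deduplicated, nothing sent or fetched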

It might be worthwhile opening a support case to have this investigated further (maybe there is something misconfigured that is causing the excessive reads from the cloud).

David

I got an update from Veritas. When the local cache is full and there is no reference in the cache, it will get those blocks of data from the cloud back into the cache. That is what results in the large amount of data downloaded from the cloud.
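
To illustrate that explanation (a rough sketch with made-up names, not the real MSDP implementation): when the cache is bounded and has to evict entries, any segment whose reference was evicted causes a GET from the cloud, and those GETs add up to the downloads we are seeing.

from collections import OrderedDict

CACHE_CAPACITY = 3                      # deliberately tiny, to force eviction
cache = OrderedDict()                   # fingerprint -> cached reference
cloud_get_count = 0                     # what ends up on the cloud bill

def lookup(fingerprint: str) -> str:
    # Return a cached reference; on a miss, GET it from the cloud and evict if full.
    global cloud_get_count
    if fingerprint in cache:
        cache.move_to_end(fingerprint)  # recently used, keep it
        return cache[fingerprint]
    cloud_get_count += 1                # cache miss -> GET from the cloud
    cache[fingerprint] = f"ref({fingerprint})"   # pretend this came back from S3
    if len(cache) > CACHE_CAPACITY:
        cache.popitem(last=False)       # evict the least recently used entry
    return cache[fingerprint]

for fp in ["a", "b", "c", "d", "a"]:    # "a" has been evicted by the time it recurs
    lookup(fp)
print(cloud_get_count)                  # 5 GETs; a larger cache would have avoided the last one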