Forum Discussion

Tom_Qin's avatar
Tom_Qin
Level 3
3 years ago

Netbackup backup to cloud - workflow

Hi team, we just configured NetBackup (with local cache) backup to S3 cloud storage. The backup size is around 10TB; however, I found that on the second day, during the backup process, around 1/3 of the data (2-3TB) was GET from the cloud. Is this normal behaviour? Downloading data from the cloud costs a lot, far more than we expected. Thanks.

4 Replies

  • Hi Tom_Qin 

    It doesn't sound normal, but there are many questions around the configuration of the media server which will affect what happens. 

    • What is the size of the local cache?
    • Is this MSDP-Cloud, Cloud Catalyst or something else?
    • What is the data type you are sending?

    Cheers
    David

    • Tom_Qin's avatar
      Tom_Qin
      Level 3

      Hi David,

      We have 3 clients backing up to the cloud; the backup sizes shown in the NBU console are around 3TB, 3TB and 4TB, using MS-Windows backups. The Master + Media Server has a local cache directory attached, with S3 storage from Alibaba Cloud.

      For the 3 questions, 

      • What is the size of the local cache? - 4TB (during the backup, cache disk usage is around 50%-60%)
      • Is this MSDP-Cloud, Cloud Catalyst or something else? - I think we belong to CC (Cloud Catalyst)?
      • What is the data type you are sending? - MS-Windows

      My guess: if the cache can't find the related block, will it download the data from the cloud?

      Thanks.

      • davidmoline's avatar
        davidmoline
        Level 6

        Hi Tom_Qin 

        4TB cache should be okay. 

        You really should be able to answer what type of storage you have configured. 

        If (and I say if) the storage is MSDP-Cloud or Cloud Catalyst, each data segment is fingerprinted. This fingerprint is compared against a cache on the local media server, and the media server only sends the data segment to the cloud if it is new. There should be no reason for any (significant) data reads from the cloud during a normal backup operation.
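
        Just to illustrate the idea (a rough sketch only, not NetBackup's actual code; the segment size, SHA-256 hash and helper names here are all assumptions):

            import hashlib

            SEGMENT_SIZE = 128 * 1024  # assumed segment size, for illustration only

            def backup_stream(data: bytes, local_fp_cache: set) -> dict:
                """Send only segments whose fingerprint is not already in the
                local cache. Note that no cloud reads (GETs) are needed here."""
                stats = {"sent": 0, "skipped": 0}
                for offset in range(0, len(data), SEGMENT_SIZE):
                    segment = data[offset:offset + SEGMENT_SIZE]
                    fp = hashlib.sha256(segment).hexdigest()
                    if fp in local_fp_cache:
                        stats["skipped"] += 1  # duplicate segment: nothing sent to cloud
                    else:
                        local_fp_cache.add(fp)
                        upload_segment_to_cloud(fp, segment)  # PUT only, no GET
                        stats["sent"] += 1
                return stats

            def upload_segment_to_cloud(fp: str, segment: bytes) -> None:
                # Placeholder for the actual S3 PUT; a hypothetical helper for this sketch.
                pass

        The point of the sketch is that the duplicate check happens entirely against the local fingerprint cache, so a healthy dedup backup should generate PUTs for new segments and essentially no GETs.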

        It might be worthwhile opening a support case to have this investigated further (maybe something is misconfigured that is causing the excessive reads from the cloud).

        David