Duplicating images between two Data Domain storages

Level 3

Hello Everyone,

I need some help duplicating images.

At the moment I have two Data Domains, a DD2500 and a DD6300, connected to the same media server.

Our idea is to migrate all backup images from the DD2500 to the DD6300. Current jobs are already being written to storage units on the new DD6300.

But I'm having problems duplicating the images. When I use the duplicate option from the Catalog view, the job processes the images one by one, and it will never finish.

Is there any other option I can use? Is this the correct way to do this task?

The old DD2500 needs to be shut down; it is going to be moved to another site.

Thank You.




I think you have no option other than manually duplicating the old data on the DD2500 to the DD6300.

I hope you have a 10G connection from the media server to both appliances.

Plan it, do it in phases, and do not run everything under one job. You can run several duplication jobs split by data, policy, host, etc., according to your environment.

Partner    VIP    Accredited Certified

It would have been a bit easier if you had asked this question before you deployed the DD6300 and started writing to it. In that case I'd have recommended starting with collection replication and mirroring the entire box, which is much faster than going image by image.

Now that you seem to be stuck with Managed File Replication, I can give you some hints, but the success of the task really depends on how quickly you can find the bottleneck in your configuration:

  1. NBU duplication through the "Catalog" tree menu works, but it's really not designed for heavy-duty processing. To achieve massive parallelism you'd need to script the migration and run bpduplicate in parallel for as many streams as the DD2500 supports.
  2. Make sure you use as many interfaces as possible for the replication traffic, and that they are all 10GbE.
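To sketch what "script the migration and run bpduplicate in parallel" could look like, here is a minimal Python driver that caps the number of simultaneous bpduplicate processes. The executable path, storage unit name, and stream limit are assumptions taken from this thread, not verified values:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Assumed values from the thread; adjust for the real environment.
BPDUPLICATE = r"D:\program files\veritas\netbackup\bin\admincmd\bpduplicate.exe"
DSTUNIT = "OST_DataDomain"   # destination storage unit (assumed name)
MAX_STREAMS = 50             # stay under the DD2500's outgoing-stream limit

def build_command(backup_id, exe=BPDUPLICATE, dstunit=DSTUNIT):
    """Assemble the bpduplicate invocation for a single backup image."""
    return [exe, "-dstunit", dstunit, "-backupid", backup_id]

def duplicate_all(backup_ids, run=subprocess.run):
    """Run one bpduplicate per image, at most MAX_STREAMS at a time."""
    with ThreadPoolExecutor(max_workers=MAX_STREAMS) as pool:
        futures = [pool.submit(run, build_command(b)) for b in backup_ids]
        return [f.result() for f in futures]
```

Unlike launching every command at once, the pool blocks once MAX_STREAMS processes are running, so the media server and the DD2500 never see the whole image list simultaneously.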

Another option to consider is "seeding" the DD6300 before the replication starts. In that case you could probably set up directory replication to a temporary folder and delete it after you run Managed File Replication through DD Boost. This is unlikely to speed up the entire process much, but it will make bpduplicate run faster because all blocks will already be present at the target.

Partner    VIP   

Also, wait it out: as backups expire, the number of images left to replicate will be greatly reduced.

Thanks everybody for the answers; they gave me some direction to follow.

I tried some steps and they didn't work.

First I tried to duplicate the images manually through the GUI. I selected about 100 images, and after some hours the duplication failed with status code 50, client process aborted.

Then I tried the same copy with bpduplicate, pointing at the images in a bidfile, and after some hours the same error message appeared, but on different images, so I assume nothing is corrupted.
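For reference, generating the bidfile and the corresponding bpduplicate invocation can itself be scripted. A minimal sketch, assuming the storage unit name and executable path from this thread (both unverified):

```python
from pathlib import Path

def write_bidfile(backup_ids, path):
    """Write one backup ID per line, the format -Bidfile expects."""
    Path(path).write_text("\n".join(backup_ids) + "\n")
    return path

def bidfile_command(
    bidfile,
    dstunit="OST_DataDomain",  # assumed storage unit name
    exe=r"D:\program files\veritas\netbackup\bin\admincmd\bpduplicate.exe",
):
    """One bpduplicate job that works through every ID in the bidfile."""
    return [exe, "-dstunit", dstunit, "-Bidfile", bidfile]
```

A single bidfile job processes its images within one job, so several smaller bidfiles run as separate jobs give you controlled parallelism.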

Then I tried a faster way: I created a .bat script with lines like these:

start cmd /k ""D:\program files\veritas\netbackup\bin\admincmd\bpduplicate.exe" -dstunit "OST_DataDomain" -backupid XXXXXXXXXXXXXXXXXXXX"

start cmd /k ""D:\program files\veritas\netbackup\bin\admincmd\bpduplicate.exe" -dstunit "OST_DataDomain" -backupid YYYYYYYYYYYYYYYYYY"

start cmd /k ""D:\program files\veritas\netbackup\bin\admincmd\bpduplicate.exe" -dstunit "OST_DataDomain" -backupid XXXXXXXXXXXXXXX"

I don't receive error 50, but starting about 100 images simultaneously drove my media server's CPU to 100% and froze it. Even if I stop the duplications in the Activity Monitor it stays frozen at 100%. The strange thing is that during normal backups and SLP replications I see more than 100 tasks in the Activity Monitor and the media server's CPU stays calm. Does anyone have another alternative?
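One alternative to launching ~100 `start cmd /k` windows at once is to split the image list into a small, fixed number of chunks and run one long bpduplicate job per chunk, keeping the process count low no matter how many images there are. A sketch of the chunking step (the chunk count is illustrative, not a recommendation from the thread):

```python
def chunk(items, n_chunks):
    """Split items into at most n_chunks roughly equal, ordered groups."""
    n_chunks = max(1, min(n_chunks, len(items)))
    size, rem = divmod(len(items), n_chunks)
    out, start = [], 0
    for i in range(n_chunks):
        # The first `rem` chunks each take one extra item.
        end = start + size + (1 if i < rem else 0)
        out.append(items[start:end])
        start = end
    return out
```

Each chunk can then be written to its own bidfile and passed to `bpduplicate -Bidfile`, giving e.g. 4 long-running jobs instead of 100 short-lived windows.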



Partner    VIP    Accredited Certified

The DD2500 supports only 90 outgoing replication streams. This is why I recommended consulting the Data Domain guides before throwing batch jobs at it and hoping NBU will figure it out. I'd reduce the number of batch jobs to 50 and see how it goes.

Also, it's worth checking whether you have other constraints such as CPU and memory.