Need to fetch a report for all images on a Data Domain with size details

nikhilg17
Level 2

Hi All, we have a Solaris master server running NetBackup 7.6.2. Backups are written to a Data Domain and later duplicated to tape drives.

One of our Data Domains has exceeded its threshold limit, and we now want to clean up some data in order to release space on it.

My management wants a report showing the list of all images residing on the DD with their respective sizes, so that they can then decide what should be deleted from the DD and what should not.

I tried to fetch some data from the catalog, but it does not give me size details. We also have OpsCenter Analytics for reporting; I tried to pull relevant reports from it, but without success.

Please help me and let me know how I can fetch such a report.

Regards

Nikhil Garg

8 Replies

sdo
Moderator
Partner    VIP    Certified

OS family, OS version, and OS patch level of the master?

Exact NetBackup version?

sdo
Moderator
Partner    VIP    Certified

Try an 'nbdevquery' command to retrieve the 'media' name of the DD, which should be something like '@zzzzz', and then use that '@zzzzz' name in a bpimmedia command to list all images residing on that media (i.e. the OST device).
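A minimal sketch of those two steps, assuming the disk pool uses the DataDomain storage server type; the '@zzzzz' media ID below is just a placeholder for whatever value nbdevquery actually returns in your environment:

# List the disk volumes for the DataDomain storage type; the output includes the disk media ID (e.g. @zzzzz)
nbdevquery -listdv -stype DataDomain -U

# List all images residing on that OST media ID
bpimmedia -mediaid @zzzzz -L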

rk1074
Level 6
Partner

Go to the Catalog section of the GUI.

Select Disk (Disk Type as OST) and select the disk pool for your DD. Enter the date range and all clients.

This will show you all the backup images on the disk pool.

Now copy all the image backup IDs into a file and run a for loop to check the size of each image using bpimagelist -backupid <BID> -U.
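A rough sketch of that loop, assuming the backup IDs copied from the Catalog view have been saved one per line in a file (ids.txt is just a placeholder name):

# Print the details for each backup ID; the -U output includes the image size in kilobytes
while read BID
do
  bpimagelist -backupid "$BID" -U
done < ids.txt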

 

Nicolai
Moderator
Partner    VIP   

Use an SSH client that offers logging of text, e.g. PuTTY, or ssh piped through tee.

Then log on to the Data Domain and run the command:

filesys show compression /data/col1/{DD storage unit name} recursive no-sync

This will provide a list of all the files stored on the Data Domain. Some of the files are NetBackup meta files; filter those out using grep, then import the result into Excel for the data processing.
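A hedged sketch of that workflow; the 'sysadmin@dd-host' login, the file names, and the grep pattern for the meta files are assumptions you will need to adjust to your own environment and to what the output actually looks like:

# Run the command on the Data Domain and keep a local copy of the output with tee
ssh sysadmin@dd-host "filesys show compression /data/col1/{DD storage unit name} recursive no-sync" | tee dd_compression.txt

# Drop the NetBackup meta file entries (example pattern only) before importing the rest into Excel
grep -v -i 'hdr' dd_compression.txt > dd_images.txt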

But please be aware that dedupe statistics are calculated at the point of ingest; they are not recalculated as other backups are expired or moved.

See also:

Understanding DataDomain Compression

https://community.emc.com/docs/DOC-32028

Accepted Solution

RiaanBadenhorst
Moderator
Partner    VIP    Accredited Certified

Give this a try

 

bpimmedia -dp DiskPoolName -stype DataDomain -l -legacy | awk '{ if ( $1 == "IMAGE" ) print "bpimagelist -backupid "$4" -l" }'|sh | awk '{ if ( $1 == "IMAGE" ) print $6,$19 }'

 

I hope you have Linux/UNIX :p

Nicolai
Moderator
Partner    VIP   

Just one more note.

Go for removing the dirty dozen: the images with the lowest deduplication rates. If you remove a 100 GB backup image, it does not release 100 GB on the Data Domain if almost all of the other backup images reference the same data blocks.

nikhilg17
Level 2

Thanks a lot to all of you. What you all suggested worked for me, but the easiest and best solution was that script, so I am selecting it as the solution. Thanks again :)

Nicolai
Moderator
Partner    VIP   

Just one more note: while Riaan's script works well for front-end capacity reporting, it does not tell you how well or poorly the images deduplicate on the Data Domain.

If you have space constraints, remove the backup images that deduplicate worst, not the ones that are largest according to NetBackup; the largest images may actually deduplicate at very good ratios.