The attached UNIX bash script "media_used_by_clients.txt" is a script I wrote so we could track, by server and client, what media was used. This gives us a daily, per-server history of which tapes are needed in the event of a disaster.

We have this set up as a crontab entry on our master servers. A suggested crontab entry would be:

10 10 * * * if [ -d /usr/openv/netbackup/db ]; then /usr/openv/nbu_scripts/media_used_by_clients > /dev/null 2>&1; fi

This crontab entry ensures that the current server is the master (useful for those with clustered masters). I place my custom scripts in the "/usr/openv/nbu_scripts/" directory, but you can place the script somewhere else.
Thanks for sharing. I have it enabled already. I'm using this on Linux and get an error on line 13. What is the 'line' command supposed to do in this script? I think that's what's giving me the error: (line>/dev/null; line>/dev/null; tee)

[root@netbackup bin]# ./
./ line 13: line: command not found
./ line 13: line: command not found
no entity was found
no entity was found
[root@netbackup bin]#
The "line>/dev/null; line>/dev/null" just sends the 1st and 2nd lines of the report to the trash so they are not processed by the rest of the script. That is where the two errors on line 13 are coming from.

Just replace line with read if you're running on Linux under a bash shell.  :)
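To illustrate the swap (the report content here is made up, not from the actual script), this is how `read` in bash discards the first two header lines the same way the UNIX `line` command does:

```shell
# Feed a made-up three-line report into a group that discards the
# first two lines with `read` (bash's stand-in for UNIX `line`)
# and passes the rest through to the next stage of the pipeline.
printf 'HEADER 1\nHEADER 2\nmedia_id A00001\n' | {
    read -r _    # discard line 1 (was: line > /dev/null)
    read -r _    # discard line 2 (was: line > /dev/null)
    cat          # remaining lines continue down the pipeline
}
```

Only `media_id A00001` survives, which is what the rest of the script expects to process.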

Hey Don...............


Check out these for loops  ::--->

Which tapes the clients have written to in the past 24 hrs ::--->

for i in `$ADMINCMD/bpclclients -allunique -noheader | awk '{print $3}'`
do
        echo "$i \c"    # ksh-style echo; in bash use: printf "%s " "$i"
        $ADMINCMD/bpimagelist -media -U -client $i -hoursago 24 -idonly 2>&1 | nawk 'BEGIN{RS="="} $1=$1'
done


All tapes associated with each client ::--->

for i in `$ADMINCMD/bpclclients -allunique -noheader | awk '{print $3}'`
do
        echo "$i \c"    # ksh-style echo; in bash use: printf "%s " "$i"
        $ADMINCMD/bpimmedia -U -client $i 2>&1 | sed '1,5d' | cut -c89-95 | sort | uniq | nawk 'BEGIN{RS="="} $1=$1'
done
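For anyone puzzling over the nawk 'BEGIN{RS="="} $1=$1' at the end of those pipelines: since "=" never appears in the input, the whole report becomes one awk record, and assigning $1=$1 rebuilds it with single spaces, so a column of media IDs comes out on a single line. A standalone sketch with made-up media IDs (plain awk works on Linux, where nawk may not exist):

```shell
# A made-up column of media IDs, one per line, as bpimagelist -idonly
# might emit them. RS="=" makes the entire input a single record (no
# "=" present), and $1=$1 re-joins all fields with single spaces.
printf 'A00001\nA00002\nA00003\n' | awk 'BEGIN{RS="="} $1=$1'
```

This prints the three IDs on one space-separated line, which keeps each client's tape list on one report line.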

Joe Despres

Thanks for the suggestions Joe,

I run this by creating a list of the clients for which I want to know what tapes were used:


server_list=`grep -v '^#' ${app_path}/server.list`

for f in ${server_list}
do
        echo " " >> ${report}
        echo "Media used for Exchange Mailbox Backups -- ${f} on ${today}" >> ${report}
        /usr/openv/netbackup/bin/admincmd/bpimagelist -U -d ${today} -e ${today} -client ${f} -media -pt MS-Exchange-Server >> ${report}
        echo " " >> ${report}
done

The "-pt MS-Exchange-Server" option gets me only those kinds of backups.

pt = policy type
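The loop above assumes a few variables set earlier in the script. A minimal sketch of what that setup might look like; the paths here are my guesses, not Don's actual values, and bpimagelist's -d/-e options take dates in mm/dd/yyyy form:

```shell
# Assumed setup for the snippet above -- paths are hypothetical.
app_path=/usr/openv/nbu_scripts                      # where server.list lives
today=$(date +%m/%d/%Y)                              # date format bpimagelist -d/-e expects
report=/tmp/exchange_media_report.$(date +%Y%m%d)    # daily report file
```

With -d and -e both set to ${today}, the report covers only images written today.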

Don Wilder

Hi Don

I was looking for a script that can help me fulfil my client's requirement. I found your post but couldn't completely make out the script. Please help me put one together. Below is my requirement:

My client needs a tapes-written report every Thursday at 10:00 AM, covering the period since the previous Thursday at 10:00 AM. We generally get 50+ tapes in the output. Next, we look up those tapes in the TLD, manually copy the data expiration date for all 50+ tapes, create one Excel sheet listing each tape with its expiration date, and send those details to the client. Please help me with a script that mails these details every week instead of us doing it manually. Our master server is Linux; we have 4 TLDs and need the report only for TLD(3). Moreover, it would be a great help if, after getting the list, we could eject 45 tapes at once, because the CAP capacity is only 45 tapes.
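Not a full answer, but a starting sketch for the reporting half: since the window is exactly one week ending when the job runs, a Thursday-10:00 cron entry plus bpimagelist -hoursago 168 covers "since last Thursday 10:00 AM." Every path, the mail command, and the recipient below are assumptions to adapt; the TLD(3) filtering, expiration lookup, and CAP eject steps are not covered here:

```shell
#!/bin/bash
# Hypothetical weekly wrapper -- adapt paths and recipient before use.
# Suggested crontab entry (Thursday is day 4):
#   0 10 * * 4 /usr/openv/nbu_scripts/weekly_tape_report
ADMINCMD=/usr/openv/netbackup/bin/admincmd
report=/tmp/weekly_tape_report.txt

if [ -x "$ADMINCMD/bpimagelist" ]; then
    # 168 hours = 7 days, so the window reaches back to last Thursday 10:00.
    "$ADMINCMD/bpimagelist" -media -U -hoursago 168 > "$report"
    mailx -s "Weekly tapes written report" tape-team@example.com < "$report"
fi
```

The existence check makes the script a no-op on hosts without NetBackup installed, which also makes it safe to dry-run before deploying on the master.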


Thanks in Advance,

Nitesh Kumar


Are there any scripts for a Windows master server to report the media used by clients? Thanks.

Unfortunately, I am not a PowerShell script writer. I wrote this script several years ago and still use it today, but on a UNIX master. I have always considered rewriting it for Windows but never got around to it. It should be fairly simple to write, but you will need some external applications beyond the stock NetBackup ones, such as an email program like blat to get the reports sent off.

There may also be a way to do this in OpsCenter, BUT as I already have this script, I have never seen the need to investigate further.

If I do get a Windows version up and running, it will be posted here as well.