There are some tools out there that are simply must-haves, yet you never get to hear about them.  One of those tools is Log Parser, a tool developed by Microsoft which you can use to analyse log files; particularly, in my case, IIS log files on Enterprise Vault servers.


One of the things that I wanted to do with some customer data recently was to count the number of occurrences per hour of a particular web page.  Messing about in Excel after importing the whole file made it possible, using filters and so on, but I couldn't help thinking that there must be a better way.  Well, there is: it's Log Parser.  This article will give you a short walk-through of some of the possibilities of using Log Parser.  Try them in your own lab environment, and then work on some copies of log files from your production machines.

Where do I get it from?

You can download Log Parser from Microsoft.  It is only 1.5 MB in size, and once installed there are some samples which are well worth reviewing.

What can I do with it?

You can build up simple SQL queries against a log file or a series of log files.  There are some tools out on the internet which will help you build these queries, but they're not particularly hard to put together just using trial and error on the command line.  There are also a bunch of samples installed by Log Parser; they are well worth a look.

Task Zero

One of the best things to do first is to make sure that the Log Parser binary is in your PATH environment variable, so that you can run it from anywhere on your machine.
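For the current command prompt session, that can be as simple as the line below.  The folder name is the typical default install location for Log Parser 2.2; adjust it to wherever you installed the tool:

```
:: Add the Log Parser install folder to PATH for this cmd session
:: (default install folder assumed; change it to match your machine)
set "PATH=%PATH%;C:\Program Files (x86)\Log Parser 2.2"
```

You can also add the folder permanently via the System Properties / Environment Variables dialog so it survives reboots.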

My first task

My task was to use Log Parser to look at an IIS log and get the number of requests to a certain page from a specific user, 'alberto'.

This is the command that I used:

logparser -i:iisw3c "select time, s-ip, cs-method, cs-uri-stem, cs-uri-query,  sc-status, cs(User-Agent) from ex110427.log where cs-uri-stem like '%download.asp%' and cs-username like '%alberto%'" -q on

What this does is tell Log Parser that the file is to be treated as an IIS log file; then you have the SQL statement; and finally I turned "quiet mode" on.  The SQL query is just that: a SELECT statement to get the desired fields, with the specified criteria.  The criteria have a couple of interesting parts:

First, the "FROM" part of the query is where you specify the file(s) you want the query to run against.  In my first example it's just one log file, called ex110427.log.

Second, the "WHERE" part of the query is regular SQL-type syntax, where you filter down the results.
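If it helps to see what that WHERE clause is doing under the covers, here is a minimal Python sketch of the same substring filtering.  The field order and the sample lines are invented for illustration only; real IIS W3C logs declare their actual field order in a #Fields: header at the top of the file:

```python
# Sketch of the LIKE '%...%' filtering Log Parser performs, on
# made-up log lines with fields: time, cs-method, cs-uri-stem, cs-username.
sample_lines = [
    "09:49:47 GET /EnterpriseVault/download.asp alberto",
    "09:50:02 GET /EnterpriseVault/search.asp alberto",
    "09:51:13 GET /EnterpriseVault/download.asp bob",
]

def matches(line):
    time, method, uri_stem, username = line.split()
    # Same intent as: cs-uri-stem LIKE '%download.asp%'
    #            AND  cs-username LIKE '%alberto%'
    return "download.asp" in uri_stem and "alberto" in username

hits = [line for line in sample_lines if matches(line)]
for line in hits:
    print(line)   # only the alberto + download.asp line survives
```

Log Parser saves you writing any of this, of course; the point is just that LIKE '%x%' is a plain "contains" match.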

Part of the output:

09:49:47 GET /EnterpriseVault/download.
09:49:51 GET /EnterpriseVault/download.
09:49:54 GET /EnterpriseVault/download.
10:07:20 GET /EnterpriseVault/download.
10:07:24 GET /EnterpriseVault/download.
10:07:26 GET /EnterpriseVault/download.
10:07:47 GET /EnterpriseVault/download.
10:07:50 GET /EnterpriseVault/download.
10:07:52 GET /EnterpriseVault/download.
10:13:43 GET /EnterpriseVault/download.
10:13:45 GET /EnterpriseVault/download.
10:16:20 GET /EnterpriseVault/download.
10:16:56 GET /EnterpriseVault/download.
10:17:00 GET /EnterpriseVault/download.
10:17:06 GET /EnterpriseVault/download.
10:18:41 GET /EnterpriseVault/download.
10:25:34 GET /EnterpriseVault/download.
10:25:39 GET /EnterpriseVault/download.
10:25:44 GET /EnterpriseVault/download.
10:26:14 GET /EnterpriseVault/download.
10:26:17 GET /EnterpriseVault/download.
10:26:21 GET /EnterpriseVault/download.

My second task

My second task was to refine the query from before, to list the number of hits to download.asp by 'alberto', per hour.

This is the query that I used:

logparser -i:iisw3c "select TO_STRING(time, 'HH') as hour, count(*) as Hits from ex110427.log where cs-uri-stem like '%download.asp%' and cs-username like '%alberto%' group by Hour" -q on

You can see with this one that we're doing standard SQL: building a computed column called "Hits", and grouping by hour to get the information we want to see.
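The grouping itself is nothing magic.  Here is a small Python sketch of the same idea, using a handful of made-up request times: take the 'HH' prefix of each time (which is what TO_STRING(time, 'HH') produces) and count how many requests fall into each bucket:

```python
from collections import Counter

# Hypothetical times of matching requests, for illustration only.
times = ["09:49:47", "09:49:51", "09:49:54", "10:07:20", "12:30:01"]

# t[:2] is the 'HH' part, like TO_STRING(time, 'HH'); Counter does
# the COUNT(*) ... GROUP BY Hour part.
hits_per_hour = Counter(t[:2] for t in times)
for hour in sorted(hits_per_hour):
    print(hour, hits_per_hour[hour])
```

This prints one "HH count" line per hour, which is exactly the shape of output Log Parser gives below.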

Here is an example of the output on a simple file:

09 3
10 19
12 1

My third task

My third task was to query all users' hit counts against download.asp over a day.  The query for that is built on the second query above:

logparser -i:iisw3c "select TO_STRING(time, 'HH') as hour, count(*) as Hits from iis_ex110214.log where cs-uri-stem like '%download.asp%' group by Hour" -q on

This, for now, is exactly what I wanted to see.  I can quickly see busy and quiet times:

00 625
01 3434
02 1814
03 2832
04 783
05 610
06 1255
07 1603
08 780
09 1160
10 386
11 534
12 94
13 105
14 62
15 72
16 53
17 66
18 86
19 17
20 5
21 4
22 7
23 89


My Fourth Task

My fourth task is to see how much data is pulled down with those download requests.  This is a simple extension of the previous query:

logparser -i:iisw3c "select TO_STRING(time, 'HH') as hour, count(*) as Hits, sum(DIV(sc-bytes, 1024)) as [Kb Received] from iis_ex110215.log where cs-uri-stem like '%download.asp%' group by Hour" -q on

00 134  34359
01 1035 259501
02 1331 288670
03 2051 494134
04 549  156736
05 1062 246516
06 732  241699
07 836  243887
08 652  181973
09 801  218384
10 1122 305673
11 521  126194
12 604  91505
13 632  104211
14 127  30332

You can see that I have added sum(DIV(sc-bytes, 1024)) to give me KB Received.  You can go one further to get MB Received, i.e. sum(DIV(DIV(sc-bytes, 1024), 1024)).
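To sanity-check the units: DIV is integer division, so each extra DIV(..., 1024) drops you one unit (bytes to KB, KB to MB).  The same arithmetic in Python, with a made-up response size:

```python
# DIV in Log Parser is integer division; same arithmetic in Python.
sc_bytes = 3_145_728           # hypothetical sc-bytes value (3 MB in bytes)
kb = sc_bytes // 1024          # DIV(sc-bytes, 1024)              -> 3072
mb = sc_bytes // 1024 // 1024  # DIV(DIV(sc-bytes, 1024), 1024)   -> 3
print(kb, mb)                  # prints: 3072 3
```

Because the division truncates, per-row remainders are discarded before the SUM, so the totals are slightly lower than dividing the summed byte count at the end; for capacity-planning purposes that difference rarely matters.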

My Fifth Task

My fifth task is to see how many "uploads" are happening on the Enterprise Vault server.  What I mean by this is people who drag and drop items into Virtual Vault, and then either wait for a synchronisation to take place, or manually initiate one.

Here is the query that I used to find that out:

logparser -i:iisw3c "select TO_STRING(time, 'HH') as hour, count(*) as Hits from iis_ex110215.log where cs-uri-stem like '%uploaditem.aspx%' group by Hour" -q on

Here is some output:

00 231
01 1586
02 1365
03 561
04 2669
05 4332
06 1083
07 206
08 702
09 448
10 2241
11 518
12 938
13 297
14 215

Again you can quickly see where the busy patches are.


Hopefully you can see how you can take the beginnings of a simple idea, and quite quickly build up a query which will give you powerful results, and a good insight into what your Enterprise Vault server is doing.

Follow On

Hopefully looking at the IIS logs on my Enterprise Vault server will give you some ideas of what data you want to get out of the log files in your own environment.  There is a whole series of things that you can do beyond this; charts, HTML output, and automatic daily or weekly emailing of the results are all further extensions to these ideas.


Last update: 05-09-2011 02:58 AM