Monday, April 13, 2015

SharePoint 2013: Using Log Parser to develop usage metrics

Introduction

Microsoft Log Parser provides the SharePoint administrator with significantly greater usage-analysis capability than SharePoint 2010 offered natively, and far more than SharePoint 2013 offers, whose built-in analytics are minimal and problematic.  This posting captures the steps undertaken in analyzing simple website usage metrics using Microsoft Log Parser.  Specifically, it captures the steps for generating usage graphs based upon IIS data, including:
  • Daily Total User Requests (Page Hits)
Procedure
  1. Collate farm configuration information
    1. Identify all farm web servers
      • For this farm, there are two: WFE1 and WFE2.
    2. Identify IIS log settings on all web servers
      • IIS Manager > Select web application > Double-click Logging
      • Schedule: Daily
      • Format: W3C
    3. Identify IIS log folders on all web servers
      • By default, these are here: C:\inetpub\logs\LogFiles.
      • Each web application will have its own folder, distinguished by unique IDs. These IDs among the farm web servers will match: the name of a web application log folder on WFE1 will be the same as for that same web application on WFE2. For example, given a two-WFE farm, having two web applications (one for My Sites and one for the regular sites), the folder on both web servers for the My Sites logging might be W3SVC1128939604 and for the standard sites W3SVC1509962310. Note the prefix for each of these: W3SVC.
    4. Identify the following service accounts:
      1. Farm service account (eg, spFarm)
      2. Setup User Administrator (eg, spAdmin)
      3. Search content crawl (eg, spContent, spCrawl or spSearch)
        Note down these accounts, both in their claims-aware format (eg, 0#.w|DOMAIN\Account) and their standard AD format (eg, DOMAIN\Account).  Both forms may appear in the IIS logs and both would need to be filtered out so as to focus on actual user metrics.
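        Note: Once the logs have been copied to the analysis workstation (step 2 below), a quick Log Parser query such as the following can confirm which account forms actually appear in your logs (the output file name here is just an example):
          logparser -i:IISW3C -o:CSV "select distinct cs-username from *.log" -q >usernames.csv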
  2. Prepare Analysis Workstation
    1. Copy the Log Parser 2.2 binaries to some folder.
    2. Add the Log Parser folder to the DOS PATH environment variable.
    3. Create Analysis Folder on local workstation
      1. Off root drive to keep path simple.
      2. Create dump subfolders for each web server, WFE1 and WFE2: these are needed for storing the log files to be analyzed.  The log file names are not unique per server: the naming used by, say, WFE1 will be the same as the naming used by WFE2.  Thus, if you co-mingle these files, they will overwrite each other.
    4. Copy IIS logs to analysis folder
      1. Copy logs from WFE1 to the WFE1 subfolder and append WFE1 to each log file name.  Repeat for WFE2.  These two folders will eventually be used to analyze NLB.  (A PowerShell sketch of this copy-and-rename work appears after step 5's note below.)
      2. Create another folder under Analysis and copy ALL renamed logs into this single folder.  This will be used for bulk analysis.
    5. Break out logs by year and month
      1. Within the Analysis folder, create a folder structure by year, with subfolders for each month.
      2. Copy IIS logs to appropriate folders: this is a third copy of the logs. 
        Note: I've organized the log files in this manner so as to facilitate analyses using Log Parser.  It was the most rapid way I could think of at the time for building the metrics that I wanted by month.
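        Note: The copying, renaming, and month break-out in steps 4 and 5 can be scripted rather than done by hand.  The following PowerShell is just a sketch of one way to do it; the D:\Analysis paths are assumptions, and it assumes the default daily IIS log file naming (u_exYYMMDD.log):
          # Append the source server's folder name (WFE1 or WFE2) to each copied log file name
          Get-ChildItem 'D:\Analysis\WFE1','D:\Analysis\WFE2' -Filter *.log |
              Rename-Item -NewName { '{0}_{1}{2}' -f $_.BaseName, $_.Directory.Name, $_.Extension }
          # Pool all renamed logs into a single folder for bulk analysis
          New-Item -ItemType Directory -Path 'D:\Analysis\All' -Force | Out-Null
          Get-ChildItem 'D:\Analysis\WFE1','D:\Analysis\WFE2' -Filter *.log |
              Copy-Item -Destination 'D:\Analysis\All'
          # Break the pooled logs out into year\month folders using the date embedded in the file name
          Get-ChildItem 'D:\Analysis\All' -Filter *.log | ForEach-Object {
              if ($_.BaseName -match 'u_ex(\d{2})(\d{2})') {
                  $dest = 'D:\Analysis\20{0}\{1}' -f $Matches[1], $Matches[2]
                  New-Item -ItemType Directory -Path $dest -Force | Out-Null
                  Copy-Item -Path $_.FullName -Destination $dest
              }
          }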
  3. Build Daily Total User Requests Data
    1. Build site list having these fields:
      1. Field: Date, Type: Date (only, standard)
      2. Field: Requests, Type: Number (0)
      3. Field: DailyUniqueUsers, Type: Number (0)
      4. Field: MonthlyUniqueUsers, Type: Number(0)
      5. Field: YearlyUniqueUsers, Type: Number(0)
    2. Open a command shell
    3. Navigate to a folder containing all of the log files for a month (and containing the log files from all WFEs for that month).
    4. Run these queries, in this order:
      1. logparser -i:IISW3C -o:CSV "select count(*) as ct,cs-username,date from *.log where sc-status<>401 AND cs-username<>'0#.w|DOMAIN\spContent' AND cs-username<>'0#.w|DOMAIN\spAdmin' AND cs-username<>'' AND cs-username<>'DOMAIN\spContent' AND cs-username<>'DOMAIN\spAdmin' group by cs-username,date order by date" -q >dateout.csv
        Note: Tailor the where clause to filter out service accounts that generate a lot of hits but no meaningful usage data from a user perspective.  The same goes for the setup user administrator account, which the farm administrator uses to administer configuration.

        Note: Some users, who leave a browser connected to SharePoint for long periods, may generate anomalously large hit counts. For example, on one day during the entire month, one user generated 547,233 hits, almost all of them to [site]/_vti_bin/sites.asmx.  Filtering out such results reduced the total hit count for that day to a level consistent with previous results.
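        Note: One way such a filter might be expressed (this is just one approach; tailor the match to the offending requests) is to extend the where clause with an endpoint exclusion, for example:
          AND cs-uri-stem NOT LIKE '%/_vti_bin/sites.asmx'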
      2. logparser -i:CSV -o:csv "select date,sum(ct) as TOTALREQUESTS,count(*) as UNIQUEUSERS from dateout.csv group by date" -q > dailycountsdateo.csv
    5. Copy contents of dailycountsdateo.csv to site list.
    6. Repeat for each month until all desired months have been analyzed.
    7. Once you have a list of dates and daily hit totals, present the data graphically via an Image web part.
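      One possible way to generate the chart image itself (just a sketch; it uses Log Parser's CHART output format, which requires the Office Web Components to be installed, and it assumes the header row was preserved in dailycountsdateo.csv):
        logparser -i:CSV -o:CHART -chartType:Line -chartTitle:"Daily Total User Requests" "select date,TOTALREQUESTS into DailyRequests.gif from dailycountsdateo.csv order by date"
      The resulting image file can then be uploaded to a library and referenced by the Image web part.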
Notes
  • Claims-based authentication complications: I found that some users appeared in the IIS logs both under their claims-based identity (eg, 0#.w|DOMAIN\Adam.Smith) and under their standard AD identity (eg, DOMAIN\Adam.Smith).  I don't know why this is.  This complicated determining unique user counts via Log Parser: a single user would appear to be two different users, since Log Parser compares characters, not usernames.  To work around this, I performed these steps:
    1. Used LogParser to generate a listing of all users. 
    2. In Excel, removed the prefixes to reduce the listing to just usernames (ie, removed 0#.w|DOMAIN\ and DOMAIN\).
    3. Used Excel advanced functionality to find unique entries.
    4. Counted unique entries.
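    An alternative to the Excel steps would be a short PowerShell pass over the Log Parser output.  This is only a sketch; it assumes the user listing was written to a CSV named users.csv containing a cs-username column:
      # Strip the 0#.w|DOMAIN\ or DOMAIN\ prefix, keep just the account name, and count unique names
      Import-Csv .\users.csv |
          ForEach-Object { ($_.'cs-username' -split '\\')[-1] } |
          Sort-Object -Unique |
          Measure-Object
    The Count property of the Measure-Object result gives the number of unique users.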
  • Use "NULL" when filtering for values in the IIS web log that are blank. For example, to filter out all entries having username = NULL, add this statement to the WHERE clause:
    cs-username<>NULL 
  • How to write output to a file.  There are two ways to have Log Parser output written to a file.  The first involves the "into" clause.  For example, if you had a Log Parser statement that you wanted to execute, like this one:
    logparser -i:IISW3C "select count(*) as ct,EXTRACT_TOKEN(cs(user-agent),0,'/') as agent from *.log group by agent order by ct desc"
    you would do this:
    logparser -i:IISW3C "select count(*) as ct,EXTRACT_TOKEN(cs(user-agent),0,'/') as agent into MyFile.txt from *.log group by agent order by ct desc"
    Note the placement: just before the from clause.  Using the into method writes the output to file exactly as you see it in the command shell.  The other way is to use the "-q" option, like so:
    logparser -i:IISW3C "select count(*) as ct,EXTRACT_TOKEN(cs(user-agent),0,'/') as agent from *.log group by agent order by ct desc" -q > MyFile.txt
    Note that it comes after the closing quotation mark.  Using the -q method writes a clean columnar output to the file but without any headers. 
  • Renaming files in bulk:
    Get-ChildItem -Filter "*old*" -Recurse | Rename-Item -NewName { $_.Name -replace 'old','new' }
