
For further information, see the AWStats website.


From Repository

  1. Install the package
    • apt-get install awstats
  2. Edit the generic template config file if required
    • /etc/awstats/awstats.conf
  3. Create an Apache config file for the site with the contents shown below
    • e.g. /etc/apache2/sites-enabled/awstats
  4. Restart apache
    • service apache2 restart
  5. The site should now be available via a URL similar to
Alias /awstatsclasses "/usr/share/awstats/lib/"
Alias /awstats-icon/ "/usr/share/awstats/icon/"
Alias /awstatscss "/usr/share/doc/awstats/examples/css"
ScriptAlias /awstats/ /usr/lib/cgi-bin/

<Directory /usr/lib/cgi-bin/>
        Options ExecCGI -MultiViews +SymLinksIfOwnerMatch
        Order allow,deny
        Allow from all
</Directory>

<Directory /usr/share/awstats/>
        Order allow,deny
        Allow from all
</Directory>
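
The Order/Allow directives above are Apache 2.2 syntax; on Apache 2.4 and later, access control is done with Require instead. A sketch of the equivalent blocks, using the same paths as above:

```apache
<Directory /usr/lib/cgi-bin/>
        Options ExecCGI -MultiViews +SymLinksIfOwnerMatch
        Require all granted
</Directory>

<Directory /usr/share/awstats/>
        Require all granted
</Directory>
```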

From Download

  1. Get the latest version from
  2. Uncompress
    • e.g. tar -xzvf awstats-7.2.tar.gz
  3. If upgrading from a previous version, back up the old version
    • e.g. mv /usr/local/awstats/ /usr/local/awstats-v7.0
  4. Copy the new software in
    • e.g. mv /home/user/awstats-7.2 /usr/local/awstats/
  5. Ensure the web folder is executable
    • e.g. chmod -R a+rx awstats
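
The upgrade sequence above can be sketched end to end. This version uses a throwaway directory so it is safe to run; in a real upgrade the base would be /usr/local and the version numbers would match your download:

```shell
# Simulate the upgrade: back up the old tree, move the new one in,
# and make it world-readable/executable for the web server.
# (Throwaway directory; substitute /usr/local and real versions.)
base=$(mktemp -d)
mkdir -p "$base/awstats"        # stands in for the existing install
mkdir -p "$base/awstats-7.2"    # stands in for the unpacked tarball
mv "$base/awstats" "$base/awstats-v7.0"   # step 3: back up the old version
mv "$base/awstats-7.2" "$base/awstats"    # step 4: drop the new version in place
chmod -R a+rx "$base/awstats"             # step 5: readable and executable
ls "$base"
```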

Add a Web Site

  1. Create a specific config file for the site to monitor
    • cp /etc/awstats/awstats.conf /etc/awstats/
  2. Edit the config file for the site, specifically (see below for further options)
    • LogFile="/path/to/your/domain/access.log"
    • LogFormat=1 (this will give you more detailed stats)
    • SiteDomain=""
    • HostAliases="localhost" (example for a local site)
  3. Perform an initial stats gather for the site
    • /usr/lib/cgi-bin/ -update
  4. Test that you can see some stats, using a URL similar to
  5. Add a scheduled job to crontab to update automatically
    • crontab -e
    • e.g. every 30 mins: */30 * * * * /usr/bin/perl /usr/lib/cgi-bin/ -update >/dev/null
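
Putting step 2 together, a per-site config might contain values like these (domain, log path, and aliases are illustrative):

```ini
LogFile="/var/log/apache2/example.com-access.log"
LogFormat=1
SiteDomain="example.com"
HostAliases="www.example.com localhost 127.0.0.1"
```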

Further options

  • Wiki sites (and other sites where a URL parameter can specify a specific page)
    • URLWithQuery=1 - useful for wikis etc. where a query parameter indicates a different page
    • URLWithQueryWithOnlyFollowingParameters="title" - only treats variances in the title parameter as distinct pages
    • URLReferrerWithQuery=1 - follows on from the two above
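
For a MediaWiki-style site the three options work together: with the settings below, /index.php?title=Main_Page and /index.php?title=Help count as distinct pages, while variations in any other query parameter are ignored. A sketch:

```ini
URLWithQuery=1
URLWithQueryWithOnlyFollowingParameters="title"
URLReferrerWithQuery=1
```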

IIS Servers

By default, IIS server logs do not contain enough fields. Make sure all of the following are included (several of them are commonly left out):

  • date
  • time
  • c-ip
  • cs-username
  • cs-method
  • cs-uri-stem
  • cs-uri-query
  • sc-status
  • sc-bytes
  • cs-version
  • cs(User-Agent)
  • cs(Referer)
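
With the fields above enabled, the log's #Fields header and a matching AWStats LogFormat might look like this. The tag-for-field mapping is a sketch; check it against your actual header line, since AWStats reads the fields positionally:

```ini
#Fields: date time c-ip cs-username cs-method cs-uri-stem cs-uri-query sc-status sc-bytes cs-version cs(User-Agent) cs(Referer)

LogFormat="%time2 %host %logname %method %url %query %code %bytesd %other %ua %referer"
```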

To change the above in IIS7, in IIS Manager

  1. Highlight the webserver (not a web site)
  2. Double-click the Logging feature
  3. Click on the Select Fields button

One-off Update

To perform a one-off update from a specific log file...

  • /usr/lib/cgi-bin/ -config=server -LogFile=access.log
    • Updates can only be added in chronological order, so you may need to delete the data file for a particular month and rebuild it entirely.
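
Because of the chronological-order rule, a month is rebuilt by removing its data file and replaying the archived logs oldest first. A sketch, assuming the standard awstats.pl script name and date-stamped archive names (the commands are only echoed here, and all paths are illustrative):

```shell
# Build the replay plan: date-stamped log names sort chronologically,
# so a plain glob expansion already yields oldest-first order.
tmp=$(mktemp -d)
touch "$tmp"/access.log-20240103 "$tmp"/access.log-20240101 "$tmp"/access.log-20240102
# Real run: first delete the month's data file, e.g.
#   rm /var/lib/awstats/awstats012024.server.txt
plan=""
for log in "$tmp"/access.log-*; do    # glob expands in sorted order
    cmd="/usr/lib/cgi-bin/awstats.pl -config=server -LogFile=$log -update"
    echo "$cmd"                       # echoed only; run for real once happy
    plan="$plan $cmd"
done
```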

Scheduled updates are configured in /etc/cron.d/awstats

Add New Robots to Detection

Locate the robots library file, likely to be in one of the following locations, create a backup, and then edit it...

  • /usr/share/awstats/lib/
  • /usr/local/awstats/wwwroot/cgi-bin/lib/

Robots need to be added to two arrays, RobotsSearchIDOrder_listx and RobotsHashIDLib. RobotsSearchIDOrder_listx is just an index, so if adding a number of new robots, create your listings for RobotsHashIDLib, then create a copy and delete the extraneous bits for RobotsSearchIDOrder_listx.

Add new entries to the end of the arrays, just above the catch-alls.

So for a robot user agent string such as...

(compatible; proximic; +

Create an entry such as this for RobotsHashIDLib

'proximic','<a href="" title="Bot home page [new window]" target="_blank">Proximic</a>',

and an entry such as this for RobotsSearchIDOrder_listx

'proximic',
Note that robots will only be detected on new stats runs, so to update historical stats, delete the history files and recreate them from archived logs (if available).


If you get an error similar to the following after updating the robots file...

  • Error: Not same number of records of RobotsSearchIDOrder_listx (total is 789 entries) and RobotsHashIDLib (787 entries without 'unknown') in Robots database.

You've probably added a duplicate robot entry, which will need to be removed or merged with the existing entry - search your backup file for the robots you're trying to add.
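
The count mismatch can be caught before AWStats complains. A sketch of a quick consistency check: it builds a tiny sample file in the same shape as the robots arrays and compares the number of IDs in each with grep (the real file is much larger, and the path is wherever you edited):

```shell
# Sample fragment shaped like the real robots file: an index array of
# IDs, and a hash with an ID plus a display string per robot.
sample=$(mktemp)
cat > "$sample" <<'EOF'
@RobotsSearchIDOrder_listx = (
'proximic',
'examplebot',
);
%RobotsHashIDLib = (
'proximic','<a href="...">Proximic</a>',
'examplebot','<a href="...">ExampleBot</a>',
);
EOF
# Count the quoted entries inside each array; the two totals must match.
index_count=$(awk '/RobotsSearchIDOrder_listx/,/^\);/' "$sample" | grep -c "^'")
lib_count=$(awk '/RobotsHashIDLib/,/^\);/' "$sample" | grep -c "^'")
echo "index=$index_count lib=$lib_count"
[ "$index_count" -eq "$lib_count" ] && echo "counts match" || echo "MISMATCH"
```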