AWStats
For further information see the AWStats website - http://awstats.sourceforge.net/
Installation
From Repository
- Install the package
apt-get install awstats
- Edit the generic template config file if required
/etc/awstats/awstats.conf
- Create an apache config file for the site with the contents shown below
- eg
/etc/apache2/sites-enabled/awstats
Alias /awstatsclasses "/usr/share/awstats/lib/"
Alias /awstats-icon/ "/usr/share/awstats/icon/"
Alias /awstatscss "/usr/share/doc/awstats/examples/css"
ScriptAlias /awstats/ /usr/lib/cgi-bin/
<Directory /usr/lib/cgi-bin/>
    Options ExecCGI -MultiViews +SymLinksIfOwnerMatch
    Order allow,deny
    Allow from all
</Directory>
<Directory /usr/share/awstats/>
    Order allow,deny
    Allow from all
</Directory>
- Restart apache
service apache2 restart
- Site should now be available via a URL similar to
http://yourserver/awstats/awstats.pl?config=mysite.com
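Note that on Apache 2.4 and later the Order/Allow access-control lines above are replaced by Require directives. A minimal sketch of the equivalent, assuming the same paths as the example above...
<Directory /usr/lib/cgi-bin/>
    Options ExecCGI -MultiViews +SymLinksIfOwnerMatch
    Require all granted
</Directory>
<Directory /usr/share/awstats/>
    Require all granted
</Directory>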
From Download
- Get the latest version from http://awstats.sourceforge.net/#DOWNLOAD
- Uncompress
- EG
tar -xzvf awstats-7.2.tar.gz
- If upgrading from a previous version, backup the old version
- EG
mv /usr/local/awstats/ /usr/local/awstats-v7.0
- Copy the (new) software in
- EG
mv /home/user/awstats-7.2 /usr/local/awstats/
- Ensure the web folder is executable
- EG
chmod -R a+rx awstats
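The download also ships a setup helper in its tools directory that can generate the Apache snippet and a first site config interactively; a hedged example, assuming the /usr/local/awstats install path used above...
perl /usr/local/awstats/tools/awstats_configure.pl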
Add a Web Site
- Create a specific config file for the site to monitor
cp /etc/awstats/awstats.conf /etc/awstats/awstats.mysite.com.conf
- Edit the config file for the site, specifically (see below for further options)
LogFile="/path/to/your/domain/access.log"
LogFormat=1 (this will give you more detailed stats)
SiteDomain="mysite.com"
HostAliases="www.mysite.com localhost 127.0.0.1" (example for a local site)
- Perform an initial stats gather for the site
/usr/lib/cgi-bin/awstats.pl -config=mysite.com -update
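To confirm the run produced data, you can look in AWStats' data directory for the month's file; a sketch, assuming the Debian default DirData of /var/lib/awstats (data files are named awstatsMMYYYY.configname.txt)...
ls /var/lib/awstats/ | grep mysite.com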
- Test that you can see some stats, using a URL similar to
http://yourserver/awstats/awstats.pl?config=mysite.com
- Add a scheduled job to crontab to update automatically
crontab -e
- EG every 30 mins
*/30 * * * * /bin/perl /usr/lib/cgi-bin/awstats.pl -config=mysite.com -update >/dev/null
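If the scheduled update doesn't appear to run, a variant that appends its output to a log file makes troubleshooting easier (the log path and schedule are purely illustrative)...
*/30 * * * * /bin/perl /usr/lib/cgi-bin/awstats.pl -config=mysite.com -update >> /var/log/awstats-update.log 2>&1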
Further options
- Wiki sites (and other sites where a URL parameter specifies a particular page)
URLWithQuery=1
- useful for wikis etc where a query parameter indicates a different page
URLWithQueryWithOnlyFollowingParameters="title"
- only treats variances in the "title" parameter as distinct pages
URLReferrerWithQuery=1
- follows on from the two options above
IIS Servers
By default, IIS logs do not contain enough fields. Make sure all of the following are included (several of them are left out by the default field selection)...
- date
- time
- c-ip
- cs-username
- cs-method
- cs-uri-stem
- cs-uri-query
- sc-status
- sc-bytes
- cs-version
- cs(User-Agent)
- cs(Referer)
To change the above in IIS7, in IIS Manager
- Highlight the webserver (not a web site)
- Double-click the Logging feature
- Click on the Select Fields button
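Then point the site's AWStats config at the IIS log. A minimal sketch, assuming the standard IIS log location (the dated filename is just an example) and that AWStats' predefined IIS/W3C format, LogFormat=2, matches the field order you selected...
LogFile="C:/inetpub/logs/LogFiles/W3SVC1/u_ex131001.log"
LogFormat=2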
One-off Update
To perform a one-off update from a specific log file...
/usr/lib/cgi-bin/awstats.pl -config=server -LogFile=access.log -update
- Updates can only be added in chronological order, therefore you may need to delete the data file for a particular month, and rebuild it entirely.
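A sketch of that month rebuild, assuming the Debian default data directory /var/lib/awstats (data files are named awstatsMMYYYY.configname.txt) and an archived log covering the whole month...
rm /var/lib/awstats/awstats102013.mysite.com.txt
/usr/lib/cgi-bin/awstats.pl -config=mysite.com -LogFile=/var/log/apache2/archive/access-oct2013.log -update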
Scheduled updates are configured in /etc/cron.d/awstats
Add New Robots to Detection
Locate robots.pm, likely to be in one of the following locations, create a backup and then edit...
/usr/share/awstats/lib/robots.pm
/usr/local/awstats/wwwroot/cgi-bin/lib/robots.pm
Robots need to be added to two arrays, RobotsSearchIDOrder_listx and RobotsHashIDLib. RobotsSearchIDOrder_listx is just an index, so if adding a number of new robots, create your listings for RobotsHashIDLib first, then take a copy of them and delete the extraneous bits to make the RobotsSearchIDOrder_listx entries.
Add new entries to the end of the arrays, just above the catch-alls.
So for a robot user agent string such as...
(compatible; proximic; +http://www.proximic.com/info/spider.php)
Create an entry such as this for RobotsHashIDLib
'proximic','<a href="http://www.proximic.com/info/spider.php" title="Bot home page [new window]" target="_blank">Proximic</a>',
and an entry such as this for RobotsSearchIDOrder_listx
'proximic',
Note that robots will only be detected on new stats runs, so to update historical stats, delete the history files and recreate from archived logs (if you have available).
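Since robots.pm is Perl source, a quick compile check after editing will catch a stray quote or missing comma before AWStats next runs (adjust the path to wherever your robots.pm lives)...
perl -c /usr/share/awstats/lib/robots.pm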
Errors
If you get an error similar to the following after updating the robots file...
Error: Not same number of records of RobotsSearchIDOrder_listx (total is 789 entries) and RobotsHashIDLib (787 entries without 'unknown') in Robots database.
You've probably added a duplicate robot entry, which will need to be removed or merged with the existing entry - search your backup robots.pm file for each robot that you're trying to add.
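One way to spot the duplicate is to list every line mentioning the ID you just added (using 'proximic' and the Debian path purely as examples)...
grep -n "'proximic'" /usr/share/awstats/lib/robots.pm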