10 June 2005
As a webmaster, it's interesting to know what people are actually coming to your site for. I've tried a couple of web log analysis programs in the past, but these have either been too complex to set up (e.g. first put your log files into a database and then pull the analysis from there) or lacking in features (e.g. where a script uses a session key, every single session / page gets listed as an individual entry).
I've therefore written my own Perl (plus a little PHP) based log analysis tool that reads in the Apache access and error log files and builds a summary of information useful to webmasters. It is fairly simple to install and does not require any changes to the log files (unless you have virtual hosts you want to separate into different reports). If provided with details of the scripts / CGI programs on your website, it will remove any session information to give you a more useful interpretation of what traffic is coming to your websites. It also filters out robots and gives an idea of which sites are sending visitors to your site. It's provided as open source software under the GPL: you can use the software for free and have access to the source code, but please excuse some of the coding techniques used, as it was created as a quick solution (or, even better, let me have your suggestions on how it could work better). It can be configured to run automatically and allows you to view the reports through any web browser.
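To give an idea of the kind of processing involved (the tool itself is written in Perl; this is just an illustrative sketch in Python, and the session-parameter and robot lists are hypothetical examples, not the tool's actual configuration), here is how an Apache combined-format log line can be parsed, with session keys stripped from URLs and robot traffic filtered out:

```python
import re

# Apache "combined" log format, e.g.:
# 1.2.3.4 - - [10/Jun/2005:12:00:00 +0000] "GET /p.php?sid=abc HTTP/1.1" 200 512 "http://ref/" "Mozilla"
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d+) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Hypothetical examples -- a real setup would configure its own lists.
SESSION_PARAMS = {"PHPSESSID", "sid", "sessionid"}
ROBOT_HINTS = ("googlebot", "msnbot", "slurp", "crawler", "spider")

def strip_session(url):
    """Remove known session parameters from the query string."""
    if "?" not in url:
        return url
    path, _, query = url.partition("?")
    kept = [p for p in query.split("&")
            if p.split("=", 1)[0] not in SESSION_PARAMS]
    return path + ("?" + "&".join(kept) if kept else "")

def parse_line(line):
    """Return a dict of fields, or None for unparsable or robot traffic."""
    m = LOG_RE.match(line)
    if m is None:
        return None
    rec = m.groupdict()
    if any(h in rec["agent"].lower() for h in ROBOT_HINTS):
        return None  # filter out robots before counting traffic
    rec["url"] = strip_session(rec["url"])
    return rec
```

With session keys removed, requests for the same page collapse into a single entry, and the referrer field gives the "who is sending visitors" summary mentioned above.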
Whilst designed and tested for Linux, it will also run on any other Unix operating system, and it should even work for web servers running on Windows as long as Perl (and optionally PHP) is installed.