
CVS Commit History:


   2013-12-02 09:08:01 by Adam Ciarcinski | Files touched by this commit (2)
Log message:
Changes 5.3:
This release fixes several bugs and adds two new pie charts about the most used
top and second level domains. It is also possible to do DNS lookups of IP
addresses inside SquidAnalyzer; see the new UseClientDNSName configuration
directive. This can dramatically slow down squid-analyzer performance, but you
can adjust the DNS lookup timeout to avoid waiting on slow DNS servers; see the
new DNSLookupTimeout configuration directive.

- Update and fix first and second top level domain names.
- Add new directive DNSLookupTimeout to change the default timeout for
  DNS lookups. Add a 0.0001 second timeout when SquidAnalyzer looks for a
  DNS name and can't find a name server.
- Add a pie chart of top second level domains.
- Fix some HTML tag issues and table ordering on Top domain hits and Top
  url hits.
- Update the INSTALL file to remove the GD::Graph requirement.
- Change the underscore used to replace spaces in user names to the special
  string _SPC_ so that underscores are not wrongly replaced in HTML
  output.
- Fix pt_BR translation: set the charset to UTF-8 and correct the
  accentuation of a few words.
- Allow IP addresses in user names to be replaced by their DNS names; this
  feature is activated by a new directive: UseClientDNSName.
- Add missing description of the --no-year-stat option to the documentation
  and squid-analyzer usage.

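The two directives named above can be combined in the SquidAnalyzer
configuration file. A minimal sketch, assuming the conventional file path
/etc/squidanalyzer/squidanalyzer.conf and the release's documented values
(the timeout value is the example default mentioned in the log message):

    # /etc/squidanalyzer/squidanalyzer.conf (path is an assumption)
    # Resolve client IP addresses to DNS names in reports
    UseClientDNSName 1
    # Give up on a DNS lookup after 0.0001 seconds to avoid
    # stalling on slow or unreachable name servers
    DNSLookupTimeout 0.0001

Enabling UseClientDNSName without a short DNSLookupTimeout can slow report
generation considerably, as the log message warns.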
   2013-09-03 12:08:26 by Adam Ciarcinski | Files touched by this commit (5)
Log message:
Squid proxy native log analyser and report generator with full statistics
about times, hits, bytes, users, networks, top URLs, and top domains. The
statistics reports are oriented toward user and bandwidth control; this is
not a pure cache statistics generator.

SquidAnalyzer uses flat files to store data and doesn't need any SQL,
SQLite, or Berkeley DB database.

This log analyzer is incremental and should be run from a daily cron job,
or more often on networks with huge traffic.
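The incremental runs described above are typically driven from cron. A
sketch of a crontab entry, assuming the script is installed as
/usr/local/bin/squid-analyzer (pkgsrc may place it elsewhere):

    # Parse new Squid log entries once a day at 02:00
    # (install path is an assumption; adjust to your packaging)
    0 2 * * *    /usr/local/bin/squid-analyzer

Because the analyzer is incremental, each run only processes log entries
added since the previous run, so running it more frequently on busy
networks keeps individual runs short.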
