From: Poulos, Lou (lpoulos~AT~tcunet.com)
Date: Wed Dec 31 2003 - 20:33:53 CET
I'm looking for some tips on how to set this thing up and forget it. I'm having trouble getting my mind around the best way...
I've got a machine that does syslogging for a PIX firewall. I rotate the logs daily, gzipping them. I've got another machine that runs a webserver and fwanalog. I have an expect script that downloads yesterday's gzipped log from the syslog server, changing the name to [yesterday's date].log.gz.
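For reference, the nightly fetch boils down to something like this (I'm showing plain scp instead of my expect script just to keep it short, and the host/path names here are placeholders, not my real setup):

```shell
#!/bin/sh
# Nightly fetch sketch: pull yesterday's gzipped PIX syslog from the
# syslog box and store it locally as YYYY-MM-DD.log.gz.
# SYSLOG_HOST and REMOTE_LOG are placeholders.

yesterday() {
    # GNU date syntax; on BSD it would be: date -v-1d +%Y-%m-%d
    date -d yesterday +%Y-%m-%d
}

fetch_log() {
    host=$1      # e.g. syslog.example.com
    src=$2       # e.g. /var/log/pix/yesterday.log.gz
    dstdir=$3    # local directory fwanalog reads from
    scp "$host:$src" "$dstdir/$(yesterday).log.gz"
}
```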
Do I let the files build up in one directory, run fwanalog against them with inputfiles_mask set to *log.gz and mtime = "1", and just trim fwanalog.all.log every once in a while to keep its growth under control?
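If I go the single-directory route, the housekeeping I have in mind would look roughly like this (the paths and retention numbers are made up for illustration, not anything fwanalog prescribes):

```shell
#!/bin/sh
# Housekeeping sketch for the single-directory approach.
# LOGDIR, KEEP_DAYS and KEEP_LINES are assumptions, not fwanalog defaults.
LOGDIR=${LOGDIR:-/var/log/pix}
KEEP_DAYS=30
KEEP_LINES=100000

prune_logs() {
    # Drop daily gzipped archives older than KEEP_DAYS days.
    find "$LOGDIR" -name '*.log.gz' -mtime +"$KEEP_DAYS" -delete
}

trim_all_log() {
    # Keep only the newest KEEP_LINES lines of fwanalog.all.log.
    tail -n "$KEEP_LINES" "$LOGDIR/fwanalog.all.log" > "$LOGDIR/fwanalog.all.log.tmp" &&
    mv "$LOGDIR/fwanalog.all.log.tmp" "$LOGDIR/fwanalog.all.log"
}
```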
Or should I create a directory for each day and just let fwanalog create the report in there?
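The per-day-directory alternative would be something like this (REPORTROOT is a placeholder, and I've left the actual fwanalog run as a comment since its options live in fwanalog.opts rather than on the command line):

```shell
#!/bin/sh
# Per-day report sketch: make one directory per day and generate the
# report inside it. REPORTROOT and the fwanalog path are placeholders.
REPORTROOT=${REPORTROOT:-/var/www/fwreports}

make_daily_dir() {
    day=$(date -d yesterday +%Y-%m-%d)   # GNU date syntax
    dir="$REPORTROOT/$day"
    mkdir -p "$dir"
    # cp "$day.log.gz" "$dir"/ && (cd "$dir" && /path/to/fwanalog.sh)
    echo "$dir"
}
```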
The reason I'm asking is that I'm confused about how to get weekly reports, I guess. If I tell it to report against all the log files I collect in a directory, then it will process day 1 the first day, days 1 and 2 the next day, days 1, 2, and 3 the day after that, and so on, won't it? That would create lots of duplicate lines. Is this where I need to use the -t option each day so it only adds that day's data?
Man, I've been messing with this for so long I can't see the forest for the trees anymore. Anyone have any tips on how to set up an automated, ongoing, cumulative reporting routine?
This archive was generated by hypermail 2.1.5 : Sun Jan 11 2004 - 07:42:04 CET