This is log_acquire: how the clients collect logs and transfer them to the
logging server.

=============================================================
Client: Requirements
-------------------------------------------------------------
1. Generate logs that represent comprehensive monitoring of the client
   machine.
   A. Logs generated by the 'syslog' facility
   B. Logs generated by tcpd (TCP wrappers)
   C. Logs generated (where applicable) from DNS (named) activity
   D. Logs cut from wtmp, utmp and the 'w' utility
   E. (Future) Logs generated by process accounting
   F. (Future) Other...

2. Collect the logs into a central repository.
   -- This is accomplished by configuring the various logging facilities to
      write their respective logs to a central repository; in other words,
      all logs end up in a single directory.

3. Transfer the collected logs to the central logging server on a regular
   basis.  Initially, this is every 4 hours.
   -- This is accomplished by running 'archive_logs' as a cron job every
      4 hours (a sample crontab entry appears at the end of this document).

4. Network access should be timed to avoid congestion.
   -- Cron is configured to satisfy this.

-------------------------------------------------------------
Client: Directory structure:
(The same directory structure exists on the server.  Essentially, the server
will use NFS to connect to itself.)

/var/log/log_export:    Temporary directory holding cut logs prior to
                        transfer to the logging server.
/var/log/log_srvr_mnt:  The mount point for the logging server.

Note: There should be a 'README' in each directory to discourage deleting
the directory accidentally.

=============================================================
Server: Directory structure: Log Transfer
-------------------------------------------------------------
(The following directories exist on the server in addition to the above.)

/var/log/net_logs_dump  (There is a directory under this one for each
                        client machine.)

Note: There should be a 'README' in each directory to discourage deleting
the directory accidentally.

*************************************************************
*
*   Server-Side log processing
*
*************************************************************
Requirements:

1. Be able to deal with compressed files as well as normal files.

2. Filter the logs for duplicate entries and retain only one copy of each.

3. Collate the logs so that each file contains a single log covering a
   fixed length of time.  This may be accomplished using sed/awk/perl or by
   using the DBMS to hold lines for comparison.

4. Create a file of log information formatted suitably for importing into a
   MySQL database through a suitable interface.  This may be accomplished
   using sed/awk/perl or by using the DBMS to hold lines for comparison.

5. Import the log data into the MySQL DBMS.
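=============================================================
Example sketches (illustrative only)
-------------------------------------------------------------
The shell fragment below is a minimal sketch of server-side requirements
1-3: it merges one client's dumped files (reading gzip'd and plain files
alike), drops duplicate lines, and splits the result into one file per
calendar day.  The client directory, output paths and per-day naming scheme
are assumptions made for illustration; note also that sorting is merely a
convenient way to remove duplicates and does not by itself give
chronological order.

    #!/bin/sh
    # Sketch: merge, de-duplicate and collate one client's logs (reqs 1-3).
    CLIENT_DIR=/var/log/net_logs_dump/clientA   # hypothetical client directory
    MERGED=/tmp/clientA.merged                  # hypothetical merged output
    SPLIT_DIR=/tmp/clientA.by_day               # hypothetical per-day directory

    mkdir -p "$SPLIT_DIR"

    # Requirements 1 and 2: read compressed or plain files, keep one copy
    # of each duplicate line.
    for f in "$CLIENT_DIR"/*; do
        [ "$(basename "$f")" = "README" ] && continue   # skip the marker README
        case "$f" in
            *.gz) zcat "$f" ;;                  # compressed log
            *)    cat  "$f" ;;                  # plain log
        esac
    done | sort -u > "$MERGED"

    # Requirement 3 (simplified): syslog-style lines begin with "Mon DD ...",
    # so collate into one file per day by grouping on the first two fields.
    awk -v dir="$SPLIT_DIR" '{ print > (dir "/" $1 "_" $2) }' "$MERGED"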
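For requirements 4 and 5, one possible approach, sketched below under the
same assumptions, is to reformat the de-duplicated lines into a
tab-separated extract with awk and then pull that extract into MySQL with
LOAD DATA.  The database 'logdb', the table 'log_lines' and its columns,
and the file paths are invented for the example; credentials are assumed to
come from the invoking user's ~/.my.cnf, and LOCAL loading must be
permitted by both the MySQL client and server.

    #!/bin/sh
    # Sketch: format merged log lines for MySQL (req 4) and load them (req 5).
    MERGED=/tmp/clientA.merged                  # output of the previous step
    EXTRACT=/tmp/clientA.tsv                    # hypothetical TSV extract

    # Turn standard syslog lines ("Oct 11 22:14:15 host prog[pid]: message")
    # into "host <TAB> timestamp <TAB> message" records.
    awk '{
        ts   = $1 " " $2 " " $3
        host = $4
        msg  = ""
        for (i = 5; i <= NF; i++) msg = msg (i > 5 ? " " : "") $i
        printf "%s\t%s\t%s\n", host, ts, msg
    }' "$MERGED" > "$EXTRACT"

    # Assumed table:  CREATE TABLE log_lines
    #                 (host VARCHAR(64), stamp VARCHAR(32), message TEXT);
    mysql logdb <<EOF
    LOAD DATA LOCAL INFILE '$EXTRACT'
    INTO TABLE log_lines
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n';
    EOF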
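Finally, on the client side, the crontab entry below illustrates the 4-hour
'archive_logs' schedule and the requirement to stagger network access to
avoid congestion.  Only the 4-hour interval comes from the requirements;
the installed path of archive_logs and the minute offset are assumptions,
and each client would choose a different offset.

    # Client crontab sketch: run archive_logs every 4 hours, offset to minute
    # 17 so that clients do not all reach the logging server at the same time.
    17 */4 * * *    /usr/local/sbin/archive_logs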