
List:       php-db
Subject:    [PHP-DB] Archiving A Database
From:       "Josh Johnson" <mailing_lists () allusiveimage ! com>
Date:       2002-10-30 2:19:38

Does anyone know of a speedy method for periodically archiving and
retrieving a database table? I've got a table I'm using to log
statistical data, and it's getting 10k+ entries per day. Entries are
only logged if the user is authenticated for access, and only once per
access session: the user is authenticated, a cookie is set, and the
entry is logged. From that point on, the user is authenticated by
verifying the cookie, which expires when the browser closes.

The data is critical (it's used for billing and such), but the
log-viewing script I've written runs slower and slower with each
passing day, and at one point the entire table was corrupted due to
neglect on our system admin's part (I had warned them that the default
MySQL max_connections setting was too low for a site that relied on the
DB so heavily and had so much traffic). The script is a simple front
end to several MySQL queries that retrieve totals for the past hour,
day, week, and month, plus grand totals, for all clients or a given
client. The rest of the site has been unaffected by the log issues.
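In essence, each of those queries is a date-bounded GROUP BY rollup
over the log table. Here's a minimal sketch of the pattern, with SQLite
standing in for MySQL so it's self-contained; the table and column
names are simplified stand-ins, not my real schema:

```python
# Sketch of the kind of date-bounded rollup the viewer script runs.
# SQLite stands in for MySQL; table/column names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE access_log (
    client TEXT, ip TEXT, referrer TEXT, logged_at TEXT)""")
conn.executemany(
    "INSERT INTO access_log VALUES (?, ?, ?, ?)",
    [("acme", "10.0.0.1", "google", "2002-10-29 12:00:00"),
     ("acme", "10.0.0.2", "yahoo",  "2002-10-29 13:00:00"),
     ("beta", "10.0.0.3", "google", "2002-09-01 09:00:00")])

# Totals per client within one reporting period (here, October 2002).
rows = conn.execute("""
    SELECT client, COUNT(*) AS hits
      FROM access_log
     WHERE logged_at >= '2002-10-01' AND logged_at < '2002-11-01'
     GROUP BY client
     ORDER BY client""").fetchall()
print(rows)  # [('acme', 2)]
```

The slowdown comes from these full-range scans growing with the table;
that's why I want old rows out of the live table.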

My current approach is to log the data as XML, one record per file, and
create a log file that is an index of references to each record. I then
empty the previous data in the database. This works, but retrieving
specific groups of information is very hard and slow, and dumping the
database takes a long time as well, which adds the difficulty of
ensuring no records are lost. I also haven't yet tackled the issue of
combining data from the XML seamlessly with live data from the
database.
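To make the scheme concrete, here's a sketch of the one-record-per-file
approach with an index file (Python used for brevity; paths and field
names are made up for illustration):

```python
# Sketch of the current archive scheme: one XML file per record, plus
# an index file listing every archived record. Names are illustrative.
import os
import tempfile
import xml.etree.ElementTree as ET

records = [
    {"id": "1", "client": "acme", "logged_at": "2002-10-29 12:00:00"},
    {"id": "2", "client": "beta", "logged_at": "2002-10-29 13:00:00"},
]

archive_dir = tempfile.mkdtemp()
index_path = os.path.join(archive_dir, "index.log")

with open(index_path, "w") as index:
    for rec in records:
        root = ET.Element("record")
        for key, value in rec.items():
            ET.SubElement(root, key).text = value
        filename = "record_%s.xml" % rec["id"]
        ET.ElementTree(root).write(os.path.join(archive_dir, filename))
        index.write(filename + "\n")  # one index entry per record

# Reading back means one open()/parse per record, which is why
# retrieving a specific group of records is so slow.
first = ET.parse(os.path.join(archive_dir, "record_1.xml")).getroot()
print(first.find("client").text)  # acme
```

The per-file open/parse cost is the bottleneck when pulling back any
sizable group of records.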

I'm opening the floor to other potential solutions, or war stories from
anyone else who's been in this situation before. If there's another
approach besides writing individual flat files, please let me know. I
haven't tested it yet, but I suspect logging all entries for a given
log period into one file would speed things up, though I worry about
the 40-megabyte files that could potentially result. I'm also open to
complete alternatives, since the main use for this data is to generate
total accesses per a given factor (client, IP, referrer, etc.).
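One alternative I've been mulling: since reports only ever need totals
per factor, old detail rows could be rolled up into a compact summary
table before being purged, so reporting never touches the archived
detail at all. A sketch of that roll-up-then-purge idea (SQLite again
standing in for MySQL; all names hypothetical):

```python
# Sketch of a roll-up-then-purge alternative: aggregate old detail rows
# into per-client, per-day totals, then delete the detail. SQLite
# stands in for MySQL; table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE access_log (client TEXT, logged_at TEXT);
    CREATE TABLE access_summary (client TEXT, day TEXT, hits INTEGER);
    INSERT INTO access_log VALUES
        ('acme', '2002-09-15 10:00:00'),
        ('acme', '2002-09-15 11:00:00'),
        ('beta', '2002-09-16 09:00:00'),
        ('acme', '2002-10-29 12:00:00');
""")

cutoff = "2002-10-01"
# Fold everything older than the cutoff into per-client daily totals...
conn.execute("""
    INSERT INTO access_summary (client, day, hits)
    SELECT client, date(logged_at), COUNT(*)
      FROM access_log
     WHERE logged_at < ?
     GROUP BY client, date(logged_at)""", (cutoff,))
# ...then purge the detail rows that were just summarized.
conn.execute("DELETE FROM access_log WHERE logged_at < ?", (cutoff,))

remaining = conn.execute(
    "SELECT COUNT(*) FROM access_log").fetchone()[0]
summary = conn.execute(
    "SELECT client, day, hits FROM access_summary ORDER BY client, day"
).fetchall()
print(remaining)  # 1
print(summary)    # [('acme', '2002-09-15', 2), ('beta', '2002-09-16', 1)]
```

The live table stays small, and period totals come from the summary
table instead of scanning months of raw rows. The obvious trade-off is
that the raw detail is gone unless it's also dumped somewhere first.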

Any help is greatly appreciated. I can give any information needed, just
ask!

Thanks, 
-- Josh

P.S. I did suggest that this issue be tackled when I first installed the
system that creates the log, but management didn't think it was worth
bothering with at the time. 

P.P.S. I did look through the archives before I posted, and didn't find
anything. Please flame in private if I missed something!



-- 
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
