Weekly Mass Data Relocation

I've got a table that lives both on my website's MySQL server and on my
localhost.  The table stays at about 200MB and holds a running 4 weeks of
data (about 50MB/week).  I have a winapp that updates my localhost db each
week, and currently I run a backup on the table using MySQL
Administrator, then run a restore on my website's MySQL server.
This deletes the whole table and reinserts every row, which isn't
necessary and takes a long time.
Is there a way to select a range of rows from the localhost table and
"migrate" just those rows to my website's MySQL server without following
the process I just described?  Thanks in advance!


Re: Weekly Mass Data Relocation

Bac2Day1 wrote:

mysqldump has an option "--where" which allows you to specify a
condition for selecting a subset of rows.  I assume it is meant to be
used when you back up a single table.

See http://dev.mysql.com/doc/refman/5.0/en/mysqldump.html

Unfortunately, there does not seem to be a GUI for specifying this
option in MySQL Administrator's Backup screen.  I'd recommend writing a
BAT script to run the backup.  For example, if your records have a
column called `data_entry_date` which records the date the row was created:

mysqldump --no-create-db --no-create-info --extended-insert --complete-insert --where="data_entry_date > CURDATE() - INTERVAL 4 WEEK" database_name table_name > four_week_dump.sql

(Use double quotes around the --where condition in a BAT file;
cmd.exe does not treat single quotes as quoting characters.)

Note the use of the --no-create-* options, which omit the CREATE
DATABASE and CREATE TABLE statements from the dump output.
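
To finish the transfer, you can feed the dump into the web server's
MySQL with the stock mysql client.  A minimal sketch; the host, user,
and database names here are placeholders:

mysql --host=www.example.com --user=webuser --password database_name < four_week_dump.sql

Since the dump contains only INSERT statements, existing rows are left
alone.  If successive dumps can overlap in date range and the table has
a unique key, consider adding --insert-ignore or --replace to the
mysqldump command so overlapping rows don't abort the load.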

Bill K.

Re: Weekly Mass Data Relocation

Would an incremental solution be better?

Keep 5 weeks of data locally (4 previous weeks + the current week)

Each day:

mysqldump --no-create-db --no-create-info --extended-insert --complete-insert --where="data_entry_date BETWEEN CURDATE() - INTERVAL 1 DAY AND CURDATE()" database_name table_name > 1_dump.sql
## gets yesterday's data

ftp the dump to the web server and apply it, then purge the old rows:

delete from table_name where data_entry_date < CURDATE() - INTERVAL 4 WEEK

You avoid the downtime of truncating the table and reinserting all
200MB, and the processing will be much more efficient.

Depending on your MySQL version, you may have to use INTERVAL 28 DAY
instead of INTERVAL 4 WEEK.
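
Pulling the pieces together, a minimal sketch of the daily job as a BAT
script.  The host name, credentials, and database/table names are
placeholders, and it assumes the web server's MySQL accepts remote
connections (if not, upload 1_dump.sql by ftp and run the same
statements on the far side):

@echo off
REM Dump yesterday's rows from the localhost copy
mysqldump --no-create-db --no-create-info --extended-insert --complete-insert --where="data_entry_date BETWEEN CURDATE() - INTERVAL 1 DAY AND CURDATE()" database_name table_name > 1_dump.sql

REM Apply the dump to the web server's copy
mysql --host=www.example.com --user=webuser --password=secret database_name < 1_dump.sql

REM Purge rows that have fallen out of the 4-week window
mysql --host=www.example.com --user=webuser --password=secret -e "DELETE FROM table_name WHERE data_entry_date < CURDATE() - INTERVAL 4 WEEK" database_name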
