Batch uploading of files


I could do with a pointer in the right direction here... I have two web
sites, hosted on different servers, which share a database. This is an
article database: I make all additions and changes on server A, and the
database is copied each night to server B. It also gets copied to my local
(development) machine (running Linux), which runs the cron jobs, so I always
have up-to-date data to work with. That all works
fine. But I also want to do something similar with the image files to
accompany the articles.

I'm half-way there: each night, a cron job on my local machine downloads the
images from server A. It uses wget with the -N switch, which means that
only new or updated files are actually copied. At that point I have a full
set of the files on my local machine. I now want to upload them to server B.

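For reference, the download side is a crontab entry along these lines (the
schedule, hostname, and paths here are made up for illustration, not the
real ones):

```
# Nightly at 02:00: fetch new/updated images from server A.
# -N (timestamping) skips files whose remote copy is no newer
# than the local one; -r -np keeps the crawl inside /images/.
0 2 * * * wget -N -r -np -nH -P /home/me/images http://serverA.example.com/images/
```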
I use ncftpput for uploading batches of files where I know they are all new.
But I can't see a switch for ncftpput that says "don't bother uploading
if the file is already there and unchanged", and I don't want to upload all
the files every time.
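The upload step currently looks something like this (username, hostname,
and paths are invented placeholders), which pushes everything it is given
rather than just the changed files:

```
# Upload the whole local image set to server B's /images directory.
# ncftpput takes: remote-host remote-dir local-files...
ncftpput -u myuser -p mypass serverB.example.com /images /home/me/images/*
```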

Any thoughts?
