- ton de w
September 19, 2006, 7:50 pm
I am having some problems with an application that uses mysql 4.1.3 to
stash away lots of data. About 100k records I think. It seems as though
after the upload to an empty database reaches a certain point (or maybe
it's the throughput), the records get truncated and misinterpreted.
I had thought it was the application that did not have all its
turnips on the trolley.
But now I wonder if the database cannot take the punishment.
The application and the database are on a 2 processor Sun box that
isn't even breathing hard.
Any guidance on how to look into this?
Re: How big is big? Too big?
100 000 rows? Some people wouldn't even call that a database. The first
limit you should encounter is a 2 GB file size, if you are using a
filesystem that doesn't allow files bigger than that. If you don't have
such a limit, the next should be 4 GB.
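One quick way to rule those limits out is to compare the size of the table's data file against the 2 GB and 4 GB boundaries mentioned above. A minimal sketch (the path is hypothetical; substitute your own datadir and table, where the .MYD file holds MyISAM row data):

```python
import os

# Limits from the discussion above: 2 GB (32-bit filesystem) and
# 4 GB (default MyISAM data pointer size in older versions).
LIMIT_2GB = 2 ** 31
LIMIT_4GB = 2 ** 32

def check_size(path, limits=(LIMIT_2GB, LIMIT_4GB)):
    """Return the limits that the file at `path` is near (>= 90%) or over."""
    size = os.path.getsize(path)
    return [limit for limit in limits if size >= limit * 0.9]

if __name__ == "__main__":
    # Hypothetical location; adjust for your server's datadir and table name.
    hits = check_size("/var/lib/mysql/mydb/mytable.MYD")
    print("Near/over limits:", hits or "none")
```

If the file is nowhere near either boundary, the truncation is unlikely to be a size-limit problem and the row-level bisection below is the better lead.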
Try to find out which row is getting misinterpreted and try inserting
that row alone (or that row and a few before and after it) to see if it
still gets misinterpreted. If that doesn't help, try removing half of
the data to see if the rest is inserted okay. The main goal should be to
create as small a test case as possible that still causes the
problem. The smaller the test case is, the easier it will be to find the
cause.
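The halving strategy above is just a bisection: split the batch, keep whichever half still reproduces the failure, and repeat. A generic sketch, where `fails` is a stand-in for "load this batch into the table and check for truncation":

```python
def shrink(rows, fails):
    """Narrow a failing batch of rows to a smaller batch that still fails.

    `fails(batch)` must return True when loading `batch` reproduces the
    problem. Returns a (possibly much smaller) batch that still fails.
    """
    while len(rows) > 1:
        mid = len(rows) // 2
        first, second = rows[:mid], rows[mid:]
        if fails(first):
            rows = first
        elif fails(second):
            rows = second
        else:
            # Failure needs rows from both halves (e.g. a volume problem,
            # not a single bad row); stop shrinking.
            break
    return rows

if __name__ == "__main__":
    # Fake failure condition for illustration: any batch containing
    # row 73 "fails". In practice, fails() would run the real import.
    culprit = shrink(list(range(100)), lambda batch: 73 in batch)
    print(culprit)  # -> [73]
```

If `shrink` stops with a still-large batch, that itself is informative: the problem depends on total volume or throughput rather than on one bad row.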