- Posted on
June 29, 2006, 7:21 am
I created an app a few years back to store some records in a DB.
According to our calculations, we were never supposed to exceed 500,000
records. It seems we were off by a decimal point or so.
I set up a FreeBSD box with MySQL three years back and it has been
filling up. One table has over 4,000,000 records. Yes, four million.
As a web developer and not a DBA, I have struggled to maintain the
server as best I can. Since the last million records were added, the
server has been struggling to keep up with multiple requests, and as
you can imagine, the user base is growing too.
Dual Xeon 3.06 Ghz
4 GB ECC RAM
800GB RAID5 SATA array
Basically I have to perform a search on one of two columns in this huge
table (10 columns, 4 million rows).
The table is MyISAM with a single primary key that is used largely for
updating row data.
Most records are ten-digit numbers in one column and a ten-digit
varchar in the other, but sometimes either column can be a series of
characters up to 100 chars long, so each column is set for VARCHAR(100).
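A sketch of the kind of schema being described (table and column names beyond tableName are hypothetical, since the post doesn't give them):

```sql
-- Hypothetical schema matching the description above; the actual
-- column names and the other eight columns are not given in the post.
CREATE TABLE tableName (
    id   INT UNSIGNED NOT NULL AUTO_INCREMENT, -- primary key, used mostly for updating rows
    col  VARCHAR(100) NOT NULL,                -- usually a ten-digit number, sometimes up to 100 chars
    col2 VARCHAR(100) NOT NULL,                -- usually a ten-digit string, sometimes up to 100 chars
    -- ... remaining columns ...
    PRIMARY KEY (id)
) ENGINE=MyISAM;
```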
The queries performed are
SELECT count(*) FROM tableName WHERE col LIKE "%123%"
SELECT * FROM tableName WHERE col LIKE "%123%" LIMIT 0,25
with "%123%" being any random string typed into a search window.
There has to be a better solution! All you geniuses out there, UNITE!
Thanks in advance, Alex.
Re: Slow Queries, 4 million records, need educated advice!
A pattern like that cannot use an index for the search, so MySQL
has to physically scan all 4 million rows to find those that match the
pattern.
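You can see this for yourself with EXPLAIN, assuming you have an index on the column. A leading wildcard defeats a B-tree index; a prefix pattern does not:

```sql
-- Leading wildcard: EXPLAIN reports a full table scan
-- (type: ALL, key: NULL) because no index prefix can be used.
EXPLAIN SELECT * FROM tableName WHERE col LIKE '%123%';

-- Prefix pattern: with an index on col, EXPLAIN reports a
-- range scan instead, touching only matching index entries.
EXPLAIN SELECT * FROM tableName WHERE col LIKE '123%';
```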
It's hard to give specific advice without seeing your schema or
understanding the nature of the searches you need to do. Have you
considered using FULLTEXT indexes?
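A sketch of what that would look like (index name is made up; note that FULLTEXT matches whole words, not arbitrary substrings, so it is not a drop-in replacement for %123%):

```sql
-- FULLTEXT indexes are supported on MyISAM tables, which fits here.
-- Caveats: matching is word-based, and by default words shorter than
-- four characters are ignored (ft_min_word_len), so '123' would need
-- that setting lowered to match.
ALTER TABLE tableName ADD FULLTEXT INDEX ft_col (col);

-- Word-based searches that can use the index:
SELECT COUNT(*) FROM tableName WHERE MATCH(col) AGAINST('123');
SELECT * FROM tableName WHERE MATCH(col) AGAINST('123') LIMIT 0,25;
```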