Posted on July 11, 2006, 5:10 pm
I have a table of a million records and wrote a CGI Perl script to
display results based on user input. The results might be anywhere
from 100 to 1000 per query, and presently I am displaying them as 25
results per page.
Problem: Each query is taking about 20-30 seconds.
My solution: I have tried to optimize the table and also index it. I
actually converted an MS Access database to a MySQL database, so it
wasn't previously indexed. Neither optimization nor indexing gives any
improvement. I always get a timeout, i.e. it takes longer after
indexing and optimizing.
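If indexing makes no difference, the usual culprit is an index that doesn't match the columns in the WHERE clause; checking the query plan (EXPLAIN in MySQL) shows whether the index is actually used. A minimal sketch of that check, using Python's sqlite3 as a stand-in for MySQL, with an invented table and column names (records, city) purely for illustration:

```python
# A sketch, not the poster's code: sqlite3 stands in for MySQL, and the
# schema (records, city, name) is invented. The point: an index only helps
# if it covers the column(s) the WHERE clause filters on.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, city TEXT, name TEXT)")
con.executemany(
    "INSERT INTO records (city, name) VALUES (?, ?)",
    [("Boston" if i % 4 == 0 else "Denver", "user%d" % i) for i in range(1000)],
)

# Index the column actually used in the WHERE clause.
con.execute("CREATE INDEX idx_records_city ON records (city)")

# The plan should now show an index search instead of a full table scan.
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM records WHERE city = ?", ("Boston",)
).fetchall()
print(plan)
```

In MySQL the equivalent check is `EXPLAIN SELECT ...`; if the `key` column comes back empty, the query is scanning all million rows regardless of what indexes exist.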
1. I was wondering if someone has a creative solution for this, i.e.
reduce the time from 20-30 seconds to at least 10 seconds.
2. I have links to the pages of results beneath the first page. When
each of these links is clicked it takes 20-30 seconds again. Is there
a way the time taken for the subsequent pages can be reduced?
I cannot use the LIMIT option in MySQL, since I have a WHERE clause
that has to search through the whole table. I tried using views with
limits, but it takes just as much time.
Please let me know.
Re: Results in multiple pages. Takes too much time
Post some code; it seems to me that you may be doing something wrong,
wrong, wrong... I'm assuming you are using DBI, so just post the
execute and fetch part of the code.
> The results might be anywhere from 100 to 1000 per query

Could be, for a non-indexed table... What do the indexes look like,
and what does your SQL look like?
> I cannot use the LIMIT option in mysql, since I have a where clause

Sure you can: LIMIT takes a "start" (offset) and a "count" value.
Frankly I don't know how they do this in SQL Server, where you only
have TOP.
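To spell that out: the database applies the WHERE clause first, and LIMIT/OFFSET then picks one page out of the filtered, ordered result, so a WHERE clause is no obstacle. A minimal demonstration, again using Python's sqlite3 as a stand-in for MySQL and an invented schema (MySQL's form is `LIMIT offset, count`, or equivalently `LIMIT count OFFSET offset`):

```python
# A sketch with an invented schema, using sqlite3 as a stand-in for MySQL:
# WHERE filters the rows, ORDER BY fixes their order, and LIMIT/OFFSET
# slices out one page of the filtered result.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, city TEXT)")
con.executemany(
    "INSERT INTO records (city) VALUES (?)",
    [("Boston",) if i % 2 == 0 else ("Denver",) for i in range(100)],
)

page, per_page = 2, 25  # the second page of 25 results
rows = con.execute(
    "SELECT id FROM records WHERE city = ? ORDER BY id LIMIT ? OFFSET ?",
    ("Boston", per_page, (page - 1) * per_page),
).fetchall()
print(len(rows))  # prints 25
```

The per-page query still has to evaluate the WHERE clause, which is why the matching index matters; with the index in place, each page fetch touches only the rows it returns rather than the whole table.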