Static pages vs CMS and the effect on SEs

Hi everyone,

My site is a little over 400 static pages, and keeping the layout and
menus up to date is becoming a major pain in the bottom.

I've been thinking about using a CMS to ease the burden somewhat.  The
question is: should I keep my static links, or is it OK to switch and
pull the content from a database?

I would like to retain my current positions in the search engines if
possible.  The addresses created by the CMS have an id-tag in them... I
read somewhere that this is not something you want.  Also, to my
knowledge, using generated links does not allow me to use keywords in
the URL.

On the one hand I need to ease the workload; on the other, I wish to
please the search engines.

Pros and cons are welcome.


Re: Static pages vs CMS and the effect on SEs

> The question is: should I keep my static links, or is it OK to switch
> and pull the content from a database?

If you do it right, Google can't tell whether a page is dynamic or static.

The most important question, though: how often are you going to update
the DB? If not that often, a much better solution is to *generate* your
site locally and upload all the pages.

It's what I do (with more than twice that number of pages).

> The addresses created by the CMS have an id-tag in them... I read
> somewhere that this is not something you want.

Can't think why. Check whether your CMS offers SEO-friendly URLs; some
have mods/hacks to add them.
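
If yours doesn't, the web server can usually fake it. For example with
Apache's mod_rewrite (the script name and parameter below are made up,
not any particular CMS's):

    # .htaccess sketch: visitors and spiders see /articles/keyword-slug,
    # while the CMS still receives its usual query-string parameter.
    RewriteEngine On
    RewriteRule ^articles/([a-z0-9-]+)/?$ index.php?page=$1 [L,QSA]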

> Also, to my knowledge, using generated links does not allow me to use
> keywords in the URL.

Not true. I generate, and I do have keywords in the URL.

> Pros and cons are welcome.

The major con of a database, if you really don't need it, is that your
site depends on two servers: the database server and the web server.
That increases the risk of downtime; if your database isn't there, your
site is gone.

In fact you have 3 options:

- generate each page on the fly from the database ("dynamic")

- check whether the data in the database is more recent than the current
  file. If you can't connect to the database, serve the current file.
  If you can connect and the data is more recent, serve that data and
  update the file.

- generate your entire site locally using scripts, and upload the
  resulting pages

If you go for the database approach, the second option is the best [1],
*but* it's quite complicated.
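
In outline it looks something like this (a Perl sketch, not production
code; MySQL via DBI and the pages/slug/body/updated_at names are
invented for the example):

    #!/usr/bin/perl
    # Option 2 in miniature: serve a cached file, refreshing it from the
    # database when the database is reachable and has newer data.
    use strict;
    use warnings;
    use DBI;

    my $slug  = shift // 'index';
    my $cache = "cache/$slug.html";

    # If the database is down, $dbh is undef and we fall back to the file.
    my $dbh = eval {
        DBI->connect('dbi:mysql:mysite', 'user', 'pass',
                     { RaiseError => 1, PrintError => 0 });
    };

    if ($dbh) {
        my ($body, $updated) = $dbh->selectrow_array(
            'SELECT body, UNIX_TIMESTAMP(updated_at)
               FROM pages WHERE slug = ?', undef, $slug);
        my $mtime = (stat $cache)[9] // 0;
        if (defined $updated && $updated > $mtime) {
            open my $fh, '>', $cache or die "can't write $cache: $!";
            print {$fh} $body;              # refresh the cached copy
            close $fh;
        }
    }

    # Serve the cached file whether or not the database was reachable.
    open my $fh, '<', $cache or die "no cached copy of $slug: $!";
    print while <$fh>;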

I use the third option: I have defined my own markup language using XML.
My site consists of several XML files (one file can include zero or more
other XML files). A Perl script parses these files twice: once to collect
all the ids, and a second time to generate the HTML pages.
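
Much simplified, the two-pass idea looks like this (I'm using XML::Twig
for brevity; the element names and one-file layout are invented, my real
markup differs):

    #!/usr/bin/perl
    # Pass 1 collects every id; pass 2 can then resolve forward references.
    use strict;
    use warnings;
    use XML::Twig;

    my $src = 'site.xml';
    my %seen;    # id => 1 for every id defined anywhere in the source

    # Pass 1: just harvest the ids.
    XML::Twig->new(
        twig_handlers => {
            section => sub {
                my ($t, $elt) = @_;
                $seen{ $elt->att('id') } = 1 if defined $elt->att('id');
            },
        },
    )->parsefile($src);

    # Pass 2: rewrite <ref id="x"/> into a link, now that all ids are known.
    my $twig = XML::Twig->new(
        twig_handlers => {
            ref => sub {
                my ($t, $elt) = @_;
                my $id = $elt->att('id') // '';
                warn "dangling reference: $id\n" unless $seen{$id};
                $elt->set_tag('a');
                $elt->set_att(href => "#$id");
                $elt->del_att('id');
                $elt->set_text($id) unless $elt->text;
            },
        },
    );
    $twig->parsefile($src);

    open my $out, '>', 'output.html' or die $!;
    print {$out} $twig->sprint;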

A second program compares each generated file with a local copy of my
site. If the generated file is newer than the copy, the file is uploaded
and the copy is updated (so I don't upload a thousand-plus files each
time).
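
Stripped to its essentials (flat directory, placeholder host and login,
Net::FTP for the transfer):

    #!/usr/bin/perl
    # Upload only the generated pages that are newer than the local mirror.
    use strict;
    use warnings;
    use File::Copy qw(copy);
    use Net::FTP;

    my ($gen, $mirror) = ('generated', 'mirror');
    mkdir $mirror unless -d $mirror;

    my $ftp = Net::FTP->new('ftp.example.com') or die "connect failed: $@";
    $ftp->login('user', 'password') or die 'login failed: ' . $ftp->message;
    $ftp->binary;

    for my $file (glob "$gen/*.html") {
        (my $name = $file) =~ s{^.*/}{};
        my $local = "$mirror/$name";

        # Skip files whose mirrored copy is at least as new: no change.
        next if -e $local && (stat $local)[9] >= (stat $file)[9];

        $ftp->put($file, $name) or die "put $name: " . $ftp->message;
        copy($file, $local) or die "copy $name: $!";   # update the mirror
    }

    $ftp->quit;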

The disadvantage is that it limits me somewhat. For example, it's not
easy to integrate something like instantly publishing comments posted by
people who have registered with my site and have trusted status (a
future plan).

[1] Unless your database changes extremely fast, most requests require a
page update, and reporting the database as down is preferable to showing
stale data.

John
