You're welcome :-)
Yup. I generate my feed with a Perl script. I put the URLs in a text file, and
for each one the script extracts the title and the h1 from the locally stored
page, which it uses as the title and description of the feed item. It seems to work :-)
So when I update my site, I just copy/paste the URLs, run the script, and a
fresh feed is ready to be uploaded :-)
There is also an RSS module, but I haven't had a good look at it. Anyway, the
bare-bones version I use was not that hard to write.
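The workflow described above can be sketched roughly as follows. This is a minimal Python illustration (the original is a Perl script the poster didn't share), and the file names and feed title are assumptions, not his actual setup:

```python
# Sketch of the described workflow: read URLs from a text file, pull the
# <title> and first <h1> out of the locally stored copy of each page, and
# emit a bare-bones RSS 2.0 feed. Standard library only.
from html.parser import HTMLParser
from xml.sax.saxutils import escape

class TitleH1Parser(HTMLParser):
    """Collect the text of the first <title> and first <h1> in a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self._in = None  # which element we are currently inside

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self._in = "title"
        elif tag == "h1" and not self.h1:
            self._in = "h1"

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "title":
            self.title += data
        elif self._in == "h1":
            self.h1 += data

def rss_item(url, html_text):
    """One <item>: page title as item title, h1 text as description."""
    p = TitleH1Parser()
    p.feed(html_text)
    return ("  <item>\n"
            f"    <title>{escape(p.title.strip())}</title>\n"
            f"    <link>{escape(url)}</link>\n"
            f"    <description>{escape(p.h1.strip())}</description>\n"
            "  </item>")

def build_feed(items):
    """Wrap the items in a minimal RSS 2.0 channel (title is a placeholder)."""
    return ('<?xml version="1.0"?>\n'
            '<rss version="2.0"><channel>\n'
            "  <title>Site updates</title>\n"
            + "\n".join(items) +
            "\n</channel></rss>")
```

A real version would also read the URL list from the text file and map each URL to its local path, but that part is just bookkeeping.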
John
Perl SEO tools: http://johnbokma.com/perl/ or have them custom made
Experienced (web) developer: http://castleamber.com/
October 10, 2005, 4:09 pm
Charles Sweeney wrote:
Wise words indeed. If you subscribe to a lot of blogs, you'll notice
people blog about the exact same website they've all found on each
other's blogs, so you get quite a lot of duplication of data.
It seems nobody can write a proper RSS client; without exception they
all seem to hammer your webserver for the RSS feed, so your hit
reports are going to get quite distorted.
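For what it's worth, the hammering is avoidable: a well-behaved feed reader sends a conditional GET, so an unchanged feed costs the server a tiny 304 response instead of a full download. A minimal sketch of the client side, using only the standard library (the feed URL would be whatever the reader is polling):

```python
# Polite feed polling via HTTP conditional requests: remember the ETag and
# Last-Modified from the previous fetch and send them back, so the server
# can answer "304 Not Modified" when nothing changed.
import urllib.error
import urllib.request

def conditional_headers(etag=None, last_modified=None):
    """Headers that let the server skip resending an unchanged feed."""
    headers = {}
    if etag:
        headers["If-None-Match"] = etag
    if last_modified:
        headers["If-Modified-Since"] = last_modified
    return headers

def fetch_feed(url, etag=None, last_modified=None):
    """Return (body, etag, last_modified); body is None on 304 Not Modified."""
    req = urllib.request.Request(url, headers=conditional_headers(etag, last_modified))
    try:
        with urllib.request.urlopen(req) as resp:
            return (resp.read(),
                    resp.headers.get("ETag"),
                    resp.headers.get("Last-Modified"))
    except urllib.error.HTTPError as e:
        if e.code == 304:  # cached copy is still current; nothing to download
            return None, etag, last_modified
        raise
```

A client that does this (and polls at a sane interval) barely registers in the server logs, which also keeps the hit reports honest.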
Not really. From a traditional-SEO point of view, I've noticed all my blog
content appears in http://blogsearch.google.com/ and not in the main index,
but I don't know whether that's because of the RSS, or the format, or whether
Google knows the blog tool I'm using (although I've hacked its code to
shreds). You might want to watch out for that if you're doing
commercially sensitive stuff with your feed.