Google Loses


,5936,9800237%255E15306,00.html


Re: Google Loses


That's what happens when a big company starts to think it is so powerful
it doesn't have to listen anymore. All they seem to be fixated on is
backlinks. Backlinks alone do not make a good site, and they are too easy
for sites to abuse. Many of the sites ranked ahead of mine just have
sister sites feeding links back and forth. Their basic site offers less
than mine, but they get better SERPs and PR.

Google is trying to fix something that wasn't really broken, and in the
process it is causing a lot of harm both to itself and to the small
companies that relied on it for a fair showing.

Re: Google Loses

On 11 Jun 2004 09:25:06 -0700, (Tim Arnold) wrote:


<devil's advocate comments>
Did anyone else notice that the person from NI, Mr. Jones, who helped
decide that they would go with Overture, is later said [in the article]
to have formerly been employed by Yahoo [which now owns Overture]?
Although, then again, I don't use Overture as a search engine. I tried
it a few times in the past, but the results weren't always "up to par,"
in my opinion. It may be nice for advertisers, but as someone seeking
information or certain content, I didn't feel the results were all
that great.

If the article had said they went with Yahoo or MSN over Google, I might
have given it more thought ... but not an offshoot of Yahoo [Overture],
especially after seeing how AllTheWeb has fared since Yahoo acquired it.
</devil's advocate comments>


Re: Google Loses

C.W. wrote:


And NI is hardly the most Internet-savvy company in the world.

Re: Google Loses

The only really good way to list sites would be by their content, but how
can a computer-based search engine review the content of a site? It isn't
possible, so they base rankings on other factors which, unfortunately, can
be manipulated. Human-edited directories like dmoz can review content; the
whole purpose of dmoz was to be able to list by content. But where humans
are concerned, greed gets in the way, and the upshot is that the good
content is on the editors' sites while everyone else's content stinks. So
we're back to the same scenario of the SERPs being manipulated, only in a
more direct way. Personally, I'd rather deal with lots of other SEOs
manipulating the SERPs of computer-based search engines than have to deal
with one human editor in charge of the on/off switch. We still have a 100%
better chance dealing with Google's flaws than with the human flaws at
dmoz, where we have zero opportunity.

I can only think of one way to make things work better, and that would be
to list sites by computer, not based on content or popularity, but rather
alphabetized and rotated over a 24-hour cycle, so that every site in a
SERP moves from page one to page 1000 within a 24-hour period. That would
be the fairest way for SEOs and webmasters, though not always the best way
for the viewers, as some really bad sites that should be in oblivion would
find their way to page one, and some great sites would spend time in
oblivion.

Probably the best approach is what Google is presently doing with its 57
data centers: rotating the results from all 57 of them, each with a
slightly different algo. At least that way you get to see some different
but good sites high up and on page one of their respective SERPs. If
anybody can think of a better way, I'd like to hear about it, and I'm sure
Google would too. Until then, in my opinion, Google doesn't lose, it WINS
big time, as there is no other search engine that comes even close when
searching for good and relevant results.
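The alphabetized-rotation idea described above can be sketched in code. This is a hypothetical toy, not anything a real engine does: it assumes a simple hour-based offset into an alphabetically sorted list of site names, and all names and the page size here are invented for illustration.

```python
# Toy sketch of the "alphabetized rotation" proposal: sites are sorted
# alphabetically, then the page-one window rotates through the full list
# over a 24-hour cycle, so every site eventually reaches page one.

def rotated_serp(sites, hour, page_size=10):
    """Return the page-one listing for a given hour (0-23)."""
    ordered = sorted(sites)
    # Shift the starting offset proportionally to the hour of day.
    offset = (hour * len(ordered) // 24) % len(ordered)
    rotated = ordered[offset:] + ordered[:offset]
    return rotated[:page_size]

# Invented example sites: sitea.example ... sitez.example.
sites = [f"site{chr(c)}.example" for c in range(ord("a"), ord("z") + 1)]
print(rotated_serp(sites, hour=0, page_size=5))
print(rotated_serp(sites, hour=12, page_size=5))
```

At hour 0 the page starts at the top of the alphabet; twelve hours later the window has moved roughly halfway through the list, so an entirely different set of sites occupies page one.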

Re: Google Loses


Considering the number of factors that can be taken into account when
generating a SERP, and the possible variations of the algorithm, I
wonder how Google progressively improves it.

I would guess that they have databases for each factor and can then run
different algorithms when generating the SERP listings.   In that case it
would make sense to run experimental algorithms at some data centres and
compare the results from the point of view of the visitor.   If you apply
some particular test algorithm and the visitors are less satisfied with the
SERPs, then you have changed the algorithm the wrong way.  If the visitors
are better satisfied, then you have changed the algorithm for the better.
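The data-centre experiment described above amounts to a simple A/B comparison. The sketch below is a hypothetical illustration only: it uses click-through rate as a stand-in for visitor satisfaction (Google's actual metrics are not public), and all the numbers are invented.

```python
# Toy A/B comparison: run a candidate algorithm at some data centres,
# measure a visitor-satisfaction signal, and keep whichever variant
# scores higher.

def satisfaction(clicks, impressions):
    """Crude satisfaction proxy: click-through rate on served SERPs."""
    return clicks / impressions if impressions else 0.0

def pick_better(control, experiment):
    """Compare two (clicks, impressions) samples; return the winner."""
    if satisfaction(*experiment) > satisfaction(*control):
        return "experiment"
    return "control"

# Invented figures: control data centres got 6,200 clicks over 10,000
# queries; the experimental algorithm got 6,800 clicks on the same volume.
print(pick_better((6200, 10000), (6800, 10000)))  # -> experiment
```

In practice a real test would need many more safeguards (statistical significance, seasonality, per-query segmentation), but the core loop of "serve variant, measure satisfaction, keep the winner" is the one the post describes.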

Is anyone interested in helping me build a list of all the factors that
might be taken into account?  If so, please send me any ideas and I will
add them in.

Best regards, Eric.
