Ban Spyware From Your System
Posted by Jungle Jim on August 28, 2010, 6:48 am
Just wanting to make public some information I found on Google - people
are compiling lists of servers that attempt to download malware/spyware
to your system, and there is a free tool you can Google for to help edit
these lists - it's called HostsMan (short for 'Hosts Manager').
Give it a try - it won't disappoint. It also comes with a free
HostsServer program that I run - it checks every site you try to
visit against the hosts file, and if it finds an entry with 127.0.0.1
against its name, no go. One warning, though - you'll have to disable
Windows XP's DNS Client service.
Just getting the word out.
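For anyone who hasn't poked at the hosts file before, here's a rough
Python sketch of how those blocking entries behave (the hostnames below
are made-up examples, not from any real list):

```python
# Minimal sketch of how a blocking 'hosts' file works: each line maps a
# hostname to an address. 127.0.0.1 points the request back at your own
# machine, so the connection to the "bad" host never leaves it.
# These entries are illustrative only, not from any real blocklist.
HOSTS_TEXT = """
# blocked ad/tracker hosts
127.0.0.1  ads.example.com
127.0.0.1  tracker.example.net
"""

def parse_hosts(text):
    """Return a dict mapping hostname -> redirect address."""
    table = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        addr, *names = line.split()
        for name in names:
            table[name] = addr
    return table

table = parse_hosts(HOSTS_TEXT)
print(table["ads.example.com"])      # 127.0.0.1
print("good.example.org" in table)   # False: unlisted hosts resolve normally
```

Any hostname not in the table falls through to a normal DNS lookup,
which is why a listed host is "no go" but everything else loads as usual.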
Re: Ban Spyware From Your System
Jungle Jim wrote:
So you just discovered how to use a 'hosts' file compiled by someone
else (whom you don't know but choose to trust) to block "bad" sites -
where you don't get to define the blocking criteria and you never bother
to edit that huge many-thousands list of hosts? Or was it that you
discovered a means of keeping the 'hosts' file up to date by polling the
source(s) for the list(s)?
If you run your own local web server, you'll want to change all those
127.0.0.1 redirect IP addresses to something else, like 127.0.0.0
(which a web browser rejects faster - even if you don't run a local web
server - than waiting to find out whether something is listening on a
port on the local host to accept the connection). Of course, with an
auto-updater reinserting new entries into the 'hosts' file, you're back
to the localhost address (127.0.0.1), which was not the best choice
but has become the traditional value in these pre-compiled 'hosts'
file lists.
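Retargeting an existing list is simple enough to script; a rough sketch
(the 127.0.0.0 sink address follows the suggestion above, and the sample
entries are placeholders):

```python
def retarget(hosts_text, old="127.0.0.1", new="127.0.0.0"):
    """Rewrite the redirect address on blocklist entries, but leave the
    essential '127.0.0.1 localhost' mapping alone."""
    out = []
    for line in hosts_text.splitlines():
        fields = line.split()
        if fields[:1] == [old] and "localhost" not in fields:
            line = line.replace(old, new, 1)
        out.append(line)
    return "\n".join(out)

sample = "127.0.0.1 localhost\n127.0.0.1 ads.example.com"
print(retarget(sample))
# 127.0.0.1 localhost
# 127.0.0.0 ads.example.com
```

The localhost check matters: rewriting that line too would break name
resolution for your own machine.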
There is NO requirement that you disable the local DNS caching client to
use a 'hosts' file. HostsMan updates the 'hosts' file; that's its
purpose. Many who use the 'hosts' file recommend disabling the local
DNS caching client, but it is not mandatory and rarely helps anyway.
Local redirects via the 'hosts' file never involved a DNS lookup, so
they won't be in the local DNS cache.
The 'hosts' file does NOTHING to ban spyware from your system as you
claim. It blocks access to the specified *hosts* - and not to all hosts
for a domain, since you must specify a host, not a domain, and cannot
use wildcards - which is why you end up with 70+ entries for
doubleclick.com alone. There are no selections you can make for what
type of "bad" hosts to block using this ancient trick of listing them in
the 'hosts' file. Whoever compiled the list that you blindly use shoved
in hosts that have been identified (through means and criteria that are
not disclosed to you) as sources of advertising or IntelliTXT popovers,
and some are malicious or infected sites (which often disappear after
just 4 hours). Sites that are not blocked, because they are not in the
list, can still deliver malware onto your host. Do you think the hosts
in this list are the only ones infecting or deliberately proliferating
malware, rogueware, or hijackware?
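The host-vs-domain point is easy to demonstrate - 'hosts' matching is an
exact string match with no wildcards, so every subdomain needs its own
entry. A sketch with illustrative names:

```python
# 'hosts' file matching is exact: there is no wildcard syntax like
# *.doubleclick.net, so each subdomain must be listed individually.
blocked = {"ad.doubleclick.net", "static.doubleclick.net"}

def is_blocked(hostname):
    """Simulate a 'hosts'-style lookup: exact hostname match only."""
    return hostname in blocked

print(is_blocked("ad.doubleclick.net"))    # True
print(is_blocked("ad2.doubleclick.net"))   # False: not listed, slips through
```

A new subdomain that isn't yet in the list sails right past the block,
which is exactly why these lists balloon to dozens of entries per domain.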
Any site that sends you to another site by its IP address never involves
a DNS lookup, so the 'hosts' file isn't involved either, and you end up
at the malicious site anyway. In fact, I often see an IP address used to
address a malicious or rogue site (like the ones that con the visitor
with bogus results from a fake anti-virus scan). If DNS is not used to
look up a hostname's IP address, then the 'hosts' file is also not used
(to do a preliminary lookup). A host addressed by its IP address never
involves DNS. Humans like names; computers only use numerical
addressing. If the numerical address is given, the computer already has
what it needs to make a connection to the target.
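You can check for that bypass yourself: if a link already carries a
numeric address, no name lookup (and hence no 'hosts' consultation)
happens. A sketch using Python's standard library (the URLs are made-up;
203.0.113.9 is a documentation-reserved test address):

```python
import ipaddress
from urllib.parse import urlparse

def needs_dns(url):
    """True if fetching this URL requires resolving a hostname,
    i.e. if the 'hosts' file could possibly intervene."""
    host = urlparse(url).hostname
    try:
        ipaddress.ip_address(host)
        return False  # already numeric: DNS and the 'hosts' file are skipped
    except ValueError:
        return True   # a name: DNS (and thus the 'hosts' file) gets consulted

print(needs_dns("http://malware.example.com/page"))  # True
print(needs_dns("http://203.0.113.9/page"))          # False
```

Any link in the second form reaches its target no matter what the
'hosts' file says.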
The 'hosts' file will do absolutely nothing to prevent malware from
getting into your host. All it does is block you from visiting (or web
pages from accessing) "bad" hosts, like ad sources. The lists may
include infected or malicious hosts, but those are few in number in a
pre-compiled 'hosts' file compared to the entries listed because they
are an ad source. In fact, because malicious and phishing sites often
come and go in a few hours, a static list, even if updated daily (and
the pre-compiled 'hosts' files never get updated that often), is very
out of date. You should already be using a screening filter, included in
your web browser or as an add-on, that gets updated very quickly. These
anti-phishing/malware listing services are updated very often, whereas
pre-compiled 'hosts' files usually go many DAYS between updates.
Forget trying to use someone else's pre-compiled 'hosts' file to block
"bad" sites; rely on it only to reduce unwanted (advertising) content in
the web pages that you visit. However - and I see this more often now -
a web site can detect whether you download their ad content (content
that has to be delivered through their server and not linked from
somewhere else). If they detect that you don't download their ad
content, they can choose to not deliver that web page, or the rest of
it, or to present a crippled or reduced-content page. That means you
don't get to use their site unless you accept all of their content.
There aren't a lot of sites that do this, but it seems I'm hitting a
couple more every week.
Using a 'hosts' file, ad-blocker add-on, or proxy can result in not
being able to visit a site, or in the site not functioning properly.
For example, if an ad-blocker blocks a domain but the site uses scripts
from that domain, then the site might not function correctly. I've seen
this with sites that don't handle the CAPTCHA security page for login
themselves. Instead they rely on a 3rd-party service to which the login
data is sent (hopefully encrypted) and where the CAPTCHA validation is
performed. If you happen to block that service in your 'hosts' file or
with an ad-blocker, the CAPTCHA service is not reachable and the site
doesn't get confirmation to let you log into their site. Using a
'hosts' file or ad-blocker, you might see errors appear when rendering
the page; sometimes that is because the page accesses another site that
you are blocking. Try blocking all the Google domains (not just the
ones with "google" in their DNS name but also any others owned by
Google). You'll find lots of sites where you see "gat" and "jquery"
errors, which means the scripts don't run properly on the site that you
chose to visit. I believe just blocking googleapis.com will result in
scripts failing at many sites. The pre-compiled 'hosts' files that I've
seen always include the Google hosts (but some omit the API site, since
that is where many sites call their scripts from).
Unlike many ad-blockers that let you switch them off and on, you cannot
simply turn off the use of the 'hosts' file. You would have to manually
rename the file, purge your web browser's TIF cache, refresh the page,
and afterward rename the 'hosts' file back.
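That manual "switch off" can at least be scripted; a rough sketch (the
path is the Windows default hosts location - adjust for your system and
run with administrator rights; purging the browser cache and refreshing
the page are still up to you):

```python
import os

# Default Windows location of the hosts file; a placeholder here -
# pass your own path to toggle() on other systems.
HOSTS = r"C:\WINDOWS\system32\drivers\etc\hosts"

def toggle(hosts=HOSTS):
    """Rename the hosts file aside (disabling it) or back (re-enabling it).
    Returns what was done so the caller can report it."""
    disabled = hosts + ".disabled"
    if os.path.exists(hosts):
        os.rename(hosts, disabled)
        return "disabled"
    elif os.path.exists(disabled):
        os.rename(disabled, hosts)
        return "enabled"
    return "missing"
```

Calling toggle() twice puts the file back exactly where it started,
which is the whole point: there is no on/off switch, only the rename.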
Your claim is false and misleading. A 'hosts' file does not prevent
malware from getting into your host. It might prevent you from visiting
a malicious host, but that only works for the very few bad hosts
included in a pre-compiled 'hosts' file, and only for the bad hosts that
remain around for days or weeks (long enough to get identified,
reported, added to an updated 'hosts' file, and distributed). Updates
to 'hosts' files are slow. If you are relying on them to protect you
from malware, just toss all your security and run your host wide open.
You'll be just as "protected".