URL filtering

From ASRG

Revision as of 17:08, 30 May 2009

Anti-spam technique: URL filtering
Date of first use: early 2000
Effectiveness: Medium
Popularity: High
Difficulty of implementation: Medium
Where implemented: MTA
Harm: Low


Most spam messages contain URLs meant to redirect people to a web site, so a filter can extract all URLs present in the body of a message and check them against a blacklist. Primitive filters can use static flat-file blacklists, whose efficiency and drawbacks are the same as those of a static keyword list. Most modern filters use URL blacklists stored in DNS zones, as this is an easier way to distribute and update these lists.
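A minimal sketch of the DNS-zone approach described above, in Python. It extracts the host of each URL in a message body and looks it up in a DNSBL zone (SURBL's `multi.surbl.org` is used here as an illustration). The regex and function names are assumptions for this sketch; a real filter would reduce each host to its registered domain using the public suffix list and would also handle obfuscated or encoded URLs.

```python
import re
import socket

# Simplified pattern: captures the host part of plain http(s) URLs.
# Real spam often obfuscates URLs, so production filters parse more carefully.
URL_RE = re.compile(r'https?://([A-Za-z0-9.-]+)', re.IGNORECASE)

def extract_domains(body):
    """Return the lowercased host of every http(s) URL found in the body."""
    return [host.lower().rstrip('.') for host in URL_RE.findall(body)]

def dnsbl_query_name(domain, zone="multi.surbl.org"):
    """Build the DNS name to query: the domain prepended to the blacklist zone."""
    return f"{domain}.{zone}"

def is_blacklisted(domain, zone="multi.surbl.org"):
    """A listed domain resolves (typically to a 127.0.0.x address);
    NXDOMAIN (a resolution failure) means it is not listed."""
    try:
        socket.gethostbyname(dnsbl_query_name(domain, zone))
        return True
    except socket.gaierror:
        return False
```

Publishing the list as a DNS zone is what makes distribution easy: every MTA already speaks DNS, and caching resolvers absorb most of the query load.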

Because URLs found in spam change very frequently, maintaining this kind of blacklist is hard work and usually requires a large number of spam traps to collect fresh samples.

The efficiency of URL filtering is usually somewhere between 50% and 70%. The false positive rate can be as low as 0.1%, but some lists are more aggressive and can show a higher false positive rate.