Sam Gleske on 17 Jul 2013 08:07:54 -0700


Re: [PLUG] iptables: dropping bogus application-level content

If you're trying to limit the scope of traffic accessing your site, then whitelisting is the way to go.  .htaccess should only be used when you have multiple virtual hosts on a single system and must leave the port open to a wider range of traffic for the other virtual hosts (but not necessarily the one you wish to limit).  And even then you shouldn't use .htaccess; instead, modify that virtual host's conf file under /etc to account for this.
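As a rough sketch of what that vhost-level whitelisting looks like (hypothetical paths, ServerName, and networks; this uses the Apache 2.4 `Require` syntax, while 2.2 used `Order`/`Allow`/`Deny` instead):

```apache
# /etc/httpd/conf.d/example.conf -- hypothetical file and site names
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/example

    <Directory /var/www/example>
        # Whitelist: only these addresses may connect; everyone else is denied.
        Require ip 192.0.2.0/24
        Require ip 198.51.100.7
    </Directory>
</VirtualHost>
```

Other vhosts on the same machine are unaffected, which is the whole point of doing it here rather than in a global .htaccess or a blanket iptables rule.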

robots.txt isn't respected by any browser.  It is designed to define rules for the indexing robots of search engines; i.e., if you don't want your site (or portions of it) to show up in Google at all, that is when you create a robots.txt.
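For illustration, a minimal robots.txt (placed at the site root; the path here is made up) looks like this -- and note it's purely advisory, so it keeps out well-behaved crawlers only, not browsers or hostile clients:

```text
# robots.txt -- ask all crawlers to skip a hypothetical /private/ area
User-agent: *
Disallow: /private/
```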

Other than to secure browsing of your website, you shouldn't be blocking any user agents from your content.  If your application doesn't function in anything but one browser (oh god, I hope not), then you should modify your application to throw up a warning notifying users that they can't browse for a specific reason or that they need to change browsers.  If you block their traffic instead, they'll just assume your website is down and not come back.
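A minimal sketch of the warn-don't-block approach, assuming a hypothetical rule that flags old Internet Explorer versions (the function name, the regex, and the message text are all placeholders you'd adapt to whatever your app actually can't support):

```javascript
// Hypothetical sketch: inspect a user-agent string and return a warning
// message for unsupported browsers, or null for everything else.
// Nothing is blocked; the page still loads and the caller decides how
// to display the warning (banner, dialog, etc.).
function unsupportedBrowserWarning(userAgent) {
  // Example rule only: old IE identifies itself with an "MSIE n." token.
  if (/MSIE [1-8]\./.test(userAgent)) {
    return "Some features of this site may not work in your browser; " +
           "please consider upgrading or switching browsers.";
  }
  return null; // supported browser: no warning shown
}
```

In the page you'd call this with `navigator.userAgent` and, if it returns a message, render it somewhere visible instead of refusing to serve the content.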

How about you tell us what you're actually trying to accomplish?  What's your end goal?
Philadelphia Linux Users Group