
Thread: Preventing some types of User Agents from accessing my website

  1. #1

    Preventing some types of User Agents from accessing my website

    My website sometimes gets hit with recursive wgets and HTTrack downloads that can cripple the web server.

    I'd like to restrict access from these programs. Is user-agent filtering worthwhile against these kinds of scans? Here is another example I found.

    Do you know of any other ways to prevent server overload at the Apache or OS level? I don't mind limiting connections; I just don't want the server to crash.
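    For reference, user-agent filtering in Apache would look something like this sketch (assuming mod_setenvif is available; the DocumentRoot path and the agent list are only examples):

        # Tag requests from known mirroring tools, matching the User-Agent case-insensitively
        BrowserMatchNoCase "wget" bad_bot
        BrowserMatchNoCase "httrack" bad_bot

        <Directory "/var/www/html">
            Order Allow,Deny
            Allow from all
            # Refuse any request tagged above
            Deny from env=bad_bot
        </Directory>

    For the connection-limiting side, third-party modules such as mod_evasive or mod_limitipconn are the usual suggestions; Apache's own MaxClients only caps the total number of worker processes.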
    Last edited by jungleist; 01-18-2007 at 06:42 AM.

  2. #2
    I found my solution. I just added the examples in these links to my httpd.conf instead of .htaccess. I prefer httpd.conf because it's not as easy to overlook as a .htaccess file when restoring or troubleshooting a website.

    Comprehensive guide to .htaccess
    Hijacking - Some Advice for Webmasters

    Looks like it's working so far.
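    The core of the rule sets in those guides is a mod_rewrite block roughly like this (a sketch only; the agent list is just an example, and mod_rewrite has to be enabled):

        RewriteEngine On
        # Block common site-download tools by User-Agent, case-insensitively
        RewriteCond %{HTTP_USER_AGENT} (wget|httrack|webzip|webcopier) [NC]
        # Send a 403 Forbidden and stop processing further rewrite rules
        RewriteRule .* - [F,L]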

  3. #3
    Sorry, those rules need to go in .htaccess.

  4. #4
    I use ModSecurity, which is described as a web application firewall. It may be a bit much for your needs, though.
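    For what it's worth, a user-agent rule in ModSecurity might look roughly like this (ModSecurity 2.x syntax; the rule id and the agent pattern are just placeholders):

        SecRuleEngine On
        # Deny any request whose User-Agent header matches a known mirroring tool
        SecRule REQUEST_HEADERS:User-Agent "(?:wget|httrack)" \
            "id:100001,phase:1,t:lowercase,deny,status:403,log,msg:'Site mirroring tool blocked'"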
