Problem blocking bots, crawlers and others

Discussion in 'General' started by Night Fly, Dec 10, 2019.

  1. Night Fly

    Night Fly New Member

    Dear all...

My system is Ubuntu Server 14.04, Apache 2.4 with ISPConfig 3.

Over the last month I have noticed a big amount of traffic coming from bad bots like DotBot, SeznamBot, AhrefsBot and other fellows...

After searching Google I realised we can block them with the following code...

This is what I added to /etc/apache2/apache2.conf:

<code>
SetEnvIfNoCase User-Agent "^DotBot" bad_user
SetEnvIfNoCase User-Agent "^AhrefsBot" bad_user

<Directory />
    <RequireAll>
        Require all granted
        Require not env bad_user
    </RequireAll>
</Directory>
</code>
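I also thought about adding a line like this next to it, just to log which requests actually get the bad_user variable set, so I can see if the patterns match at all (not sure this is the best way to check it, and the log name is just my own choice):

<code>
# log only requests that matched one of the bad bot patterns
CustomLog ${APACHE_LOG_DIR}/badbots.log combined env=bad_user
</code>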

But I think the problem is that in ISPConfig 3 I have a rewrite from HTTP to HTTPS... and this is the main reason the above code does nothing...
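For reference, the HTTP to HTTPS redirect that ISPConfig writes into the vhost looks more or less like this on my server (I am quoting from memory, the exact conditions and flags may differ depending on the ISPConfig version and settings):

<code>
RewriteEngine on
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
</code>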

One of the experiments I made is the following...

I have, for example, 2 sites plus the default Apache page under /var/www:

    /var/www/html - default Apache page
    /var/www/site1.com - Site1
    /var/www/site2.com - Site2

I tried a small experiment to block all sites on my server; for that I placed this in the default apache2.conf in /etc/apache2/:

<code>
<Directory /var/www>
    Options Indexes FollowSymLinks
    AllowOverride All
    Require all denied
</Directory>
</code>

With this I expected to block all sites on my server, but only the default Apache page was blocked; when I tried to access it by typing the IP in the browser I got the message:

    Forbidden
    You don't have permission to access / on this server.

But the other sites (site1.com and site2.com) I can still access with no issues...

Because of this I believe the problem may be the HTTP to HTTPS rewrite set in ISPConfig...

How can I make the above code work for the sites with the HTTPS rewrite?
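One idea I had (not sure if it is the correct place) is to put the same directives into the Apache Directives field of each site in ISPConfig (under the Options tab of the website, if I remember right), so they end up inside the vhost itself for both HTTP and HTTPS, something like:

<code>
SetEnvIfNoCase User-Agent "^DotBot" bad_user
SetEnvIfNoCase User-Agent "^AhrefsBot" bad_user

<Location />
    <RequireAll>
        Require all granted
        Require not env bad_user
    </RequireAll>
</Location>
</code>

Would something like that work, or does the rewrite still get in the way?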

I appreciate any help...

    Best Regards
    João Carrolo
     
