Function IP Block

Discussion in 'Feature Requests' started by Uno, Jun 14, 2024.

  1. Uno

    Uno Member

    I think it would be nice and useful to have a form on the panel where you can enter the (incoming) IPs and IP ranges to be blocked. It can be done manually, but having it in the panel would be convenient.
    This is only a suggestion, not a demand.
     
  2. Taleman

    Taleman Well-Known Member HowtoForge Supporter

    Which IPs are to be blocked? And why?
    Are you talking about fail2ban blocking IPs? My signature has a link to a Fail2ban tutorial.
     
    ahrasis likes this.
  3. Uno

    Uno Member

    I was mainly referring to the web server (in particular I use nginx, but the same applies to Apache etc.). For example, I have a list of bot IPs that I would like to completely prevent from accessing the site. I know I can do it with the firewall alone, but it would be convenient to be able to add or remove IPs on the fly from the panel. I can also use the nginx directives/options field, but every time I make a change it restarts the web server.
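    For example, a minimal nginx snippet of the kind I mean (the addresses are just documentation examples, placed inside a server block) would look something like this:
    Code:
    # deny individual bot IPs or whole ranges, allow everyone else
    deny 203.0.113.5;
    deny 198.51.100.0/24;
    allow all;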
     
  4. Uno

    Uno Member

    For example (and it's just an example, I don't want to make disrespectful comparisons), have a look at the CloudPanel demo at https://demo.cloudpanel.io/ : if you open a site's management page, in the tab called Security you can block IPs and even bots by name.
    It looks nice and convenient to me.
     
  5. till

    till Super Moderator Staff Member ISPConfig Developer

    But be aware that CloudPanel is not open source; it's commercial software where they currently offer free copies to attract new users.
     
    ahrasis likes this.
  6. Uno

    Uno Member

    I like ISPConfig and have been using it for many years; that is a fact. It was just an example to get across what I meant about a feature that I think would be useful.
     
  7. till

    till Super Moderator Staff Member ISPConfig Developer

    I agree that such a feature would be nice to have, and it's likely that we will implement something similar in ISPConfig.
     
    tal56, MaxT, ScchutzZ and 2 others like this.
  8. Uno

    Uno Member

    For bad bots, I added an nginx-badbots.conf filter to Fail2ban; one was there for Apache but not for nginx. It's a clean solution: instead of adding a myriad of IPs, I directly list the names (user agents) of the various bad bots.
    In the last few days I had ClaudeBot keeping around 600 bots constantly on my site.
    On the surface I had no problem with it, but the idea still bothered me.
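    A rough sketch of what I mean (the bot names and log path are only examples, and the regex has to match your nginx log format):
    Code:
    # /etc/fail2ban/filter.d/nginx-badbots.conf
    [Definition]
    badbots = ClaudeBot|MJ12bot|DotBot
    failregex = ^<HOST> .* "(GET|POST|HEAD) .*" \d+ \d+ ".*" ".*(?:%(badbots)s).*"$

    # /etc/fail2ban/jail.local
    [nginx-badbots]
    enabled  = true
    port     = http,https
    filter   = nginx-badbots
    logpath  = /var/log/ispconfig/httpd/*/access.log
    maxretry = 2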
     
  9. till

    till Super Moderator Staff Member ISPConfig Developer

    ClaudeBot is really bad indeed, many sites have issues with that bot.
     
    ahrasis likes this.
  10. remkoh

    remkoh Active Member

    First I thought: Why? There are options already.
    But now that you mention ClaudeBot I can fully understand your request :D
     
  11. Uno

    Uno Member

    Before I thought of Fail2ban (thanks to Taleman) I had started blocking whole classes of IPs with UFW... and herein lies the paradox: we struggle to get a few IPs (and they cost money) because they are running out, while these outfits break our @@ and have thousands if not millions of them. I dread the day we start using IPv6 in earnest.
    I repeat that my site had no problems, it was working smoothly; still, I wondered: what are these doing on my site? Whatever it was, it bothered me.
    So I solved it by putting ClaudeBot directly on the list for nginx and Fail2ban. If there had been an option for it in the panel, it would have been convenient...
    I think even just an interaction for modifying the Fail2ban configuration on the fly and reloading it would be a good starting point.
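    In fact Fail2ban can already ban and unban addresses on the fly from the command line, so a panel form would mostly be a wrapper around something like this (jail name and IP are just examples):
    Code:
    # ban an address in an existing jail, then lift the ban again
    fail2ban-client set nginx-badbots banip 203.0.113.5
    fail2ban-client set nginx-badbots unbanip 203.0.113.5
    # reload Fail2ban after editing its configuration
    fail2ban-client reload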
     
  12. Uno

    Uno Member

    I suppose you've already had your own run-ins with Anthropic :D
     
  13. Uno

    Uno Member

    facebookexternalhit/1.1
    Do you think this bot serves any useful purpose?
     
  14. remkoh

    remkoh Active Member

    Aside from being very annoying to server admins from time to time, bots do generally serve a purpose.

    If you're experiencing server load issues then have a look at your websites.
    Database load especially can skyrocket from bots crawling if the website doesn't have proper caching implemented.
    And robots.txt can help with excluding certain folders and rate-limiting.
    (Assuming bots respect it, which not all do)

    Example:
    Code:
    User-Agent: *               # any robot
    Disallow:   /private        # disallow this directory
    Crawl-delay: 5              # 1 page per 5 seconds
    
    User-Agent: facebookexternalhit/1.1     # rules targeted at FacebookBot
    Disallow:   /nocrawl        # disallow this directory
    
     
  15. Uno

    Uno Member

    I am certain that the bots have a purpose; I just phrased the question badly...
    Rephrased: does facebookexternalhit/1.1 have any use for me, for my site?
    Because if it's useful to Facebook or whoever but brings me no benefit, I only risk that it will do me harm; I don't know if you follow my reasoning.
    What you say about caching and all that is right, and I have those things in order, but I do it for my legitimate users, not for malicious bots that may come looking for trouble.
    Unfortunately robots.txt is respected only by the bots you would never want to turn away anyway, for example Google's.
     
  16. remkoh

    remkoh Active Member

    Depends on your site's content and on whether you want a properly formatted preview on Facebook, among other things, when somebody links to your site.
     
  17. Uno

    Uno Member

    Okay, those would be relatively good reasons; in fact I had read about them, and for the time being I have not blocked the Facebook bot.
    But I'm still not entirely convinced of the usefulness. In the old days someone would link to a piece of content on my site, and people who read the post with that link could come and see it.
    With this system they use my content but hold on to their users tightly, while consuming my resources. For a good percentage of it I gain nothing. Maybe the odd curious person gets interested in that content and then decides to come and see it anyway.
    Honestly, it's the way they do it that bothers me. They put dozens of bots on you for days and days and basically make a copy of the site. Sure, all search engine bots do that too, but from the indexes they build they churn out links that bring visitors to your site; that's different.
     
  18. MaxT

    MaxT Active Member HowtoForge Supporter

    You can create IP lists using ipset and then add your blocklists to iptables, like this:
    Code:
    iptables -I INPUT -m set --match-set blacklist src -j DROP
    It's easy. Read this tutorial: https://linux-audit.com/blocking-ip-addresses-in-linux-with-iptables/
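    For that rule to work the set has to exist first. A minimal sketch, assuming the set is called "blacklist" as in the rule above and the IPs are just examples:
    Code:
    # create a hash-based set for single IPv4 addresses and add some entries
    ipset create blacklist hash:ip
    ipset add blacklist 203.0.113.5
    ipset add blacklist 198.51.100.23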

    The best approach, though, is automating these tasks with scripts that build block tables from your logs from ModSecurity, Fail2ban and so on. For example, you can create ModSecurity rules to catch many crawlers, and then write bash scripts that build your block tables from the IPs, every day, automatically.

    Even without ModSecurity you can parse your Apache logs. For example, for one domain:

    Code:
    # grep python /var/log/ispconfig/httpd/domain.com/20240611-access.log
    35.223.180.74 - - [11/Jun/2024:13:18:08 +0200] "GET /404.html HTTP/1.1" 302 753 "-" "python-requests/2.31.0"
    54.251.188.247 - - [11/Jun/2024:13:18:08 +0200] "GET /404.html HTTP/1.1" 301 533 "-" "python-requests/2.31.0"
    18.140.58.113 - - [11/Jun/2024:13:18:08 +0200] "GET /404.html HTTP/1.1" 302 753 "-" "python-requests/2.31.0"
    we can extract all those IPs, sorted and without duplicates, and append them to a file "mylist.txt":
    Code:
    # grep python /var/log/ispconfig/httpd/domain.com/20240611-access.log | awk '{print $1}' | sort | uniq -c | sort -n | awk '{print $2}' >> mylist.txt
    18.140.58.113
    35.223.180.74
    54.251.188.247
    and then clean the file "mylist.txt" of possible duplicates:
    Code:
    sort -u -t . -k 1,1n -k 2,2n -k 3,3n -k 4,4n mylist.txt -o mylist.txt
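    From there, a small sketch of how the cleaned list could be fed into the ipset set (again assuming a set called "blacklist"):
    Code:
    # add every address in mylist.txt to the set; -exist avoids errors on duplicates
    while read -r ip; do
        ipset add blacklist "$ip" -exist
    done < mylist.txt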
    You can also build more lists from public sources, to block TOR, VPNs, malware and many other threats. Some free resources with lists:
    http://iplists.firehol.org/
    https://github.com/hslatman/awesome-threat-intelligence

    Basically, these are the tasks you can automate with scripts to build your own lists. The lists will grow day by day, and the malicious activity can drop off after a few weeks. This is especially useful against malicious hosts inside big networks like Amazon AWS, GoogleContent, Vultr and similar, because it is not a good idea to block all the networks from AWS and the rest.

    ipset can manage 65536 elements per list by default, and it also supports CIDR ranges with hash sets. There is no problem in having hundreds of thousands of blocked IPs; I have big blocklists working on a humble VPS.
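    For example, a sketch of a set for whole networks with a larger size limit (set name and range are just examples):
    Code:
    # a set that stores CIDR ranges, with room for more than the default 65536 entries
    ipset create blocknets hash:net maxelem 262144
    ipset add blocknets 203.0.113.0/24
    iptables -I INPUT -m set --match-set blocknets src -j DROP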
     
    Uno, ScchutzZ and ahrasis like this.
  19. ScchutzZ

    ScchutzZ New Member

    I support this. IP address blocking from the web GUI would definitely be very good.
     
  20. Uno

    Uno Member

    First of all, thank you for the time and effort you put into writing up possible solutions that work by intervening directly over SSH.
    I had said from the very first posts, though, that it is not difficult to do the job with firewalls and the like. Personally, for bots I found Fail2ban convenient.

    But the question is whether it is useful and/or convenient to have a function in the panel that lets us intervene on the fly without opening a shell.
    If we reason the other way around, even managing the HTTP server, mail, DNS etc. can easily be done directly over SSH, but the hosting panel (for which we thank the developers) makes life more convenient, especially in certain situations... for example, if I only have my phone, even with a terminal app it is not exactly convenient to type commands.
    If I am out without a PC and I have to block one or more IPs, I would like to be able to do it in two clicks.

    At the moment, fortunately, I haven't had any problems like a blocked site and the like, as I have read has happened to others. My hardware and software setup is working well so far, but as they say, better safe than sorry.
     
    ahrasis, MaxT and ScchutzZ like this.
