My server got a light reddit hug .. how do I prevent it from going down in the future?

Discussion in 'General' started by Kira, Feb 24, 2016.

  1. Kira

    Kira Member

    Hi,

    my website runs on an Apache2 server on Debian that I set up following the ISPConfig guides.
    Usually there are about 5,000-10,000 visitors per day on my site, however .. a few days ago, after the site was mentioned on reddit, traffic suddenly spiked and the site became unresponsive..

    The site itself is rather simple, but has a lot of small images (approx. 150), so each visitor generates 150+ hits on my server. Since that was already causing issues with the normal visitor numbers, I turned off KeepAlive, so every image is fetched over its own brand-new connection. This worked like a charm until that reddit storm.

    During the time my site was barely reachable from outside, I fiddled around a bit with the Apache settings and increased the prefork MaxClients from 150 to 5000 ... that was the only quick-and-dirty solution I could come up with. And it helped .. the site became much more responsive again.


    However, last night - about 27h after I increased that MaxClients limit - my web server became completely unresponsive (during the reddit storm it was just slow .. after a few retries it usually worked somewhat, maybe with a few pictures missing etc.). I SSH'ed into my server and tried to restart Apache as a quick fix, but got this error message:

    Code:
    [....] Restarting web server: apache2
    WARNING: MaxClients of 5000 exceeds ServerLimit value of 256 servers, lowering MaxClients to 256. To increase, please see the ServerLimit directive.
     ... waiting Segmentation fault
    .sleep: error while loading shared libraries: libc.so.6: failed to map segment from shared object: Cannot allocate memory
    ./etc/init.d/apache2: 166: /etc/init.d/apache2: Cannot fork

    So apparently my server ran out of memory..? I guess..?
    I tried to start it once more and this time it worked:

    Code:
    [....] Restarting web server: apache2
    WARNING: MaxClients of 5000 exceeds ServerLimit value of 256 servers, lowering MaxClients to 256. To increase, please see the ServerLimit directive.

    There's a good chance the massive-amount-of-people thing will happen again (it's a game-related website .. and with each patch of that game, the site gets that boost of visitors). What's the best/proper way to prevent the site from going down?


    I read something about changing Apache from mpm_prefork to mpm_worker .. however, that doesn't seem to be that easy? At least my Apache doesn't have any of these modules.. also, I am using mod_wsgi and am not too sure if that is compatible with mpm_worker? And I guess my solution of setting MaxClients to 5000 (and there seems to be a cap at 256 anyway) isn't the very best idea either?
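
    From that warning message I gather that MaxClients can't go above ServerLimit, so if raising limits is the way, I assume it would have to look something like this (just a sketch .. the real numbers would have to fit my RAM, roughly MaxClients ≈ free RAM / memory per Apache process):

    Code:
    <IfModule mpm_prefork_module>
        ServerLimit           512
        MaxClients            512   # must not exceed ServerLimit; named MaxRequestWorkers in Apache 2.4
        StartServers           10
        MinSpareServers        10
        MaxSpareServers        30
        MaxRequestsPerChild  2000   # recycle children so per-process memory can't grow forever
    </IfModule>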

    When checking the version of my Apache, it says that it's compiled with mpm_prefork.
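
    (For reference, I checked with apache2ctl -V, which prints the compiled-in MPM among the build settings - the output includes a line like this:)

    Code:
    $ apache2ctl -V | grep -i mpm
    Server MPM:     prefork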


    Any ideas? Hints? Tips?

    Thank you!
    Kira
     
  2. ztk.me

    ztk.me Well-Known Member HowtoForge Supporter

    Yeah, very likely you ran out of memory :) (a quick way to check is sketched below)
    Handling such a load on one machine .... requires a real workhorse, depending on your configuration / site.
    Basically, use CDNs to deliver your static documents, or at least use some lightweight webserver fully optimized for static content on a different IP or machine.
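
    To see how many prefork children actually fit in your RAM, something like this one-liner gives the average resident memory per Apache child in MB (a sketch; run it while the server is busy, and adjust the process name if yours differs):

    Code:
    ps -C apache2 -o rss= | awk '{sum+=$1; n++} END {print sum/n/1024}'

    Divide your free RAM by that number and you have a sane upper bound for MaxClients.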

    If you are using PHP and Apache mpm_prefork, it loads everything (PHP, other modules) just for sending a static picture ...
    You could switch to mpm_event, which is much better at handling more requests, though it may need some reconfiguration and won't play with some extensions like mod_php (=> use PHP-FPM instead), mod_ruby, mod_python, mod_suphp ... well, depending on your needs there are howtos to allow those scripts to run anyway.
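
    On Debian with Apache 2.4 the switch itself is only a few commands plus a handler line in the vhost. A rough sketch, assuming PHP 5 and an FPM socket at /var/run/php5-fpm.sock (package names and paths vary by release):

    Code:
    apt-get install php5-fpm         # PHP moves into its own FPM process pool
    a2dismod php5 mpm_prefork        # drop the embedded interpreter and prefork
    a2enmod mpm_event proxy_fcgi setenvif
    service apache2 restart

    # then, in the vhost, hand .php files to FPM (needs Apache 2.4.10+):
    #   <FilesMatch "\.php$">
    #       SetHandler "proxy:unix:/var/run/php5-fpm.sock|fcgi://localhost"
    #   </FilesMatch>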

    Use proper caching; you probably don't need to generate every request dynamically. And not only opcode caching - which, by the way, is ineffective with mpm_prefork because there is no shared memory.
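
    For a site with ~150 small images, the biggest win is probably telling browsers to cache them, so repeat visitors don't re-request them at all. A minimal sketch with mod_expires (enable it first with a2enmod expires):

    Code:
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/png  "access plus 30 days"
        ExpiresByType image/jpeg "access plus 30 days"
        ExpiresByType image/gif  "access plus 30 days"
    </IfModule>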

    Hope I pointed you in the right direction; there's plenty one could/should do to boost a site's performance. Oh, btw: do you use sessions for new/anonymous users? Are sessions stored on (slow) disks? Also check your database/filesystem queries for each request...
     
  3. Kira

    Kira Member

    Thanks a lot for that info :)

    All the in-depth Apache business is still new to me .. before that crash I didn't even know about prefork, event, worker, etc., and to be honest, I'm still not sure I fully understand it all. But I guess there's lots of stuff to read about on the web :)


    Splitting the data up across two servers sounds like a good idea, however .. I would need a second server for that .. not too sure if I want to do that already.


    If Apache loads all the modules and everything even for a simple static picture request though, that doesn't sound too good ... is it possible to run nginx and Apache in parallel on the same machine?



    My site doesn't have any user-related things, so there are no sessions. But there are some database queries, so I'll make sure those are optimized.
     
  4. ztk.me

    ztk.me Well-Known Member HowtoForge Supporter

    There are some howtos out there on how to set up nginx in front of Apache to catch static file requests and pass .php requests to an upstream handler like Apache. If you need IP logging or something like that, you might need some additional work to rewrite those headers, so your REMOTE_ADDR is not 127.0.0.1, for example.
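
    The core of such a setup is short. A minimal sketch, assuming Apache has been moved to port 8080 and the docroot is /var/www/example (both placeholders):

    Code:
    server {
        listen 80;
        root /var/www/example;                    # assumed docroot

        # serve images and other static assets straight from disk
        location ~* \.(png|jpe?g|gif|ico|css|js)$ {
            expires 30d;
        }

        # everything else goes to Apache on the loopback interface
        location / {
            proxy_pass http://127.0.0.1:8080;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;                      # pass the real client IP
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
    }

    On the Apache side, mod_remoteip (or the older mod_rpaf) can then restore REMOTE_ADDR from the X-Real-IP header.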

    If you have a second IP option, you can use that one on your server - or just use some CDN, which might even cost you a few cents but could be cheaper than another server, and it's definitely better for spreading content at large scale.

    Apache prefork is pretty slow; its processes don't share memory, so every instance has its own opcode cache. And, as said, each one carries every module on its own. Reminds me: check your keep-alive settings; lower the timeout or disable it completely, so processes don't sit waiting for new requests from the same client while blocking other requests they could handle meanwhile.
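
    If you do want keep-alive (it helps with a many-small-images page), a short timeout is the usual compromise. A sketch:

    Code:
    KeepAlive On
    KeepAliveTimeout 2         # seconds a child waits for the next request before moving on
    MaxKeepAliveRequests 200   # enough for a page pulling ~150 images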
     
  5. Kira

    Kira Member

    Do you by any chance have any recommendations for a good CDN service? :)

    KeepAlive is turned off already.
     
  6. ztk.me

    ztk.me Well-Known Member HowtoForge Supporter

    There are different kinds of CDN systems out there. Some, like Cloudflare, will require you to change your DNS settings to point to them to do their "magic".

    Probably the easiest one I recently tried is https://cloudinary.com
    Since they offer easy uploading / remote loading options and gallery-like browsing, it's perfect for my use case, and I'm within the limits of their free account =)

    Of course there is Amazon's AWS/S3 CDN and several others. It depends on what you need, or what you want to change on your site to make it work for you.

    PS: Many JS library files are already available for free on public CDNs, e.g. the Google developer network or https://cdnjs.com/
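
    Using one is just a matter of swapping the script URL, e.g. (hypothetical library/version - use whatever your site actually ships):

    Code:
    <!-- load a common library from a public CDN instead of your own server -->
    <script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.0/jquery.min.js"></script>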
     
