Hi all, I have an Ubuntu 20.04 server with 1 GB of RAM on an AWS t2.micro. Everything was running smoothly except that a few functions were not working in one of my PHP apps under 7.4, and the developer told me to downgrade to 7.2. I am still running PHP 7.4 as the default; I added the Ondrej Sury repository, installed 7.2 specifically for the one app that needed it, added it under the additional PHP versions in ISPConfig, and connected it to that site only. After I did this the site worked great! All functions worked.

Then the site became unreachable and I couldn't even SSH into the server to check for issues. After restarting the server several times I was able to get in for a few minutes before it became unresponsive again. I did manage to pull logs and watch top for a bit. Here's where it froze:

Code:
MiB Mem : 978.6 total, 48.0 free, 854.6 used, 76.1 buff/cache
MiB Swap: 0.0 total, 0.0 free, 0.0 used. 13.0 avail Mem

 PID USER     PR NI VIRT    RES   SHR  S %CPU %MEM TIME+    COMMAND
  82 root     20  0 0       0     0    R 90.0  0.0 10:54.74 kswapd0
1450 root     20  0 723816  11016 0    S  1.8  1.1  0:11.66 ssm-agent-worke
 603 root     20  0 714184  8944  0    S  1.7  0.9  0:19.98 amazon-ssm-agen
 288 root      0 -20 0      0     0    D  1.6  0.0  0:17.16 loop1
 658 root     20  0 642812  11484 0    S  1.5  1.1  0:16.25 f2b/server
  94 root      0 -20 0      0     0    I  0.7  0.0  0:05.44 kworker/0:1H-kblockd
 846 mysql    20  0 1274800 97404 0    S  0.7  9.7  0:16.51 mysqld
 491 root     20  0 721928  12432 0    S  0.5  1.2  0:07.37 snapd
2520 gdm      20  0 3251752 77656 2500 D  0.4  7.7  0:06.79 gnome-shell
 560 memcache 20  0 406260  1112  424  R  0.3  0.1  0:04.68 memcached
3995 root     20  0 46712   1032  760  D  0.3  0.1  0:00.83 php7.4
3962 root     20  0 16424   816   260  D  0.3  0.1  0:00.82 systemd-tmpfile
 596 ntp      20  0 74660   2256  1660 D  0.3  0.2  0:04.01 ntpd
3030 ubuntu   20  0 11244   3188  2208 R  0.3  0.3  0:03.83 top
 164 root     19 -1 53924   4360  2864 D  0.3  0.4  0:04.07 systemd-journal
 470 message+ 20  0 8612    3476  1884 D  0.2  0.3  0:03.66 dbus-daemon
3883 web3     20  0 67512   4160  1908 D  0.2  0.4  0:01.04 php
3973 web3     20  0 52460   3148  1560 D  0.2  0.3  0:00.61 php
4014 root     20  0 8616    2692  2440 S  0.2  0.3  0:00.59 server.sh
3869 web3     20  0 67512   4272  2016 D  0.2  0.4  0:01.78 php
3930 root     20  0 65704   4144  1924 D  0.2  0.4  0:00.58 php
3965 web21    20  0 52328   3240  1680 D  0.2  0.3  0:00.58 php7.4
 423 systemd+ 20  0 24488   6804  2292 R  0.2  0.7  0:00.76 systemd-resolve
4012 root     20  0 8616    1892  1664 D  0.2  0.2  0:00.56 cron.sh
4028 root     20  0 8616    2760  2524 S  0.2  0.3  0:00.56 cron.sh
2702 gdm      20  0 312564  2588  1680 D  0.2  0.3  0:02.21 gsd-housekeepin
3880 web21    20  0 67512   4152  1900 D  0.2  0.4  0:01.19 php7.4
3588 www-data 20  0 71852   14088 2216 R  0.2  1.4  0:01.02 /usr/sbin/apach
3872 web21    20  0 67512   4172  1916 D  0.2  0.4  0:02.12 php7.4
3897 web2     20  0 50604   3980  2100 R  0.2  0.4  0:00.58 php-cgi
3919 root     20  0 65536   3972  1764 D  0.2  0.4  0:00.55 php
3975 web3     20  0 53280   3832  2088 D  0.2  0.4  0:00.55 php
3835 www-data 20  0 71732   12656 908  D  0.2  1.3  0:00.68 /usr/sbin/apach
3946 root     20  0 63744   4116  1980 D  0.2  0.4  0:00.55 php
3970 web2     20  0 48888   3296  1868 D  0.2  0.3  0:00.56 php-cgi
3974 web3     20  0 52328   3704  2132 D  0.2  0.4  0:00.53 php
4026 root     20  0 8616    1692  1456 D  0.2  0.2  0:00.53 server.sh
3820 web21    20  0 113976  9188  2960 D  0.2  0.9  0:02.51 php7.4
3963 web21    20  0 52328   3500  1936 D  0.2  0.3  0:00.52 php7.4
3971 web3     20  0 52460   3476  1884 D  0.2  0.3  0:00.54 php
3972 web3     20  0 52460   3764  2176 D  0.2  0.4  0:00.53 php
3944 web2     20  0 49020   2768  1276 R  0.2  0.3  0:00.54 php-cgi
3947 root     20  0 63832   3652  1512 D  0.2  0.4  0:00.52 php
3922 root     20  0 65704   3940  1720 D  0.2  0.4  0:00.51 php
3830 root     20  0 113976  9372  3192 D  0.2  0.9  0:02.45 php
3920 root     20  0 65704   4060  1840 D  0.2  0.4  0:00.49 php
3945 root     20  0 63832   3976  1832 D  0.2  0.4  0:00.50 php
3964 web21    20  0 52328   3608  2048 R  0.2  0.4  0:00.49 php7.4
3929 root     20  0 65704   4428  2212 D  0.2  0.4  0:00.49 php
3931 root     20  0 65704   4124  1912 D  0.2  0.4  0:00.48 php
3948 root     20  0 63744   4256  2124 D  0.2  0.4  0:00.48 php
3949 root     20  0 63744   4356  2220 D  0.2  0.4  0:00.47 php
3950 root     20  0 63744   4232  2100 D  0.2  0.4  0:00.48 php
3826 web3     20  0 113976  8844  2644 D  0.2  0.9  0:02.30 php
3967 web21    20  0 52328   3004  1472 D  0.2  0.3  0:00.47 php7.4

There are a TON of PHP processes running, whereas before I installed 7.2 there were maybe 3 or 4 MAX. I don't have any traffic to the site at all, just me testing things out. Does 7.2 use significantly more memory? Or do I just not have a large enough server to run two versions of PHP?
I also checked my logs when everything became unreachable and found:

Code:
[  580.209812] Out of memory: Killed process 498 (clamd) total-vm:266560kB, anon-rss:187848kB, file-rss:2212kB, shmem-rss:0kB, UID:118 pgtables:468kB oom_score_adj:0
[ 1057.969058] Out of memory: Killed process 1666 (/usr/sbin/amavi) total-vm:163444kB, anon-rss:131872kB, file-rss:1404kB, shmem-rss:0kB, UID:121 pgtables:352kB oom_score_adj:0
[ 1463.496312] Out of memory: Killed process 723 (mysqld) total-vm:1254928kB, anon-rss:92532kB, file-rss:0kB, shmem-rss:0kB, UID:114 pgtables:440kB oom_score_adj:0
[ 1521.999886] cloud-init[2493]: Cloud-init v. 20.3-2-g371b392c-0ubuntu1~20.04.1 running 'modules:final' at Fri, 27 Nov 2020 23:23:32 +0000. Up 1072.82 seconds.
[ 1526.303412] cloud-init[2493]: Cloud-init v. 20.3-2-g371b392c-0ubuntu1~20.04.1 finished at Fri, 27 Nov 2020 23:30:49 +0000. Datasource DataSourceEc2Local. Up 1509.55 seconds
[ 1858.289353] Out of memory: Killed process 1817 (php7.4) total-vm:127704kB, anon-rss:19272kB, file-rss:3388kB, shmem-rss:0kB, UID:5008 pgtables:236kB oom_score_adj:0
[ 2212.194998] Out of memory: Killed process 1830 (php7.4) total-vm:127704kB, anon-rss:19272kB, file-rss:3460kB, shmem-rss:0kB, UID:5008 pgtables:228kB oom_score_adj:0
[ 2351.084799] Out of memory: Killed process 1805 (php7.4) total-vm:127704kB, anon-rss:19272kB, file-rss:3408kB, shmem-rss:0kB, UID:5008 pgtables:232kB oom_score_adj:0
[ 2459.267802] Out of memory: Killed process 1779 (php7.4) total-vm:127704kB, anon-rss:19268kB, file-rss:3204kB, shmem-rss:0kB, UID:5008 pgtables:228kB oom_score_adj:0
[ 2560.921026] Out of memory: Killed process 2324 (php7.4) total-vm:127704kB, anon-rss:19252kB, file-rss:3084kB, shmem-rss:0kB, UID:5008 pgtables:224kB oom_score_adj:0
[ 2642.919824] Out of memory: Killed process 1733 (php7.4) total-vm:127704kB, anon-rss:19268kB, file-rss:3284kB, shmem-rss:0kB, UID:5008 pgtables:232kB oom_score_adj:0
[ 2677.856796] Out of memory: Killed process 1999 (php7.4) total-vm:127704kB, anon-rss:19272kB, file-rss:3360kB, shmem-rss:0kB, UID:5008 pgtables:232kB oom_score_adj:0
[ 2786.610020] Out of memory: Killed process 1902 (php7.4) total-vm:127704kB, anon-rss:19268kB, file-rss:3252kB, shmem-rss:0kB, UID:5008 pgtables:232kB oom_score_adj:0
[ 2837.026541] Out of memory: Killed process 2173 (php7.4) total-vm:127704kB, anon-rss:19244kB, file-rss:2320kB, shmem-rss:0kB, UID:5008 pgtables:216kB oom_score_adj:0
[ 2904.920246] Out of memory: Killed process 2286 (php7.4) total-vm:127704kB, anon-rss:19160kB, file-rss:3384kB, shmem-rss:0kB, UID:5008 pgtables:228kB oom_score_adj:0
[ 2983.023429] Out of memory: Killed process 1585 (php7.4) total-vm:127704kB, anon-rss:19268kB, file-rss:3468kB, shmem-rss:0kB, UID:5008 pgtables:224kB oom_score_adj:0
[ 3075.822043] Out of memory: Killed process 2374 (php7.4) total-vm:127704kB, anon-rss:18988kB, file-rss:3532kB, shmem-rss:0kB, UID:5008 pgtables:240kB oom_score_adj:0
[ 3192.221347] Out of memory: Killed process 2323 (php7.4) total-vm:127704kB, anon-rss:19240kB, file-rss:2588kB, shmem-rss:0kB, UID:5008 pgtables:232kB oom_score_adj:0
[ 3295.535995] Out of memory: Killed process 967 (/usr/sbin/apach) total-vm:294544kB, anon-rss:14464kB, file-rss:2120kB, shmem-rss:4096kB, UID:0 pgtables:252kB oom_score_adj:0
[ 3314.854741] Out of memory: Killed process 2240 (php7.4) total-vm:125656kB, anon-rss:18124kB, file-rss:2088kB, shmem-rss:0kB, UID:5008 pgtables:216kB oom_score_adj:0
[ 3314.907198] Out of memory: Killed process 2513 (php7.4) total-vm:123608kB, anon-rss:15660kB, file-rss:3400kB, shmem-rss:0kB, UID:5008 pgtables:220kB oom_score_adj:0
[ 3472.642084] Out of memory: Killed process 2514 (php7.4) total-vm:123608kB, anon-rss:15172kB, file-rss:3620kB, shmem-rss:0kB, UID:5008 pgtables:220kB oom_score_adj:0
[ 3585.065403] Out of memory: Killed process 2348 (php7.4) total-vm:123608kB, anon-rss:15888kB, file-rss:3340kB, shmem-rss:0kB, UID:5008 pgtables:220kB oom_score_adj:0
[ 3706.557289] Out of memory: Killed process 2385 (php7.4) total-vm:123608kB, anon-rss:15632kB, file-rss:3300kB, shmem-rss:0kB, UID:5008 pgtables:216kB oom_score_adj:0
[ 3894.674646] Out of memory: Killed process 995 (postgrey --pidf) total-vm:31840kB, anon-rss:16932kB, file-rss:1972kB, shmem-rss:0kB, UID:119 pgtables:104kB oom_score_adj:0
[ 4220.160660] Out of memory: Killed process 1071 (/usr/sbin/apach) total-vm:295352kB, anon-rss:15056kB, file-rss:1432kB, shmem-rss:156kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4269.107788] Out of memory: Killed process 1709 (/usr/sbin/apach) total-vm:294976kB, anon-rss:14680kB, file-rss:1692kB, shmem-rss:80kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4341.707603] Out of memory: Killed process 1059 (/usr/sbin/apach) total-vm:295352kB, anon-rss:15024kB, file-rss:1660kB, shmem-rss:140kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4362.752383] Out of memory: Killed process 2760 (/usr/sbin/apach) total-vm:294836kB, anon-rss:14544kB, file-rss:1628kB, shmem-rss:84kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4394.314671] Out of memory: Killed process 1204 (/usr/sbin/apach) total-vm:294896kB, anon-rss:14600kB, file-rss:1564kB, shmem-rss:172kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4452.129435] Out of memory: Killed process 1058 (/usr/sbin/apach) total-vm:295200kB, anon-rss:14972kB, file-rss:1664kB, shmem-rss:160kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4516.223182] Out of memory: Killed process 3217 (/usr/sbin/apach) total-vm:294836kB, anon-rss:14536kB, file-rss:1712kB, shmem-rss:84kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4569.659987] Out of memory: Killed process 1707 (/usr/sbin/apach) total-vm:294976kB, anon-rss:14692kB, file-rss:1740kB, shmem-rss:80kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4660.335738] Out of memory: Killed process 1205 (/usr/sbin/apach) total-vm:294904kB, anon-rss:14620kB, file-rss:1544kB, shmem-rss:140kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4737.761226] Out of memory: Killed process 2790 (/usr/sbin/apach) total-vm:294852kB, anon-rss:14548kB, file-rss:1980kB, shmem-rss:84kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4765.634224] Out of memory: Killed process 1198 (/usr/sbin/apach) total-vm:294896kB, anon-rss:14608kB, file-rss:1536kB, shmem-rss:120kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4799.009551] Out of memory: Killed process 3214 (/usr/sbin/apach) total-vm:294836kB, anon-rss:14536kB, file-rss:1620kB, shmem-rss:84kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4817.845585] Out of memory: Killed process 1064 (/usr/sbin/apach) total-vm:294896kB, anon-rss:14604kB, file-rss:1708kB, shmem-rss:184kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4830.901384] Out of memory: Killed process 3225 (/usr/sbin/apach) total-vm:294836kB, anon-rss:14536kB, file-rss:1012kB, shmem-rss:68kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4849.403453] Out of memory: Killed process 3263 (/usr/sbin/apach) total-vm:294816kB, anon-rss:14504kB, file-rss:1416kB, shmem-rss:68kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4859.658050] Out of memory: Killed process 1708 (/usr/sbin/apach) total-vm:294852kB, anon-rss:14560kB, file-rss:1428kB, shmem-rss:68kB, UID:33 pgtables:240kB oom_score_adj:0
[ 4941.955792] Out of memory: Killed process 565 (named) total-vm:218288kB, anon-rss:14024kB, file-rss:940kB, shmem-rss:0kB, UID:123 pgtables:132kB oom_score_adj:0
[ 5012.052106] Out of memory: Killed process 2335 (hwe-support-sta) total-vm:33068kB, anon-rss:9640kB, file-rss:3304kB, shmem-rss:0kB, UID:0 pgtables:96kB oom_score_adj:0
[ 5105.686476] Out of memory: Killed process 591 (php-fpm7.4) total-vm:280372kB, anon-rss:7112kB, file-rss:3392kB, shmem-rss:2688kB, UID:0 pgtables:220kB oom_score_adj:0
[ 5119.154302] Out of memory: Killed process 1740 (php) total-vm:117252kB, anon-rss:8984kB, file-rss:3412kB, shmem-rss:0kB, UID:0 pgtables:208kB oom_score_adj:0
[ 5341.941704] Out of memory: Killed process 622 (f2b/server) total-vm:642816kB, anon-rss:11556kB, file-rss:0kB, shmem-rss:0kB, UID:0 pgtables:160kB oom_score_adj:0
[ 5459.731633] Out of memory: Killed process 2600 (php7.4) total-vm:115220kB, anon-rss:8240kB, file-rss:3084kB, shmem-rss:0kB, UID:5008 pgtables:196kB oom_score_adj:0
[ 5551.199978] Out of memory: Killed process 2539 (php) total-vm:115140kB, anon-rss:7772kB, file-rss:3248kB, shmem-rss:0kB, UID:0 pgtables:204kB oom_score_adj:0
[ 5654.845838] Out of memory: Killed process 2637 (php7.4) total-vm:115000kB, anon-rss:8016kB, file-rss:3208kB, shmem-rss:0kB, UID:5008 pgtables:208kB oom_score_adj:0
[ 5757.332788] Out of memory: Killed process 479 (networkd-dispat) total-vm:29540kB, anon-rss:7792kB, file-rss:3316kB, shmem-rss:0kB, UID:0 pgtables:88kB oom_score_adj:0
[ 5908.941366] Out of memory: Killed process 2464 (php7.4) total-vm:115216kB, anon-rss:8228kB, file-rss:3192kB, shmem-rss:0kB, UID:5008 pgtables:200kB oom_score_adj:0
[ 6025.915680] Out of memory: Killed process 2609 (php) total-vm:115204kB, anon-rss:7804kB, file-rss:3456kB, shmem-rss:0kB, UID:0 pgtables:204kB oom_score_adj:0
[ 6113.520648] Out of memory: Killed process 655 (unattended-upgr) total-vm:108084kB, anon-rss:7648kB, file-rss:3364kB, shmem-rss:0kB, UID:0 pgtables:100kB oom_score_adj:0
[ 6198.804346] Out of memory: Killed process 1018 (ssm-agent-worke) total-vm:723816kB, anon-rss:10788kB, file-rss:0kB, shmem-rss:0kB, UID:0 pgtables:192kB oom_score_adj:0
[    0.000000] Linux version 5.4.0-1029-aws (buildd@lcy01-amd64-022) (gcc version 9.3.0 (Ubuntu 9.3.0-17ubuntu1~20.04)) #30-Ubuntu SMP Tue Oct 20 10:06:38 UTC 2020 (Ubuntu 5.4.0-1029.30-aws 5.4.65)
Ubuntu 20.04 comes with PHP 7.4 as the default PHP, so downgrading it may be the cause of your problem(s). I'd install multiple PHP versions from the Ondrej Sury repository and use PHP 7.2 only for the apps that need it, while keeping PHP 7.4 as the default for the server.
Yes, that's what I did: I'm running PHP 7.4 as the default, then added the Ondrej Sury repository and installed 7.2 specifically for the one app that needed it.
So I finally managed to SSH into the server after numerous tries and changed the .htaccess file back to 7.4. The server seems to be handling it fine again. Does this mean my server cannot handle running two PHP versions on 1 GB of RAM?
It depends on how much memory your PHP processes use, how many there are, and (especially) what else is running on the machine. If you have mail service, ClamAV can use 1 GB of memory by itself, so 1 GB for the whole server would be too little.
Looks like your host has no swap, and the other log shows "Out of memory", so the memory was indeed exhausted. You can either add swap (3 GB, for example, should prevent the out-of-memory situation) or add more memory (RAM). Even with more RAM I would still use swap, perhaps twice the size of memory.
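For reference, creating a swap file on Ubuntu is only a few commands. A sketch, assuming a 2 GB file at /swapfile (the size and path are choices, not requirements); the demo below writes the swap signature to a small throwaway file so the unprivileged steps can be exercised, with the root-only activation steps shown as comments:

```shell
# Sketch: create and enable a swap file (2 GB and /swapfile are example choices).
SWAPFILE=./swapfile.demo        # on the real server: SWAPFILE=/swapfile
dd if=/dev/zero of="$SWAPFILE" bs=1M count=2 status=none   # on the server: count=2048
chmod 600 "$SWAPFILE"           # swap files must not be world-readable
mkswap "$SWAPFILE"              # write the swap signature to the file
# Root-only steps on the real server:
# swapon "$SWAPFILE"                                # activate immediately
# echo "$SWAPFILE none swap sw 0 0" >> /etc/fstab   # activate on every boot
```

After `swapon`, the new space shows up in `free -m` under the "Swap:" row.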
Hello, if kswapd0 is draining a lot of CPU, it means you should enable swap (a swap partition or a swap file) as soon as possible. ISPConfig works with 1 GB of RAM but it's pretty much unusable; I suggest a minimum of 2 GB.
Yeah, I figured RAM was my issue. I was hoping to keep it on the free tier, considering this is only a testing environment before I go live with the PHP applications I'm working on, but I'll try adding the swap and see if it's stable enough for testing. I have ISPConfig, my one PHP application, and whatever else comes pre-installed on a LAMP server. I don't have a mail server installed, as I didn't need it for this application. ClamAV is also on the server, but I have it scheduled to run around 4am my time so that it never interferes with what I'm working on. Luckily there isn't much data on the hard drive, so that process runs relatively smoothly. Thanks for the responses, everyone.

Also, I vaguely remember something about being able to limit the number of PHP processes. Is this something that is recommended? I noticed 7.2 opens up a TON more of them than 7.4 does.

Another thing: once I set up the swap, I think the default swappiness is 60. Should I use that value, or increase/lower it? I was also considering adjusting the cache pressure setting. When I check

Code:
cat /proc/sys/vm/vfs_cache_pressure

I get 100 back. I was thinking 50 might be better so it doesn't drop the cache so quickly.
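On the swappiness and cache-pressure questions: both are plain sysctls you can inspect and change at runtime. A sketch (the values 10 and 50 in the comments are common starting points to experiment with, not a prescription):

```shell
# Sketch: inspect the current VM tunables (read-only, works unprivileged).
cat /proc/sys/vm/swappiness          # Ubuntu default: 60
cat /proc/sys/vm/vfs_cache_pressure  # default: 100

# Changing them needs root. At runtime:
#   sysctl vm.swappiness=10
#   sysctl vm.vfs_cache_pressure=50
# To persist across reboots, put the same `key = value` lines in
# /etc/sysctl.d/99-local.conf and run `sysctl --system`.
```

Lower swappiness makes the kernel prefer dropping cache over swapping process pages, which is usually what you want on a small VPS where swap I/O is slow.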
As suggested by others above, swap is the least you can do for now. IMO swap should be good enough as a temporary solution, especially if the VPS uses an SSD such as NVMe. Just to note, swap size should only be around 8 GB max even if your RAM is more than 4 GB, as more than that is never necessary; doubling based on RAM size is merely a rule of thumb.
I'm assuming you've got all services installed on this VPS? i.e. it also has Postfix, Amavis, ClamAV etc. You'll need more physical RAM; I'd suggest you upgrade to a VPS with at least 2 GB. Even then, if you have a reasonably busy site and a fair amount of email, you might find that ClamAV scans will still occasionally hit memory limits and cause processes to be killed. Depending on your threshold for downtime, I would suggest either going to 4 GB of RAM, or using monit to keep an eye on your essential services and automatically restart them if they get killed.

Depending on how many sites you're installing on the VPS, using php-fpm as the PHP handler and configuring it for on-demand use will reduce your overall resource usage by Apache/PHP.

On a physical server, especially if you're lucky enough to have a separate physical disk to allocate purely for swap, swap would be a great idea. But generally I would suggest not bothering with a swap partition on a VPS: it's a disk partition within a file which is itself on a disk partition. If you hit swap on a VPS, it's usually game over already. The server starts needing more memory faster than it can write the currently used memory to swap, so more swap is needed; it's a vicious circle, everything gets full, and the server falls over anyway. But depending on how much you're testing, you may just get away with it.

If you really want to keep the VPS on the free tier, I would also switch to t3 or t4 instances if they're available in your availability zone. They're a bit faster/more efficient, which could reduce the resource load enough to keep memory usage just below the critical level where it starts killing services.
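To sketch what "on-demand" means in practice: the behavior comes from the pm directives in the php-fpm pool file (on Ubuntu 20.04 these live under a path like /etc/php/7.4/fpm/pool.d/; the pool name `web3` and all the numbers below are assumptions to tune for a 1 GB box, not recommendations). With `pm = ondemand`, workers are only forked when requests arrive and idle ones are killed off, so an idle site holds almost no PHP memory:

```ini
; Sketch: php-fpm pool tuned for low memory (example values, adjust to taste).
[web3]
pm = ondemand
pm.max_children = 5           ; hard cap on simultaneous PHP workers for this pool
pm.process_idle_timeout = 10s ; kill a worker after 10 seconds idle
pm.max_requests = 200         ; recycle each worker periodically to contain leaks
```

`pm.max_children` is also the answer to limiting the number of PHP processes: it bounds how much memory one site's PHP can ever claim (max_children times the per-worker RSS).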
You could even try out the t4g instances, which use Amazon's Graviton2 ARM-based CPUs (per Amazon's marketing babble). I've tested Ubuntu 20.04 with ISPConfig 3.2 on one and it works well; I haven't had time to run a side-by-side comparison with the Intel/AMD instances to see whether there are any performance or resource-usage differences though.
That type of behavior applies when swap is being actively swapped to, but having some swap space allocated helps even without active swapping: processes allocate more memory than they actually use (sometimes much more), and with swap available, some or most of that unused memory can be mapped there, allowing more overall allocation to happen before hitting OOM. Even 2 GB would likely help significantly.
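Once swap is added, it's worth confirming it's active and distinguishing "allocated but idle" from "actively thrashing"; a quick sketch using standard tools:

```shell
# Sketch: check swap status after enabling it (all read-only commands).
free -m          # the "Swap:" row shows total/used/free in MiB
swapon --show    # lists active swap files/partitions (empty output = no swap)
# vmstat 1 5     # optional: watch the si/so columns; sustained nonzero values
#                # mean the box is actively thrashing, not just parking idle pages
```

Nonzero "used" swap with near-zero si/so is the healthy case described above: idle allocations parked in swap, freeing RAM for the cache and active processes.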
Thanks for all the replies, everyone. The swap has made a huge difference. Of course, when ClamAV starts running, the server comes to pretty much a complete halt until it's done. Do you think it would matter to run the scan once a week instead of daily? If this were more than a test environment, would it be optimal to run it daily? Also, how do I set up php-fpm for on-demand use? I searched for a while and couldn't find much on this topic. The t4g instance is available in my zone; I'll test it out and report back.
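On the weekly-versus-daily question, the change is just the cron schedule, and running the scan at low CPU and I/O priority softens the "complete halt" too. A sketch (the scan path, log path, and Sunday-4am schedule are assumptions to adjust for your setup):

```shell
# Sketch: switch the ClamAV scan from daily to weekly, at low priority.
# Append to root's crontab (crontab -e), replacing the daily entry:
#
#   0 4 * * 0  nice -n 19 ionice -c 3 clamscan -r -i /var/www --log=/var/log/clamav/weekly-scan.log
#
# nice -n 19 lowers CPU priority; ionice -c 3 ("idle" class) only grants disk
# I/O when nothing else wants it, so the scan yields to your PHP apps.
```

For a test box with little changing data, weekly is usually plenty; daily scans make more sense once real user uploads or mail start landing on the server.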