Again, the 2 GB size limit..

Discussion in 'General' started by fobicodam, Sep 21, 2006.

  1. fobicodam

    fobicodam New Member

    Ok, now I HAVE a problem... the same old f"·$)($" linux problem: how can it be that PHP can't open files bigger than 2 GB?!?!?!?! Can anyone help me with that?? The log files are 4, 5, 9 GB and webalizer can't open them !!!

    Plus: can't you add some kind of error handling for these cases? One site has an error and nobody gets stats !!!
     
  2. till

    till Super Moderator Staff Member ISPConfig Developer

    The solution would be for the PHP developers to drop the 2 GB file limit. I really don't know why every other modern programming language except PHP seems to support large files :(

    The current workaround is to create daily logfiles; there should be some threads about that in the forum, including some patching advice.
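
    The daily-logfile workaround can be wired up with logrotate; a minimal sketch, where the log path, rotation count, and file name are assumptions to adapt to your own web root layout:

```shell
# /etc/logrotate.d/web1  -- sketch only; path and retention are assumptions
/var/www/web1/log/web.log {
    daily          # rotate once a day so no single file grows past 2 GB
    rotate 30      # keep 30 old logs
    compress
    missingok      # don't error if the log is absent
    notifempty     # skip rotation when the log is empty
}
```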
     
  3. fobicodam

    fobicodam New Member

    Yes, I guess I was the first to suggest the daily log file, but I have a BIG problem now: one site's individual daily log is 4.2 GB !!!
     
  4. Ben

    Ben Active Member Moderator

    @Till: What about doing sth. like

    head -2 web.log | tail -2
    head -4 web.log | tail -2
    head -6 web.log | tail -2
    ...

    to read a file blockwise from the shell (here two lines per call)?
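
    The blockwise read above can be wrapped in a small loop; a hedged sketch, assuming GNU coreutils (whose head/tail handle files past 2 GB) and using a tiny stand-in for the real web.log:

```shell
#!/bin/sh
# Demo data standing in for the multi-GB web.log
printf 'line1\nline2\nline3\nline4\nline5\n' > web.log

LOG=web.log
BLOCK=2                       # lines per head|tail call
TOTAL=$(wc -l < "$LOG")
i=0
while [ "$i" -lt "$TOTAL" ]; do
    n=$BLOCK
    # Shrink the final block so head|tail does not repeat earlier lines
    if [ $((TOTAL - i)) -lt "$BLOCK" ]; then n=$((TOTAL - i)); fi
    head -n $((i + n)) "$LOG" | tail -n "$n"
    i=$((i + n))
done
```

    Note that head re-reads the file from the start on every call, so this is quadratic in file size -- workable as a stopgap, but slow on a multi-GB log.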
     
  5. Turophile

    Turophile New Member

    Run an hourly cronjob to check, parse and rotate :)
     
  6. fobicodam

    fobicodam New Member

    I asked for that a long time ago and everybody said it would corrupt the webalizer stats..
     
  7. Turophile

    Turophile New Member

    I've used it in the past with much success though, I'm not sure why it'd corrupt it.
     
  8. fobicodam

    fobicodam New Member

    Do you mean I can run the "log/webalizer" scripts without problems every hour?!?!?!? :eek:
     
  9. Turophile

    Turophile New Member


    Sure, just set up the cron to run:
    webalizer,
    then logrotate (with -f)

    It works fine here. You need to rotate the log out, otherwise it may re-parse all the data.
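
    The steps above could look something like this as a cron entry -- a hedged sketch, not an official ISPConfig recipe; every path here (config files, logrotate rule) is an assumption to adjust for your own setup:

```shell
# /etc/cron.d/webalizer-hourly  -- sketch; all paths are assumptions
# At minute 0 of every hour: run webalizer on the site's log, then
# force-rotate (-f) the log so the next run does not re-parse old data.
0 * * * *  root  /usr/bin/webalizer -c /etc/webalizer/web1.conf && /usr/sbin/logrotate -f /etc/logrotate.d/web1
```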
     
  10. fobicodam

    fobicodam New Member

    Can you write the exact cron config here?

    Thank you.
     
  11. till

    till Super Moderator Staff Member ISPConfig Developer

    I think the current problem is appending data to a logfile that exceeds the 2 GB limit. I guess the best solution would be to rewrite the logs.php script in a language that supports files > 2 GB, e.g. Perl. But I don't know Perl very well, so I'm not able to reimplement it. Any Perl programmers out there? :D
     
  12. fobicodam

    fobicodam New Member

    Ok, but... what do I do with my client?!?!?! He is a semi-important client !! (as you can see, a 4 GB visit log !!) and he NEEDs the stats !!! :confused:

    Till, is there any problem if I put the log_/webalizer scripts inside another script and run it hourly??
     
  13. Ben

    Ben Active Member Moderator

    @Till: but if so, you can append lines to a file with cat or sth. like that as well.

    Because if you redo it in Perl, you would have to duplicate all the config files that are included in that log script for the needed config parameters.
     
  14. till

    till Super Moderator Staff Member ISPConfig Developer

    I guess that's too slow; the current logsplit script is highly optimized, it uses a pool of file handles. If you open / close a file handle every time you write a line, like the script did in the first release, it takes about 50x longer.
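
    The file-handle-pool idea can be illustrated with awk, which keeps every output file it prints to open until the program ends; a sketch under the assumption (not the actual ISPConfig log format) that the first field of each combined-log line is the site name:

```shell
#!/bin/sh
# Demo data standing in for the real combined log; "field 1 = site name"
# is an assumption about the format.
printf 'web1 GET /index.html\nweb2 GET /a.png\nweb1 GET /b.css\n' > combined.log
mkdir -p split
# awk opens each per-site file once and keeps the handle open across
# lines, avoiding the open/close-per-line cost described above.
awk '{ print > ("split/" $1 ".log") }' combined.log
```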

    But maybe a slow implementation is better than a non-working one :)

    Yes, that's right. But I guess the config.inc.php file can be parsed with a regex to extract the parameters, as only the database settings are needed.
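
    Such a regex extraction could be done from the shell with sed; a sketch against a made-up stand-in for config.inc.php -- the variable names below are assumptions for illustration, not the actual ISPConfig keys:

```shell
#!/bin/sh
# Demo config standing in for config.inc.php (key names are assumptions)
cat > config.inc.php <<'EOF'
<?php
$go_info["server"]["db_host"] = "localhost";
$go_info["server"]["db_name"] = "ispconfig";
$go_info["server"]["db_user"] = "root";
?>
EOF
# Match the key, capture the double-quoted value, print only on match.
db_name=$(sed -n 's/.*\["db_name"\][[:space:]]*=[[:space:]]*"\([^"]*\)".*/\1/p' config.inc.php)
echo "$db_name"
```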
     
  15. fobicodam

    fobicodam New Member

    Hey, stop, ok? Thanks.

    The problem here is not the split code; the logs are split correctly into each client folder.

    BUT

    a client's log file is bigger than 4 GB and webalizer HANGS. IF I rename the log file to "web_xx.txt", for example, webalizer runs fine and everything works. But I NEED the stats for that site.
     
