Large Files Problem

Discussion in 'Technical' started by chrismfz, Jun 22, 2006.

  1. chrismfz

    chrismfz New Member

    Hello people. This is my first post, so be nice to me :)

    "Renting" a box somewhere in the US (I am from Greece)
    I tried to download there first a large file (3.3GB) and from there
    to my workstation in Greece (thats because I wanted this file quick
    in a drive that I know that I own... and of course because the box have 40-50mb speed and in Greece I have the ultimate 384kb speed....(...))

    Good, the download completed in about 30 minutes... But when I tried to
    download from my server, I realised after some reading that I cannot
    download files bigger than 2GB from either Apache or the PureFTPd that I am running.

    Rather than asking for help making that possible, I prefer to ask
    how I can split this file into pieces of, say, 500MB
    and then download them one by one...

    I tried a script named split-tar, but the file it created was the same size as the original one (...).

    Any suggestions ?

    PS: Forgive me for my awful, stinky English... :eek:


    Thanks in advance,
    Christos Panagiotakis
     
  2. dishawjp

    dishawjp New Member

    Christos,

    There is a pretty standard Linux utility called split. See "man split" for usage; what you are looking for is the -b flag, which sets the piece size to something like 500MB.

    split -b 500000000 <filename>
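
    (If your split supports size suffixes, as GNU split does, you can also write the size as "500m" and give the pieces a name prefix so they're easy to spot. "bigfile" below is just a placeholder for your 3.3GB file:)

    split -b 500m bigfile bigfile.part.

    That produces bigfile.part.aa, bigfile.part.ab, and so on.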

    Use the "cat" command to reassemble your file once you've downloaded it to your local machine.

    BTW, your English is great; nothing to apologize for :)

    HTH,

    Jim
     
  3. chrismfz

    chrismfz New Member

    Thanks a lot!
     
