Transferring a large number of images (30 GB) to a virtual server from a slow, old Linux machine?

Ryan Gedwill

I need to transfer 200,000 images to a Windows EC2 instance from a very old Linux machine. I don't have much Linux experience, so I haven't tried FTP directly, because I don't think we have the right software for it; if there is a quick and simple way to set that up, I'm open to it, but this machine is very slow and frustrating to work with, especially for a Linux newcomer. I tried copying the images to a USB drive so I could upload them from another computer, but it hangs and stops at around 1 GB. What would you suggest as the most efficient way to accomplish this?


3 answers

2
Overmind

No matter what method of copying or transfer you use, you should pack the images into an archive first. That ensures two things: the single new file you want to transfer is good to go (as opposed to many files, some of which could be unreadable), and the transfer itself will go a lot more smoothly. Small files are a pain for any kind of copying or transferring. Once you have one big archive file, you can use any valid method, including the scp command mentioned in another answer.
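
A minimal sketch of that archiving step, assuming the images live in a hypothetical /path/to/images directory (adjust the paths to your layout):

    # Pack everything into one archive: -c create, -z gzip-compress, -f output file
    tar -czf /tmp/images.tar.gz /path/to/images

    # Optionally record a checksum so you can verify the copy on the other side
    md5sum /tmp/images.tar.gz

For JPEGs the gzip step gains little, since the images are already compressed; the main win is turning 200,000 small files into a single stream, so plain "tar -cf" without "-z" is a reasonable choice too.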

1
davidgo

There is no single right answer here. For the most part, file transfers do not use many resources other than bandwidth, so the speed of the machine is probably not too big a deal.

The answer depends on which variant of Linux you have and how you connect between the client and the server. The most obvious solution, if you have ssh on it, is to use rsync. Rsync comes with most distros, or is trivially added with something like "apt-get install rsync" or "yum install rsync". The nice part about using rsync is that if the transfer fails partway through, you just run the command again and it will pick up where it left off.
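
A sketch of what that invocation might look like, assuming a hypothetical user@ec2-host destination that accepts ssh connections (for a Windows EC2 instance that means an ssh server must be installed there, or you go through an intermediate Linux host) and the hypothetical source directory /path/to/images:

    # -a preserve attributes, -v verbose, --partial keep partially transferred
    # files so an interrupted run can resume where it left off
    rsync -av --partial /path/to/images/ user@ec2-host:/data/images/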

If rsync is not an option, the next logical solution would be scp: "scp -r serverip:/path /destpath". This will work as long as the server has ssh.
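
If you run the copy from the Linux machine instead, the direction simply flips. A sketch with the same hypothetical hostname and paths:

    # Recursively push the directory (or the single archive) to the remote host
    scp -r /path/to/images user@ec2-host:/data/images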

If that does not work, try using wget or ncftp to download over FTP. FTP is not a great protocol, though.
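
For completeness, a sketch of a recursive FTP download with wget, assuming a hypothetical FTP account on the old machine (host, credentials, and path are placeholders):

    # -r recursive download, -c resume partially downloaded files after an interruption
    wget -r -c ftp://user:password@old-linux-host/path/to/images/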

0
Journeyman Geek

Rereading the problem, the solution turns out to be different from what I'd suggested initially. We need something cross-platform that handles file transfers reliably and is dead simple.

BitTorrent Sync should work. The Linux client is a simple, single binary with its own web UI. Set up a share there, then use the Windows client on the other side to download the directory.

Since the underlying protocol is BitTorrent, it should handle any interruptions and check for errors fairly gracefully, and there's pretty much no configuration needed beyond creating the shares.
