FileZilla / Maximum single file size on bigwork?
Posted on: 23.12.2011 [22:40]
hhoffmann (Holger Hoffmann), thread starter
Member since: 21.12.2011 · Posts: 26

Hi, several attempts to upload a 46 GB file with FileZilla to an empty folder in /bigwork/usr/... were interrupted at about 25 GB (a previous upload of 23.877 GB was successful). No real error message appears. At 25 GB of completed upload a message pops up: "The file already exists..." with the options to abort or to overwrite. In the latter case the upload starts again from zero, showing the same message at 25 GB, and so on. Is there an apparent reason for that? Does anyone know a solution? Thx, Holger
Posted on: 09.01.2012 [10:09]
cochrane (Paul Cochrane)
Member since: 14.09.2010 · Posts: 139

Hi Holger, sorry for the late answer! Are you copying your files from Windows or from Linux? If you are copying from Linux, you can use `rsync` with the `--partial` option so that interrupted transfers can be restarted from where they left off. Also, why are you copying into /bigwork/usr...? You don't have permission to write there! Or do you mean /bigwork/<username>? As far as I know there shouldn't be any limit on copying files other than a *possible* time limit. Are you copying the files from within the university or from home? Maybe a screenshot of the FileZilla window after the problem has occurred would help. Cheers, Paul
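
A minimal sketch of the resumable `rsync` transfer Paul suggests, assuming a Linux client; the host name and file name are placeholders, not details from this thread:

```bash
# Upload a large file over SSH; --partial keeps a partially transferred
# file on the destination so a rerun can continue instead of starting over.
# Host name, user and file name below are placeholder assumptions.
rsync -av --partial --progress bigfile.tar \
    <username>@login.example.org:/bigwork/<username>/

# If the transfer is interrupted, re-run the same command; rsync reuses
# the partial file on the destination and avoids resending data it already has.
```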
Posted on: 10.01.2012 [10:06]
hhoffmann (Holger Hoffmann), thread starter
Member since: 21.12.2011 · Posts: 26

Hello Paul, thx and no worries. OS: Windows 7, 64 bit, using FileZilla 3.5.2:

FileZilla Client
Version: 3.5.2
Build information:
  Compiled for: i586-pc-mingw32msvc
  Compiled on: x86_64-unknown-linux-gnu
  Build date: 2011-11-08
  Compiled with: i586-mingw32msvc-gcc (GCC) 4.2.1-sjlj (mingw32-2)
  Compiler flags: -g -O2 -Wall -g -fexceptions
Linked against:
  wxWidgets: 2.8.12
  GnuTLS: 2.10.4
Operating system:
  Name: Windows NT 6.1 (build 7601, Service Pack 1)
  Version: 6.1
  Platform: 64 bit system

"Also, why are you copying into /bigwork/usr...? You don't have permission to write there! Or do you mean /bigwork/<username>?"
==> Sorry, I meant /bigwork/<username>.

"As far as I know there shouldn't be any limit on copying files other than a *possible* time limit."
==> Configuring FileZilla against a possible time-out as described in http://forum.hostek.com/showthread.php?230-How-to-disable-FTP-timeout-in-Filezilla still led to the upload aborting after ~30 GB. For the moment I have therefore switched to ~15 GB tar files, which do fine.

"Are you copying the files from within the university or from home?"
==> Institute.

"Maybe a screenshot of the FileZilla window after the problem has occurred could help."
==> Is attached.

Kind regards, Holger
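
A minimal sketch of splitting data into roughly 15 GB pieces for upload, in the spirit of the tar-file workaround Holger describes; the directory and output names are assumptions for illustration only:

```bash
# Pack the data and cut the tar stream into ~15 GB pieces
# (directory name and output prefix are placeholders).
tar -cf - my_results/ | split -b 15G - my_results.tar.part.

# After uploading the pieces, reassemble and unpack them on the other side:
cat my_results.tar.part.* | tar -xf -
```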
Posted on: 10.01.2012 [13:48]
cochrane (Paul Cochrane)
Member since: 14.09.2010 · Posts: 139

Hi Holger, thanks for the detailed info! That's very helpful!

hhoffmann wrote:
"As far as I know there shouldn't be any limit on copying files other than a *possible* time limit."
==> Configuring FileZilla against a possible time-out as described in http://forum.hostek.com/showthread.php?230-How-to-disable-FTP-timeout-in-Filezilla still led to the upload aborting after ~30 GB. For the moment I have therefore switched to ~15 GB tar files, which do fine.

How long does the upload take? Maybe there is a time limit on our side which then becomes apparent. Cheers, Paul
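
A rough back-of-envelope check of the time-limit hypothesis, assuming a hypothetical sustained rate of 10 MB/s (the actual transfer rate is not given in this thread):

```latex
t_{46\,\mathrm{GB}} \approx \frac{46 \times 1024\ \mathrm{MB}}{10\ \mathrm{MB/s}}
                    \approx 4700\ \mathrm{s} \approx 78\ \mathrm{min}
```

At that assumed rate, an abort around 25 to 30 GB corresponds to roughly 45 to 50 minutes of transfer, so a per-connection limit on the order of an hour would be consistent with the observations; the ~1 h pause Holger reports below points in the same direction.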
Posted on: 12.01.2012 [10:27]
hhoffmann (Holger Hoffmann), thread starter
Member since: 21.12.2011 · Posts: 26

Hi Paul, sorry for the late answer: it actually works somehow. After configuring FileZilla against a possible time-out as described in http://forum.hostek.com/showthread.php?230-How-to-disable-FTP-timeout-in-Filezilla, a second attempt was successful. I noticed that the upload paused after ~1 h and then resumed later on. Afterwards /bigwork/<username> became very slow, up to the point of not responding. Is this related to http://www.rrzn.uni-hannover.de/forum.html?&tx_mmforum_pi1%5Baction%5D=list_post&tx_mmforum_pi1%5Btid%5D=134 ? I hope it's not due to that file. Thx once more, Holger
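
A minimal sketch of confirming that a large upload like this arrived intact by comparing checksums on both ends; the file name is a placeholder and this step is an assumption, not something done in the thread:

```bash
# On the sending machine (placeholder file name):
sha256sum bigfile.tar

# On the receiving machine, e.g. in /bigwork/<username>, after the upload:
sha256sum bigfile.tar

# The two hashes must match; a mismatch means the upload is incomplete or corrupted.
```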


