Hello,
I'm sorry if I have asked this before, but I got no response. I'm using firmware 1.06 and I'm unable to download a file from the DNS if the file is >4GB. I use FileZilla and an NTFS file system on the destination computer, and when I reach the 4GB limit the download just breaks. Sometimes the download is endless and it downloads more than the file size; I mean, if the file size is 5GB then it downloads 7GB and keeps downloading forever. I posted this question to the D-Link forum and hope to get an answer there, but I'd like you guys and girls to tell me: does anyone here have the same issue?
Regards,
alpha
Offline
I don't understand.... No replies here, no replies in the D-Link forums. Is it ultra secret, or is there no such issue at all, or maybe everyone uses files smaller than 4GB? Is it so difficult to write something here like "I don't use large files" or "I have no issue with large files"?
Regards,
alpha
Offline
Alpha
You want an answer, any answer - here are two:
- I don't use ftp
- there is no file on my system as big as 4 gig
Happy now?
Biscotte
Offline
Biscotte wrote:
Happy now?
... a little bit.... Actually, you don't need to take it personally. I believe there are lots of people using FTP with large files, and I'm waiting for them to answer.
Offline
Fair comment on the first part.
My experience, mainly from observing, is that this is a very responsive, well-informed site. Answers such as "have you tried rebooting" just don't appear. Just take a look at the contributions of Fonz, Fordem, Mig and ShadowAndy, to name but four.
Hence if anyone has a solution to your problem, or a pointer to one, it is very likely that they will provide it - and quickly.
For info - files bigger than 4 gig were a problem, but I understood it was solved with the 1.05 firmware.
I am just fed up that in recent times a number of posters have appeared on this site who unreasonably bump, as though it is their God-given right to get an answer . . . right now.
Biscotte
Offline
Biscotte wrote:
I am just fed up that in recent times a number of posters have appeared on this site who unreasonably bump, as though it is their God-given right to get an answer . . . right now.
You got me wrong....
And you're right - this site is informative, but it's not the first time I've got no answer to what is maybe a noob question. It seems that some people are too good to answer noob questions. OK, not a problem for me.
Back to my question. It's just too hard for me to believe that nobody uses large files (>4GB). By the way, you mentioned the 1.05 firmware, but I have no issues with large files in general, only with FTP + large files. I got these large files through a torrent client and there was no problem; the problem is only with FTP. And I'm not looking for a solution yet; first I want to know if it's a common issue. If so, then we need to wait for the next firmware. If I'm the only one who has this issue, then I need to search for a solution myself. That's all. No offence.
Regards,
alpha
Offline
Thanks for the props, Biscotte
Like Biscotte, and presumably others, I have not responded because I have nothing of value to contribute - I'm not a frequent user of the FTP server and I have no files larger than perhaps 2GB to run a test with.
Offline
I'm sorry, guys, if my second post was too rough. I didn't mean it that way. I just could never believe that no one uses large files; now it seems to me that this is true. So posts like "I don't use large files" are very informative, because now I know that it's true. I thought that if someone opened the post and saw no answers, he'd write that he doesn't use them, and after 3-4 posts everyone would know that this is the fact; we don't need 1000 posts saying that no one uses large files. A couple of posts is enough. So anyway, thanks.
Regards,
alpha
Offline
alpha wrote:
Hello,
I'm sorry if I have asked this before, but I got no response. I'm using firmware 1.06 and I'm unable to download a file from the DNS if the file is >4GB. I use FileZilla and an NTFS file system on the destination computer, and when I reach the 4GB limit the download just breaks. Sometimes the download is endless and it downloads more than the file size; I mean, if the file size is 5GB then it downloads 7GB and keeps downloading forever. I posted this question to the D-Link forum and hope to get an answer there, but I'd like you guys and girls to tell me: does anyone here have the same issue?
Regards,
alpha
I just did the same thing. No issues. FileZilla client / WinXP / NTFS on the host, stock pure-ftpd on the DNS-323 with firmware 1.06. Just transferred a 9GB file from the NAS to my desktop locally at 10.2MB/s....
Now... my dad tried to download a similar-sized file from a couple thousand miles away, and it timed out on him while he was sleeping; when he went to resume it, FileZilla wouldn't let him. That was quite a while ago. I forget what error he said he got. Maybe something to the effect of "the server does not support resuming files larger than 4GB"....
Without getting off my LAN, I don't feel I've done an adequate test.... but locally, it works over here. Then again, for transferring files locally, using FTP makes no sense.
Offline
Thank you, madpenguin. You are always a nice and straightforward guy, always ready to help. Thank you for spending your time on the tests.
So my problem occurs from an external IP. Now I'm trying a 9GB file on the home network, but using FTP. We'll see if the problem still exists. I'll let you know the results.
Regards,
alpha
Offline
Yes. The same situation here. No problems transferring large files over FTP on the local network. The problem only exists if I connect to FTP from an external IP address. So I think this is not pure-ftpd's or D-Link's problem. I don't know whose problem it is. Maybe the router's? My DNS-323 sits behind a router with port forwarding, and from the external IP FTP doesn't use port 21; I use a port forward ExIP:SomePort -> LocIP:21. So, who has any ideas?
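Could it be a passive-mode thing? With passive FTP the server tells the client which address and data port to connect back to, so behind NAT the server would need to advertise the external IP and use a fixed passive port range that is also forwarded. With pure-ftpd that would look roughly like this (untested here, and I don't know if the 323's stock startup can be changed; the IP and port range are just examples):

  # -P = address to advertise in PASV replies (your external IP)
  # -p = passive data-port range, which must also be forwarded on the router
  pure-ftpd -P 203.0.113.7 -p 50000:50100
  # router: ExIP:50000-50100 -> LocIP:50000-50100, in addition to ExIP:SomePort -> LocIP:21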
Regards,
alpha
Offline
madpenguin wrote:
Then again, for transferring files locally, using FTP makes no sense.
<off-topic>
Huh, why? Isn't FTP the most efficient way of transferring large files? I use FTP on the LAN all the time.
</off-topic>
<unhelpful>
I have no problem with large files (8,9 GB) locally. Never tried to access the NAS from external host.
</unhelpful>
Offline
bogolisk wrote:
<unhelpful>
I have no problem with large files (8,9 GB) locally. Never tried to access the NAS from external host.
</unhelpful>
Try it from external if that's possible for you, and I think you'll get the same results as me and madpenguin.
Regards,
alpha
Offline
Yeah. This is an issue that I never bothered to look into. I only have a standard RoadRunner connection with 60KB/s up. Downloading a large file from an external IP would take forever and a year, so I didn't bother to troubleshoot it.
I'd be willing to bet that if you did the same local test but briefly unplugged your cat5 cable after 4GB, you'd probably get the same error. One thing I've noticed when FTPing across the internet is that your connection constantly stalls and then has to reconnect. This is where we are running into problems, I think.
With a local gigabit connection, you don't get these timeout issues. So.... don't rule out pure-ftpd. The problem, more than likely, lies with the server.
Try enabling FileZilla's logging and find out what it's saying, if anything.
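Another quick check, if you have curl on the client: start the download, kill it somewhere past 4GB, and see what a resume does (user, host, port and path are placeholders):

  # first pass; interrupt it with Ctrl-C once you're past 4GB
  curl -o bigfile ftp://user:password@your.external.ip:2121/Volume_1/bigfile
  # second pass; "-C -" asks the server to resume from the current file size
  curl -C - -o bigfile ftp://user:password@your.external.ip:2121/Volume_1/bigfile

If the server mishandles resume offsets above 4GB, the second command should error out or start over from zero.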
Offline
bogolisk wrote:
madpenguin wrote:
Then again, for transferring files locally, using FTP makes no sense.
Huh, why? Isn't FTP the most efficient way of transferring large files? I use FTP on the LAN all the time.
No more efficient than just copying and pasting in Windows Explorer. You still get the same transfer rate.
Offline
I'll tell you what, alpha... this is what I did to sidestep the issue. It doesn't allow uploading, but I just wanted my family and friends to be able to download stuff, not necessarily upload anything.
I have a directory (not visible) somewhere in my webroot tree that is password-protected with MD5 digest auth (lighttpd.conf). I also use a redirect from http to https with a self-signed certificate made with openssl. So it doesn't matter if they try to connect to that directory with http://; they get re-routed to https:// instead. This happens before they have to enter their username and password.
Once they get in, it's a directory listing and they can browse/download to their heart's content. I think Firefox does a good job of resuming large files in case they stall, because it uses a ".part" file. Any download manager will also help prevent wasted download time in case you get bumped.
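Roughly, the relevant bits of my lighttpd.conf look like this (paths, realm and directory name are examples, adapt to taste):

  server.modules += ( "mod_auth", "mod_redirect", "mod_dirlisting" )

  # anything arriving on plain http gets bounced to https first
  $SERVER["socket"] == ":80" {
    $HTTP["host"] =~ "(.*)" {
      url.redirect = ( "^/(.*)" => "https://%1/$1" )
    }
  }

  $SERVER["socket"] == ":443" {
    ssl.engine  = "enable"
    ssl.pemfile = "/ffp/etc/lighttpd/server.pem"   # self-signed cert + key
  }

  # digest-protect the hidden directory and let lighttpd generate the listing
  auth.backend                   = "htdigest"
  # userfile lines look like user:realm:md5(user:realm:password)
  auth.backend.htdigest.userfile = "/ffp/etc/lighttpd/htdigest.user"
  auth.require = ( "/files/" => ( "method"  => "digest",
                                  "realm"   => "downloads",
                                  "require" => "valid-user" ) )
  $HTTP["url"] =~ "^/files/" {
    dir-listing.activate = "enable"
  }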
Anyway, food for thought. So I just keep FTP disabled on the 323. If I find myself across the country and want to re-enable FTP for whatever reason, I can ssh into the box and start it (I have a dyndns account). If you set up your router for remote access, you can also turn port forwarding on/off from far away, though I don't like the idea of having remote administration enabled on the router.
All you would need to do is install OpenSSH for Windows on someone's computer to get into your box. It's a small and unobtrusive program, so people shouldn't mind if you ask them.
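Something like this, for example (the hostname is a placeholder, and which start script you poke depends on what you've installed under ffp):

  ssh root@yourbox.dyndns.org
  # once you're in, start the service you need; ffp packages use start scripts, e.g.:
  /ffp/start/SOMESERVICE.sh start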
Last edited by madpenguin (2009-03-18 18:58:09)
Offline
Also, as another workaround hack, you could use WinRAR and dice your large file up into 50MB chunks. Then someone could download all the chunks and reassemble them on their computer.....
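On the command line that would be something like this (file names are examples; WinRAR's GUI can do the same thing):

  rar a -v50m archive.rar bigfile.iso    # produces archive.part01.rar, .part02.rar, ...
  unrar x archive.part01.rar             # on the far end: reassembles bigfile.iso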
Not ideal, but until the root of the problem is recognized and fixed, it should work.
My dad is NOT going to want to mess with 200 different 50MB rar files, so I opted for the password-protected directory listing under lighttpd using SSL.... username... password... download...
Whenever possible, I DO NOT use native D-Link applications. Way too buggy. Without fonz's funplug, there would be absolutely nothing special about the 323... too bad, really, that D-Link can't get their sh*t together, but at least we have funplug.
You might try to build a better FTP server for funplug and try your luck at that. I have a sneaking suspicion that you won't have any problems if you use another FTP server....
Last edited by madpenguin (2009-03-18 19:03:16)
Offline
madpenguin wrote:
Also, as another workaround hack, you could use WinRAR and dice your large file up into 50MB chunks. Then someone could download all the chunks and reassemble them on their computer.....
Yes! This is a very good idea!!! I'll try to do this, and it will be just what I want.
madpenguin wrote:
You might try to build a better FTP server for funplug and try your luck at that. I have a sneaking suspicion that you won't have any problems if you use another FTP server....
I agree with this too. I think we'll need to wait for some fixes for this problem.
You mentioned using OpenSSH for Windows. Isn't that the same as PuTTY?
Regards,
alpha
Offline
alpha wrote:
You mentioned using OpenSSH for Windows. Isn't that the same as PuTTY?
For the most part. I prefer BSD's implementation.
http://sshwindows.sourceforge.net/
I have yet to find a good terminal for Windows, but I have been using PowerShell
http://www.microsoft.com/windowsserver2 … fault.mspx
If I need to do extensive ssh work, I just boot into linux and use xterm/gnome-terminal/xfce-terminal.
I don't care much for FTP, otherwise I'd try making a build script for one. I think the best bet is to get chroot working with OpenSSH's sftp.... vanilla FTP is a little dated nowadays and just exposes one more port on my box. Since I have ssh running, it seems best to use sftp.
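Client-side it's no harder than FTP; from any machine with OpenSSH installed it's just (host and path are placeholders):

  sftp alpha@yourbox.dyndns.org
  sftp> get /mnt/HD_a2/stuff/bigfile

and everything rides over the single ssh port.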
Last edited by madpenguin (2009-03-19 15:15:50)
Offline
madpenguin wrote:
I think the best bet is to get chroot working with OpenSSH's sftp....
Have you tried this? I tried just connecting via sftp through SSH and it worked, but every user got access to the root folder. I have no idea how to chroot every user....
Offline
Search here on the forum. It's been discussed many times. There might be something on the wiki too, if I'm not mistaken.
Using sftp/OpenSSH might also negate your original issue with not being able to download large files.
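For reference, the usual recipe on newer OpenSSH versions (4.9 and later) is a Match block in sshd_config; I haven't tried it on the ffp build, and the group name and path here are only examples:

  # in /ffp/etc/ssh/sshd_config
  Subsystem sftp internal-sftp

  Match Group sftponly
      # the chroot dir must be owned by root and not group/world-writable
      ChrootDirectory /mnt/HD_a2/ftproot
      ForceCommand internal-sftp
      AllowTcpForwarding no

Put the restricted users in the sftponly group and they get locked into that directory tree.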
Last edited by madpenguin (2009-03-19 19:03:02)
Offline
I just thought about security.... It's OK if I don't need to use sFTP, but I need to somehow disable user access to the shell. I mean, all FTP users can access my DNS via SSH. Is there any way not to let users into the box via SSH, only FTP? If I use /bin/false as a user's login shell, then that user can't log in to FTP either. I'm searching the forums about this, but no luck....
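The only idea I've turned up so far (not tested, and I don't know whether the 323's stock FTP server even checks /etc/shells; the username is an example) is to make /bin/false a "valid" shell:

  # some ftp daemons reject accounts whose shell isn't listed in /etc/shells
  echo /bin/false >> /etc/shells
  # then give the user that shell (edit /etc/passwd by hand if usermod is missing)
  usermod -s /bin/false ftpuser

Has anyone here tried that?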
Offline
madpenguin wrote:
Using sftp/OpenSSH might also negate your original issue with not being able to download large files.
I know.... but I already have a solution for this from you: splitting files into parts is the best solution IMO, because large files don't come up that often in our lives
Offline
alpha wrote:
Is there any way not to let users into the box via SSH, only FTP? If I use /bin/false as a user's login shell, then that user can't log in to FTP either. I'm searching the forums about this, but no luck....
vi /ffp/etc/ssh/sshd_config

  # --add the below--
  PermitRootLogin no
  AllowUsers alpha
  DenyUsers admin nobody billy joe ralph
Change your port number too. Every little bit helps. Don't forget to do it on your router as well...
Then do a chmod 600 on it. From now on, ssh in as "alpha" and then "su -" to root if that's who you need to be.
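In other words (the hostname is just a placeholder):

  chmod 600 /ffp/etc/ssh/sshd_config
  # from the client, from now on:
  ssh alpha@yourbox.dyndns.org
  su -    # become root once you're in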
Last edited by madpenguin (2009-03-20 02:10:31)
Offline
Thanks again for helping...
madpenguin wrote:
Then do a chmod 600 on it. From now on, ssh in as "alpha" and then "su -" to root if that's who you need to be.
Do you mean to allow only a limited user to log in and then run su to act as root? Do I understand correctly?
One more question. I saw that there's a new version of OpenSSH, so if I need to install it, must I enable telnet, uninstall the old OpenSSH, install the new one and then disable telnet? Is that right? I'm just asking because I don't want any more trouble with the DNS-323.
Regards,
alpha
Offline