Hello
I just upgraded my DNS-323 to version 1.04 from 1.03.
So far everything looks good except for the Swedish characters in file and directory names coming from the old disk. They seem to be mapped the wrong way by Samba. I guess it is a UTF-8 / ISO 8859 related problem, as I can create new file names with Swedish characters just fine.
There are probably three ways to deal with this.
1) Let the file names be bad.
2) Write a script or program that will translate the characters.
3) Change how Samba handles this.
Is there anyone out there who has had a similar experience and knows how to fix it?
Best regards
Tomas
Offline
UTF-8 is a known feature of the new firmware, and UTF-8 is what you want to use.
If you fun_plug your DNS you can write a script and rename everything on the DNS.
Offline
So, how do you actually fix this? I have the same problem with Hebrew. After upgrading to 1.04 all my Hebrew file names are completely messed up. The names are still valid but they show up as garbage. How can I fix this?
Offline
Found the answer in another thread:
Hah, finally found an answer in the D-Link FAQ; it may be useful for others:
"If you are using firmware 1.04, by default the DNS-323 will support Unicode which is an industry standard allowing computers to consistently represent and manipulate text expressed in most of the world's writing formats. However, if you were using an earlier version of firmware (1.03 or earlier) and have just upgraded to 1.04, Unicode will not be supported until you have reformatted the hard drives in your DNS-323."
Last edited by jotka (2008-02-05 18:54:24)
Offline
I just cannot fathom how a hard disk format can fix what Samba does or does not do. This is ridiculous. It's just a more convenient explanation for D-Link than saying: we haven't made a conversion utility, so you have to start from scratch and all will be fine.
I agree that there must be some way of renaming all the files. So the problem is to map the former 255 possible single-byte characters into the UTF-8 space. But there is one table for each of the coding schemes used. I used ISO 8859 (BTW, I thought this included Swedish too?) and there is another for Hebrew, etc. Anyone up for a quick Ruby script?
Offline
mv "[filename]" "`echo "[filename]" | iconv -f ISO-8859-1 -t UTF-8`"
Provided that you have the iconv command (the quotes are there so names with spaces survive).
If you install a chrooted Debian it should be no issue.
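If you have a whole tree to fix rather than a single file, something along these lines should work as a rough sketch. It assumes find and iconv are available (e.g. in a chrooted Debian), that your share lives under a path like /mnt/HD_a2/myshare (a placeholder, adjust to your setup), and that no names contain newlines - try it on a copy first:

#!/bin/sh
# Walk the tree depth-first so entries are renamed before their parent
# directories, and convert only the basename of each entry.
find /mnt/HD_a2/myshare -depth | while IFS= read -r old; do
    dir=`dirname "$old"`
    base=`basename "$old"`
    new=`printf '%s' "$base" | iconv -f ISO-8859-1 -t UTF-8`
    if [ "$base" != "$new" ]; then
        mv -- "$old" "$dir/$new"
    fi
done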
Offline
Or use this tool under a chrooted Debian:
sudo apt-get install convmv
cd /<path to>
convmv -f ISO-8859-1 -t UTF-8 -r --notest .
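For what it's worth, convmv only prints what it would do unless you pass --notest, so you can preview the renames first (the path is just an example):

cd /mnt/HD_a2/myshare
convmv -f ISO-8859-1 -t UTF-8 -r .            # dry run, shows planned renames
convmv -f ISO-8859-1 -t UTF-8 -r --notest .   # actually rename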
Offline
Well, good news: if you're using RAID 1 (mirroring) then formatting the drives will enable Unicode support without deleting the data (I assume they format each disk separately or something).
However, files that are already 'garbaged' will not automatically be fixed, so the best way would still be to:
1 - backup
2 - upgrade to 1.04
3 - format
4 - restore from backup
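For steps 1 and 4, one rough sketch is to pull everything onto a Linux box over the mounted share with rsync and push it back after the format. The mount points and paths below are only placeholders, and for the restore to produce clean names the client should mount the share with a UTF-8 iocharset:

# 1 - backup: copy the mounted share to a local directory
rsync -av /mnt/dns323/Volume_1/ /backup/dns323/
# ... upgrade to 1.04 and reformat via the web UI ...
# 4 - restore: write everything back over the (now UTF-8) share
rsync -av /backup/dns323/ /mnt/dns323/Volume_1/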
Offline
loopback wrote:
Well, good news: if you're using RAID 1 (mirroring) then formatting the drives will enable Unicode support without deleting the data (I assume they format each disk separately or something).
However, files that are already 'garbaged' will not automatically be fixed, so the best way would still be to:
1 - backup
2 - upgrade to 1.04
3 - format
4 - restore from backup
I just HAVE to ask this - are you speaking from experience? Have you tried this? Were you ACTUALLY able to format RAID1 without losing your data?
Or is this simply an assumption?
Last edited by fordem (2008-02-21 01:32:43)
Offline