Hi,
I'm going to buy a DNS-323. Among other things I'm thinking about using it for some nzb downloads... I read about some people who got nzbget, sabnzbd or hellanzb working (I would prefer sabnzbd), but I'm interested in some reports about the performance (max download rate, number of threads) from people who are using any nzb downloader on the DNS-323...
Which nzb download solution is the quickest?
And what about the post-processing for par2 check & repair, and unrar? Is it bearable on big file groups (10 GB for example)?
Thanks for your help!
Greets
btw, if it's important, the "other things" I want to use the DNS-323 for are:
- UPnP media sharing for the PS3
- FTP server
- SMB for my Macs
- perhaps as recording space for my sat receiver
Last edited by bluebeer (2009-02-08 12:18:39)
Offline
For nzbget on the CH3SNAS I've created a howto:
http://www.aroundmyroom.com/2009/01/27/ … laces-all/
nzbget is the best one to go for; it only needs fun_plug and some simple tools, no heavy packages.
Performance? I download at 2 MB/s without problems. Threads: I use 4 threads, which is enough for my usenet server and my download speed.
Par2 checking is done by nzbget, and unrar is done by a modified script for the NMT media player; both are fully described in the howto.
nzbget runs really stable. Sometimes, on very special files, the server can choke, but a reboot simply solves the issue. I use it daily and I've already downloaded more than 2 TB with nzbget.
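To give an idea of what the post-processing boils down to, here is a minimal sketch (this is not the actual NMT script from the howto; the calling convention with the download directory as $1 and the single-rar-set layout are just assumptions for illustration):

#!/bin/bash
# Minimal post-processing sketch: verify/repair with par2, then unpack.
# Assumptions: the download directory is passed as $1 and it holds one rar
# set whose first volume sorts first alphabetically.
cd "$1" || exit 1

# verify and, if needed, repair with par2cmdline when a recovery set is present
if ls ./*.par2 > /dev/null 2>&1; then
    par2 r ./*.par2 || exit 1
fi

# extract, overwriting existing files; unrar follows the remaining volumes itself
FIRST_RAR=$(ls ./*.rar 2> /dev/null | head -n 1)
if [ -n "${FIRST_RAR}" ]; then
    unrar x -o+ "${FIRST_RAR}"
fi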
Last edited by zeroday (2009-02-09 09:31:42)
Offline
Coool! Sounds very good! Thanks for your reply!
btw, I ordered the DNS-323 yesterday.
Anyone using sabnzbd? I cannot imagine that it runs as well as nzbget - but I like it. I use sabnzbd on my Linux server.
p.s. I have a 32 Mbit/s line
Offline
I've tested both sabnzbd and nzbget.
sabnzbd seems to consume more RAM, and it needs around 20 shadow processes to run. It worked fine for most of my downloads, but I experienced a problem on a big download (several hundred files, more than 10 GB in total).
I haven't experienced any problems since I switched to nzbget.
I can't tell you much about performance since I'm limited to 2 Mb/s. I can download at this rate without problems with both sabnzbd and nzbget.
Par checking and unrar work fine too, even for huge archives (50 GB). Of course, it takes a bit more time to complete than on a top-level PC.
For me, the only drawback of nzbget is that it doesn't have a web interface like sabnzbd, and it needs a bit more effort to configure it to your needs. But once that's done it works perfectly. I think you can even add a web interface as well, but I didn't try this.
@zeroday: do you really mean that you download at 2 MB/s (16 Mb/s) with 4 threads? Or is it 2 Mb/s?
It seems to be a very high speed for this NAS, and with only 4 threads!
Offline
nickotar wrote:
@zeroday: do you really mean that you download at 2 MB/s (16 Mb/s) with 4 threads? Or is it 2 Mb/s? It seems to be a very high speed for this NAS, and with only 4 threads!
My DSL is 20 Mb/s, so the download runs at around 1900 KB/s, i.e. roughly 2 MB/s.
nzbget also has a web interface: nzbgetweb.
Offline
Hey zeroday, thanks for your nice howto...
I tried nzbget, sabnzbd and hellanzb... I have to say nzbget is the most lightweight nzb downloader of the three on the DNS-323; it can handle more threads without killing the CPU.
My test with 5 threads reached a speed of 2700 KB/s (more tests will follow once the post-processing is set up). During the download, top reported about 40-60% CPU usage. I could imagine that increasing the number of threads would push the download speed even higher...
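For reference, this is roughly how the connection count can be set when starting nzbget as a daemon; the -o override and the Server1.Connections option name are assumptions from the nzbget documentation I have, and the config path is only an example, so check nzbget -h and your own nzbget.conf:

# start nzbget as a daemon (-D) with 5 connections ("threads") to the first news server;
# Server1.Connections and the -o override are assumptions, the config path is an example
nzbget -c /mnt/HD_a2/nzbget/nzbget.conf -o Server1.Connections=5 -D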
My test with hellanzb and 3 threads (I think) reached a maximum of 1600 KB/s at 80-90% CPU usage; increasing the number of threads did not increase the speed any further...
If you are thinking about sabnzbd: forget it.
One more thing: with nzbget my hard disk didn't spin down anymore, so I had to put the nzbget "nzb2start" folder AND the "queue" folder on my USB stick...
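One way to do that is with symlinks - just a sketch with example paths; you could also point the folders in your nzbget config at the stick instead. Stop nzbget before moving anything:

# usb stick mount point and nzbget folder locations are examples only
USB=/mnt/usb
NZBGET=/mnt/HD_a2/nzbget

# move the watch folder and the queue folder onto the stick ...
mkdir -p ${USB}/nzbget
mv ${NZBGET}/nzb2start ${USB}/nzbget/
mv ${NZBGET}/queue     ${USB}/nzbget/

# ... and leave symlinks behind so nzbget still finds them at the old paths
ln -s ${USB}/nzbget/nzb2start ${NZBGET}/nzb2start
ln -s ${USB}/nzbget/queue     ${NZBGET}/queue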
Oh, I forgot to mention: I'm not using fun_plug... I'm in a chrooted Debian environment, running lighttpd, vsftpd, mediatomb, sshd and the Samba that comes with the DNS-323.
Last edited by bluebeer (2009-02-14 13:08:46)
Offline
bluebeer, did you try sabnzbd or sabnzbd+ (http://www.sabnzbd.org/)? sabnzbd+ seems to be less demanding on resources than sabnzbd. Also, it supports RSS natively, which I don't think nzbget does.
Offline
I used www.sabnzbd.org.
Offline
I'm now looking for a script or application to fetch my newzbin bookmarks and drop them into the nzbget nzb2start folder...
Any hints?
Offline
bluebeer wrote:
My test with 5 threads reached a speed of 2700 KB/s...
How did you manage to get 2700 KB/s?
I have a 2500 KB/s line but only get ~1200 KB/s...
Last edited by mastervol (2009-02-15 23:32:16)
Offline
Don't know what to say... I just increased the threads a little.
Average download rate: 2787 KB/s
bluebeer wrote:
I'm now looking for a script or application to fetch my newzbin bookmarks and drop them into the nzbget nzb2start folder... any hints?
Isn't there anything? A little Python script? Or Perl?
Last edited by bluebeer (2009-02-16 00:12:34)
Offline
I don't know much about bash scripting, but I created a little shell script...
It still needs some improvements (checking the response codes...) - but it works!
I'm going to keep working on it, but... help is really, really welcome.
#!/bin/bash

export NEWZBIN_USER="XXX"
export NEWZBIN_PASS="XXX"
export NZB_TARGET_PATH="/XXX/nzbget/nzb-start"

BOOKMARK_FILE="/var/tmp/myBookmarks"
EXEC_LOAD_REPORTS="/var/tmp/execLoadReports"

# clean up leftovers from a previous run
rm "${BOOKMARK_FILE}" >> /dev/null 2>&1
rm "${EXEC_LOAD_REPORTS}" >> /dev/null 2>&1
rm "${BOOKMARK_FILE}.tmp" >> /dev/null 2>&1

# fetch the bookmark list from newzbin
wget --post-data "username=${NEWZBIN_USER}&password=${NEWZBIN_PASS}&action=fetch" https://www.newzbin.com/api/bookmarks/ -o /dev/null -O "${BOOKMARK_FILE}.tmp"
if [ $? = 0 ]
then
    if [ $(cat "${BOOKMARK_FILE}.tmp" | wc -l) = "0" ]
    then
        echo "[INFO] No Bookmarks found - exiting"
        exit
    fi

    # only process the last 5 bookmarks (newzbin limits fetches per minute)
    tail -n 5 "${BOOKMARK_FILE}.tmp" > "${BOOKMARK_FILE}"

    # build one wget call per report id that downloads the nzb into the watch folder
    cat $BOOKMARK_FILE | awk -F '\t' '{
        user=ENVIRON["NEWZBIN_USER"];
        pass=ENVIRON["NEWZBIN_PASS"];
        path=ENVIRON["NZB_TARGET_PATH"];
        print "wget --post-data \"username="user"&password="pass"&reportid="$1"\" https://www.newzbin.com/api/dnzb/ -o /dev/null -O \""path"/"$3".nzb\"";
    }' > $EXEC_LOAD_REPORTS
    chmod +x $EXEC_LOAD_REPORTS
    $EXEC_LOAD_REPORTS

    # remove the downloaded reports from the newzbin bookmark list
    delReports=$(cat "$BOOKMARK_FILE" | awk -F '\t' '{print $1","}')
    delReports=$(echo $delReports | sed 's/ //g')
    delReports=${delReports:0:${#delReports} - 1}
    echo "[INFO] Reports ${delReports} loaded - removing from bookmarks"
    wget --post-data "username=${NEWZBIN_USER}&password=${NEWZBIN_PASS}&action=delete&reportids=${delReports}" https://www.newzbin.com/api/bookmarks/ -O /dev/null -o /dev/null
    echo "[INFO] SUCCESS?"
else
    echo "[ERROR] wget failed - user/pass correct?"
fi

rm "${BOOKMARK_FILE}" >> /dev/null 2>&1
rm "${EXEC_LOAD_REPORTS}" >> /dev/null 2>&1
rm "${BOOKMARK_FILE}.tmp" >> /dev/null 2>&1
It will only work for premium members.
UPDATE: improved it a little...
Last edited by bluebeer (2009-02-16 19:10:53)
Offline
Since I'm not a newzbin user... can you elaborate a little on what exactly it does?
Offline
Sure...
In newzbin you can easily bookmark an nzb file report in your account.
This script fetches all of your bookmarked reports. Then five of them (there is a limit at newzbin of around 6 fetches per minute, I think) are downloaded into the target directory (which nzbget is watching). Finally, these five reports are removed from your newzbin bookmark list.
The aim was/is to find or create an application or script which can be run periodically from cron.
If someone is going to run this script from cron, please use an interval of at least 15 minutes (to take care of your NAS and of newzbin); a sample crontab line is below.
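For example, a crontab entry like this runs it every 15 minutes (the script path and log file are just examples; if your crond doesn't understand the */15 step syntax, write the minutes out as 0,15,30,45):

# fetch newzbin bookmarks every 15 minutes; script path is an example only
*/15 * * * * /bin/bash /mnt/HD_a2/scripts/newzbin-bookmarks.sh >> /var/tmp/newzbin-bookmarks.log 2>&1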
For me it means that I just have to bookmark an nzb report on newzbin from my iPhone, and my NAS at home starts downloading, followed by the post-processing.
Help to improve it is still welcome!!!
Last edited by bluebeer (2009-02-18 15:41:08)
Offline
@bluebeer - Nice script! The wget call doesn't seem to work with my version of wget (it reports itself as 1.12), however I tried the command on Cygwin and there it did work (wget 1.11.4). What setup are you running?
On my DNS-323 it complains that --post-data is an invalid parameter.
Offline
Debian etch in a chroot...
Offline
The wget problem was that the Fonz funplug version is sufficiently different from the regular GNU version; installing the GNU version of wget got rid of that problem.
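For anyone hitting the same thing, this is a quick way to check which wget you are actually running and to swap in the GNU build (the funpkg package file name below is only an example, use whatever wget package you have at hand):

# check which wget the shell picks up; the GNU build prints a version banner,
# a stripped-down build will just complain about the option
which wget
wget --version | head -n 1

# with fonz funplug, a GNU wget can be installed via funpkg
# (the package file name is only an example)
funpkg -i wget-1.11.4-1.tgz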
However, I am finding that while the script runs perfectly from the command line, the results are quite different when it is executed from cron (I am quite inexperienced at *nix scripting). The myBookmarks.tmp file is getting created, but execLoadReports is not, so the script runs to completion but nothing is downloaded.
I have DNS-323 firmware 1.06 and fonz funplug 0.5. Any insight to help a scripting newbie?
Once I get this working, the next step is to auto-copy the files over to the Popcorn Hour 110A, which should be arriving this week.
Offline
Bit of a newb here - what do I do with the script once I've added in my login/pass and directory paths? Where do I put it, how do I run it, and if it all works, how do I make it check automatically?
Thanks.
AH
Offline
Anyone have insight into how to get this to work?
AH
Offline
Great job, @niels...
niels wrote:
Hi,
I was firmly locked into getting sabnzbd running on my DNS-323 purely because of the newzbin integration. Your script made nzbget a viable option, although it lacked the category ability that sabnzbd had.
I rewrote parts of your script to add this in. I'm not that great at bash scripting either (I really prefer Perl for this sort of thing), but I'm not sure what others have installed, so I kept it in bash (although it does require the 'tr' program too).
Attached is the script. If you have any suggestions or comments let me know. I'm passing ownership of all these changes back to you so feel free to claim them as your own or change them to suit.
I might re-write this all in perl at a later date (which would make it much easier to perform error checking etc. etc.) If I do, I'll let you know.
Thanks,
Niels
---8<---
#!/bin/bash

export NEWZBIN_USER="XXX"
export NEWZBIN_PASS="XXX"
export NZB_TEMP_PATH="/XXX/newzbin"
NZB_TARGET_PATH="/XXX/nzb"
BASH="/bin/bash"
BOOKMARK_FILE="/var/tmp/myBookmarks"
EXEC_LOAD_REPORTS="/var/tmp/execLoadReports"

# clean up leftovers from a previous run
rm "${BOOKMARK_FILE}" >> /dev/null 2>&1
rm "${EXEC_LOAD_REPORTS}" >> /dev/null 2>&1
rm "${BOOKMARK_FILE}.tmp" >> /dev/null 2>&1

# fetch at most 5 bookmarks from newzbin
wget --post-data "username=${NEWZBIN_USER}&password=${NEWZBIN_PASS}&action=fetch&limit=5" https://www.newzbin.com/api/bookmarks/ -o /dev/null -O "${BOOKMARK_FILE}"
if [ $? = 0 ]
then
    if [ $(cat "${BOOKMARK_FILE}" | wc -l) = "0" ]
    then
        echo "[INFO] No Bookmarks found - exiting";
        exit
    fi

    # build one wget call per bookmarked report; wget's -d debug output is kept
    # in a .dat file so the X-DNZB-* headers can be parsed afterwards
    cat ${BOOKMARK_FILE} | awk -F '\t' '{
        user=ENVIRON["NEWZBIN_USER"];
        pass=ENVIRON["NEWZBIN_PASS"];
        temp=ENVIRON["NZB_TEMP_PATH"];
        print "wget --post-data \"username="user"&password="pass"&reportid="$1"\" https://www.newzbin.com/api/dnzb/ -d -o \""temp"/"$1".dat\" -O \""temp"/"$1".nzb\" > /dev/null 2>&1";
    }' > ${EXEC_LOAD_REPORTS}
    ${BASH} ${EXEC_LOAD_REPORTS}

    # sort each downloaded nzb into a per-category folder, renamed to the report name
    for file in `ls -1 ${NZB_TEMP_PATH}/*.nzb | awk -F. '{print $1};'`; do
        NAME=`grep 'X-DNZB-Name:' $file.dat | awk -F:\ '{print $2};' | tr -d \\\r\\\n`
        CATEGORY=`grep 'X-DNZB-Category:' $file.dat | awk -F:\ '{print $2};' | tr -d \\\r\\\n`
        mkdir -p "${NZB_TARGET_PATH}/${CATEGORY}"
        mv ${file}.nzb "${NZB_TARGET_PATH}/${CATEGORY}/${NAME}.nzb"
        rm ${file}.dat
    done

    # remove the processed reports from the newzbin bookmark list
    delReports=$(cat "${BOOKMARK_FILE}" | awk -F '\t' '{print $1","}')
    delReports=$(echo $delReports | sed 's/ //g')
    delReports=${delReports:0:${#delReports} - 1}
    echo "[INFO] Reports ${delReports} loaded - removing from bookmarks"
    wget --post-data "username=${NEWZBIN_USER}&password=${NEWZBIN_PASS}&action=delete&reportids=${delReports}" https://www.newzbin.com/api/bookmarks/ -O /dev/null -o /dev/null
    echo "[INFO] SUCCESS?"
else
    echo "[ERROR] wget failed - user/pass correct?"
fi

rm "${BOOKMARK_FILE}" >> /dev/null 2>&1
rm "${EXEC_LOAD_REPORTS}" >> /dev/null 2>&1
---8<---
Offline
Once I create the file above, with my newzbin info, what do I do next? Do I set up a cron job that calls this script as often as I want it to check for new bookmarks? I'm pretty clueless here.
Thanks.
A
Offline
When I try to run it, I get the error:
wget: unrecognized option `--post-data'
How do I fix this?
Thanks.
A
Offline