DSM-G600, DNS-3xx and NSA-220 Hack Forum

Unfortunately no one can be told what fun_plug is - you have to see it for yourself.


#26 2009-07-28 08:08:24

daemonx
Member
Registered: 2008-11-07
Posts: 17

Re: Rapidshare.com download (premium) with DNS-323

The script works great, and I finally don't have to type the command line in manually. (We can always use the up/down arrow keys to scroll through past commands, but once the NAS is restarted, you have to retype the line again. tongue ) Thanks for working up such a good script!

Glad to hear it works well for you. You don't have to retype anything after a restart if you add a line to the crontab file that invokes the script. I've set mine to check for new jobs every hour automatically.

BTW, entries added directly to the crontab file will not persist across a reboot. You'd have to modify the fun_plug.init file so the entries are re-added automatically at each boot.
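For reference, here is a minimal sketch of what such a boot-time addition could look like; the paths and the rs_auto.sh name are assumptions, so adjust them to your own setup:

Code:

# Sketch only: re-create the cron entry at every boot, e.g. from fun_plug.init or a startup script.
# All paths are assumptions -- point them at wherever your script actually lives.
CRONTXT=/tmp/crontab.tmp
/bin/crontab -l > $CRONTXT 2>/dev/null             # dump the current crontab, if any
echo "0 * * * * /ffp/bin/rs_auto.sh" >> $CRONTXT   # run the download script at the top of every hour
/bin/crontab $CRONTXT                              # install the modified crontab
rm -f $CRONTXT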

Aria2c on Megaupload
Having tried and tested aria2c on Rapidshare with success, I tried it on Megaupload but failed. Neither typing the command in manually nor daemonx's script works; what I got back was an HTML page asking for text-input verification. Has anyone had similar results? If you are able to get it working on Megaupload (with aria2c or another command-line method), I'd very much appreciate you sharing how. Thanks!

I'm assuming you have a premium MU account. Have you given the correct login credentials, either cookie-wise or on the command line? I could be wrong, but I think it's asking you for CAPTCHA text or something.
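For what it's worth, aria2c can also read a Netscape-format cookies file exported from the browser via --load-cookies, which is sometimes easier than juggling credentials on the command line. A rough, untested sketch; the cookie path and the link are placeholders:

Code:

# Sketch only: the cookies.txt path and the MU link are placeholders.
aria2c --load-cookies=/mnt/HD_a2/cookies.txt --dir=/mnt/HD_a2/RAPIDS "http://www.megaupload.com/?d=FILEID"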


#27 2009-07-29 19:19:16

bzhou
Member
Registered: 2008-02-15
Posts: 171

Re: Rapidshare.com download (premium) with DNS-323

See also http://code.google.com/p/plowshare/ and http://code.google.com/p/slimrat/
To deal with CAPTCHAs, you'll need an OCR package such as tesseract-ocr.
All of these are available in Optware.


#28 2009-07-30 06:02:18

arofarmer
New member
From: SG
Registered: 2009-06-26
Posts: 3

Re: Rapidshare.com download (premium) with DNS-323

daemonx: Yes, I have a premium MU account and both the login ID and password are correct... and you are right, it's asking me for CAPTCHA text.

bzhou: Thanks for the tip on dealing with CAPTCHAs. I'll read up on and try out the OCR package you mentioned.

Thanks guys!


#29 2009-08-10 21:50:28

albjes
New member
Registered: 2009-08-10
Posts: 1

Re: Rapidshare.com download (premium) with DNS-323

bzhou wrote:

See also http://code.google.com/p/plowshare/ and http://code.google.com/p/slimrat/
To deal with CAPTCHAs, you'll need an OCR package such as tesseract-ocr.
All of these are available in Optware.

Hello, I'm using Plowshare to download with a Megaupload premium account, but when the file finishes downloading it gives me a failure and the file isn't saved anywhere.

Sorry for my English.


#30 2009-08-22 23:27:24

dbaby7
Member
Registered: 2009-08-17
Posts: 16

Re: Rapidshare.com download (premium) with DNS-323

Thanks so much for your help arofarmer! I have a script to run your download, if anyone is interested. Just change the username and password, and save it as Run Download.vbs on your desktop or wherever you prefer. You will have to close telnet yourself after it's finished. Thanks again aro!!
set WshShell = WScript.CreateObject("WScript.Shell")
WshShell.Run "cmd"                                   ' open a command prompt
WScript.Sleep 100
WshShell.AppActivate "C:\Windows\system32\cmd.exe"   ' bring the prompt to the foreground
WScript.Sleep 100
WshShell.SendKeys "telnet 192.168.0.4~"              ' telnet into the NAS (~ sends Enter)
WScript.Sleep 2000
WshShell.SendKeys "nohup aria2c --dir=/mnt/HD_a2/RAPIDS --http-user=USERNAME --http-passwd=PASSWORD --input-file=/mnt/HD_a2/RAPIDS/rapidlinks.txt &~" ' start the download in the background
WScript.Sleep 15000
WshShell.SendKeys "exit~"                            ' close telnet
WScript.Sleep 200
WshShell.SendKeys "{ENTER}"                          ' command-line prompt
WshShell.SendKeys "exit~"                            ' close the command prompt


#31 2009-08-27 18:28:18

zelduy
Member
Registered: 2009-02-28
Posts: 9

Re: Rapidshare.com download (premium) with DNS-323

Guys... I was just wondering if I could use my DNS-323 to download remotely, like if I'm out of town and would like to start a download on the DNS-323 over SSH. Can I do this?
I'm using a Linksys WRT54GL with DD-WRT, and I've already configured the port forward, but I still can't use SSH from another connection. Is there some other setting?
BTW, here's my port-forward setting:
Application: ssh
Port from: 22
Protocol: both
IP Address: 192.168.0.112
Port to: 22
Enable: checked

Thanks, guys...


#32 2009-08-27 19:48:00

Bobby
Member
Registered: 2009-04-05
Posts: 42

Re: Rapidshare.com download (premium) with DNS-323

If you mean downloading from your NAS remotely when you are somewhere else, then yes: you can either set up lighttpd and fetch the files off your box via HTTP, or simply download them via FTP. Just remember your IP address and you'll be fine.
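As an illustration of the FTP route, pulling a finished file off the NAS from anywhere could look roughly like this; the credentials, IP and path are placeholders:

Code:

# Sketch only: USER, PASS, your.public.ip and the path are placeholders.
curl -u USER:PASS "ftp://your.public.ip/RAPIDS/somefile.rar" -O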

Last edited by Bobby (2009-08-27 19:48:30)


#33 2009-08-29 18:54:29

zelduy
Member
Registered: 2009-02-28
Posts: 9

Re: Rapidshare.com download (premium) with DNS-323

Yeah, that's what I mean...
but can I do it using SSH? Like, if I'm out of town and want to start a download, can I just open my terminal, SSH in, and run aria2c from there? Is that possible?
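Assuming an SSH daemon is running on the NAS and port 22 is reachable from outside (which is what the port forward above is for), a remote session could look roughly like this; the IP and paths are placeholders and this is only a sketch:

Code:

# Sketch only: log in to the NAS over SSH, then start aria2c so it survives the logout.
ssh root@your.public.ip
nohup aria2c --dir=/mnt/HD_a2/RAPIDS --input-file=/mnt/HD_a2/RAPIDS/rapidlinks.txt > /dev/null 2>&1 &
exit    # the download keeps running on the NAS after you disconnect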


#34 2009-09-08 12:01:48

daemonx
Member
Registered: 2008-11-07
Posts: 17

Re: Rapidshare.com download (premium) with DNS-323

Version 0.2 of my automated script is ready for download. I've added a check for any files that were still downloading when the RS daily limit was exceeded; such incomplete files are deleted and queued again for download. Apart from that, I fixed a small bug and added the GPL license info. Try it out and let me know of any bugs you encounter.
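The incomplete-file check could be sketched roughly as follows. This is not the script's actual code; the 10 KB threshold and the paths are assumptions borrowed from the config excerpt quoted later in the thread:

Code:

# Sketch of the idea only, not rs_auto's real implementation.
DL_DIR=/mnt/HD_a2/RAPIDS          # download directory (assumption)
INCOMP_SIZE=10000                 # files smaller than ~10 KB are treated as incomplete stubs

for f in "$DL_DIR"/*; do
    [ -f "$f" ] || continue
    size=`wc -c < "$f"`
    if [ "$size" -lt "$INCOMP_SIZE" ]; then
        echo "requeueing `basename "$f"`"
        rm -f "$f"                # drop the stub; its URL stays in the links file for the next run
    fi
done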

Thanks
DaemonX

P.S.: There was a minor bug in the version I uploaded earlier. The attachment has been updated to the latest version.

Last edited by daemonx (2009-09-08 17:00:31)


Attachments:
rs_auto_v0.3.sh, Size: 4,560 bytes, Downloads: 478


#35 2009-09-21 13:33:43

dbaby7
Member
Registered: 2009-08-17
Posts: 16

Re: Rapidshare.com download (premium) with DNS-323

Is there any way to check the percentage complete of the download? Or pause it?


#36 2009-10-03 23:32:11

BurstDragon
New member
Registered: 2009-10-03
Posts: 3

Re: Rapidshare.com download (premium) with DNS-323

dbaby7 wrote:

Is there any way to check the percentage complete of the download? Or pause it?

That would indeed be interesting, because while the DNS is downloading, you can pretty much throw the rest of your internet connection in the trash bin ^^"

So a pause function would be marvellous, and seeing the actual download speed would be nice as well. wink


BTW: many thanks to you, daemonx, your script is really great!


greez Burst

Last edited by BurstDragon (2009-10-04 00:07:28)


#37 2009-10-04 12:33:25

daemonx
Member
Registered: 2008-11-07
Posts: 17

Re: Rapidshare.com download (premium) with DNS-323

dbaby7 wrote:

Is there any way to check the percentage complete of the download? Or pause it?

BurstDragon wrote:

That would indeed be interesting, because while the DNS is downloading, you can pretty much throw the rest of your internet connection in the trash bin ^^"

So a pause function would be marvellous, and seeing the actual download speed would be nice as well. wink

If your router supports QoS, you could enable and configure it so that downloads via the DNS are automatically given a lower priority while you're browsing.
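As an aside on the original question: the commands earlier in the thread start aria2c under nohup without redirecting its output, so the progress lines (percentage and speed) normally end up in a nohup.out file in the directory the command was launched from. Pausing with signals is untested here and may upset open connections, so treat this only as a sketch:

Code:

# Watch aria2c's progress output written via nohup:
tail -f nohup.out

# Pause / resume the running aria2c process (if killall accepts signal names on your box):
killall -STOP aria2c
killall -CONT aria2c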

BurstDragon wrote:

BTW: many thanks to you, daemonx, your script is really great!


greez Burst

Glad to hear it. I've started work on a newer version which will automatically extract multi-part RAR files. Since I'm busy with work, I have a feeling it will take a while to see the light of day. smile


#38 2009-10-04 22:16:45

BurstDragon
New member
Registered: 2009-10-03
Posts: 3

Re: Rapidshare.com download (premium) with DNS-323

daemonx wrote:

...
Glad to hear it. I've started work on a newer version which will automatically extract multi-part RAR files. Since I'm busy with work, I have a feeling it will take a while to see the light of day. smile

That's nice to hear wink And the speed problem isn't annoying anymore; I just download while I'm asleep big_smile


greez Burst


#39 2009-10-14 08:58:34

timezlicer
Member
Registered: 2008-09-02
Posts: 51
Website

Re: Rapidshare.com download (premium) with DNS-323

Original source:
http://code.google.com/p/plowshare/
adapted for Optware (also works on the DNS-343).

Install the dependencies,
e.g. sed, curl, recode, imagemagick, tesseract-ocr, ...

then extract the attachment and run "setup.sh install".
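Roughly, the steps could look like this; the dependency names are the ones listed above, the archive name is the attachment from this post, and the extracted directory name is an assumption:

Code:

# Install the dependencies from Optware (exact package names may differ in your feed).
ipkg install sed curl recode imagemagick tesseract-ocr

# Extract the attached package and run its installer.
tar xjf plowshare-0.9.1_optware.tar.bz2
cd plowshare-0.9.1        # extracted directory name is an assumption
sh setup.sh install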

Last edited by timezlicer (2010-02-20 02:20:32)


Attachments:
plowshare-0.9.1_optware.tar.bz2, Size: 49,823 bytes, Downloads: 353


#40 2009-10-29 00:30:56

geezans
New member
Registered: 2009-10-29
Posts: 2

Re: Rapidshare.com download (premium) with DNS-323

arofarmer and all,

Does this method require a new URL file to be created each time "nohup aria2c ..." is called,
or can I update the same URL file with the new links, so that I only need to maintain one URL file?
The pattern is that new downloads need to be pushed in each day, and the previous download list may not be finished by the end of the day, or by the next day when the new links need to go in.

I am just curious about that ... Thanks ...


#41 2009-10-29 20:24:28

geezans
New member
Registered: 2009-10-29
Posts: 2

Re: Rapidshare.com download (premium) with DNS-323

geezans wrote:

arofarmer and all,

Does this method require a new URL file to be created each time "nohup aria2c ..." is called,
or can I update the same URL file with the new links, so that I only need to maintain one URL file?
The pattern is that new downloads need to be pushed in each day, and the previous download list may not be finished by the end of the day, or by the next day when the new links need to go in.

I am just curious about that ... Thanks ...

@daemonx ... after trying to understand your latest script ... it basically covers the question above. With it cronned to run every hour, the new batch of links in the file is picked up for download accordingly. This only happens if there is no aria2c process already running, so it won't disturb an existing download.

It is really a great script ... I would say clever ... thanks ....
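The guard geezans describes matches the variables visible in the config excerpt quoted later in the thread; sketched out (this is not daemonx's exact code, and the paths are assumptions):

Code:

# Sketch of the "skip if already running" guard, not the script's exact code.
SERVICE=aria2c
RUN_STAT=`ps -aef | grep -v grep | grep $SERVICE | wc -l`

if [ "$RUN_STAT" -eq 0 ]; then
    # no aria2c running: safe to start on the current links file
    nohup aria2c --dir=/mnt/HD_a2/RAPIDS --input-file=/mnt/HD_a2/RAPIDS/rapidlinks.txt > /dev/null 2>&1 &
else
    echo "aria2c is already running, leaving the current download alone"
fi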


#42 2009-11-30 11:08:42

reyazi
Member
Registered: 2009-11-12
Posts: 5

Re: Rapidshare.com download (premium) with DNS-323

dbaby7 wrote:

Thanks so much for your help arofarmer! I have a script to run your download, if anyone is interested. Just change the username and password, and save it as Run Download.vbs on your desktop or wherever you prefer. You will have to close telnet yourself after it's finished. Thanks again aro!!
set WshShell = WScript.CreateObject("WScript.Shell")
WshShell.Run "cmd"                                   ' open a command prompt
WScript.Sleep 100
WshShell.AppActivate "C:\Windows\system32\cmd.exe"   ' bring the prompt to the foreground
WScript.Sleep 100
WshShell.SendKeys "telnet 192.168.0.4~"              ' telnet into the NAS (~ sends Enter)
WScript.Sleep 2000
WshShell.SendKeys "nohup aria2c --dir=/mnt/HD_a2/RAPIDS --http-user=USERNAME --http-passwd=PASSWORD --input-file=/mnt/HD_a2/RAPIDS/rapidlinks.txt &~" ' start the download in the background
WScript.Sleep 15000
WshShell.SendKeys "exit~"                            ' close telnet
WScript.Sleep 200
WshShell.SendKeys "{ENTER}"                          ' command-line prompt
WshShell.SendKeys "exit~"                            ' close the command prompt

1. Will this work on a Mac if I change the cmd.exe part to, say, Terminal.app?

2. For "rs_auto_0.3.sh", can someone please point out exactly which directories I need to change? This is my edited version:

Code:

LINKS_OLD=/mnt/HD_a2/Volume_1/Rapid/links_old.txt 
LINKS_NEW=/mnt/HD_a2/Volume_1/Rapid/links_new.txt
ARIA_LOG=/mnt/HD_a2/ffp/logs/aria.log
ARIA_BAK=/mnt/HD_a2/ffp/logs/aria_"`date +%H%M_%d%m%y`".log
TIME_FMT=`date +%T`
LOG_FILE=/mnt/HD_a2/ffp/logs/aria_auto_"`date +%d%m%y`".log
HAS_JOBS=0
SERVICE=aria2c
PATH=/opt/bin:/opt/sbin:/ffp/sbin:/usr/sbin:/sbin:/ffp/bin:/usr/bin:/bin
RUN_STAT=`ps -aef | grep -v grep | grep $SERVICE | wc -l`
#Normally File sizes of incomplete downloads are less than 10KB. Change this value if you have any problems.
INCOMP_SIZE=10000

3. Where can I find coreutils to download? EDIT: found it, it's here.
The installation is (wget FILE, then funpkg -i FILE), right? Does it have to be in a specific folder, or can I just install it anywhere, for example in /ffp/?

I really appreciate any help, and I apologize for my linux-noob questions; I'm learning smile.

Thank you,

Last edited by reyazi (2009-11-30 12:06:41)


#43 2011-01-10 23:49:14

Northguy
New member
Registered: 2011-01-10
Posts: 2

Re: Rapidshare.com download (premium) with DNS-323

With the risk of being called a thread-necromancer I thought to share this tidbit with you.

I wanted to download a number of RAR files from filesonic.com (not Rapidshare, but somewhat the same) and found this thread while searching for a good approach. I initially tried WGET and ARIA2C, but ended up using a combination of XARGS and CURL.

1st challenge: authentication, so that I get logged in automatically. For this I used a Firefox plug-in that exported my Firefox cookies to a text file, "cookies.txt". This cookies file can then be used by WGET, ARIA2C and CURL.

WGET: Wget (as described in the thread above) did the job perfectly, except for one minor detail: Filesonic uses a 302 redirect for its files, and WGET ends up changing all the filenames to gibberish that cannot easily be related to the original names. I tried some searching on the net to solve this, but came up with nothing. If someone else comes up with a good idea, please share.

ARIA2C: Aria2c (also described above) also worked perfectly, for a while. It even resolved the original filenames after the 302 redirect, so I got usable filenames. But about once every 4-5 files a download would crash, and I wanted a complete set of files without having to check each one afterwards. Checking the net showed that the latest Aria2c source is newer than the version that comes with the instructions above (onlyhype.com), but since I am not familiar with compiling from source, I stopped there.

CURL and XARGS: And now for my approach. Since CURL is not able to take a list of URLs as input, I constructed the following:

xargs -n 1 -P 6 curl -v -C - --retry 15 -L --cookie cookies.txt -O < filelist.txt &

Here XARGS passes the URLs from filelist.txt to CURL. A short explanation of the options used (for more detail, see the appropriate man pages):

-n 1            : pass each line of filelist.txt to CURL as a separate invocation
-P 6            : run 6 instances of CURL at once for simultaneous downloads
-v              : verbose output from CURL
-C -            : auto-resume; let CURL work out where to pick up (mind the space between -C and -)
--retry 15      : number of retries before a transfer fails
-L              : follow the 302 redirect to the real file location
--cookie <file> : cookie file to use for the transfer
-O              : use the same output filename as in the URL (minus the path)
< filelist.txt  : the file with the URLs (mind the obligatory < for the XARGS command)
&               : run the process independently of PuTTY; you can close PuTTY and the process continues until finished

Good luck,

Northguy smile

