DSM-G600, DNS-3xx and NSA-220 Hack Forum

Unfortunately no one can be told what fun_plug is - you have to see it for yourself.

#1 2009-01-27 08:52:48

lambretta
Member
Registered: 2009-01-07
Posts: 27

rTorrent and RSS feeds - a solution and it's light!

I am using rTorrent on my DNS-323.  It is a very good lightweight torrent client.

On my old setup I used an RSS plugin which would automatically download a torrent when it was released on a tracker; this was particularly useful for my weekly fix of Top Gear.

The DNS-323 is not a real powerhouse, so I needed an RSS reader that was lightweight and, preferably, did not require extra dependencies.  Enter Bashpodder.

I have been using Bashpodder for some time now to download MP3 enclosures from RSS feeds at the BBC and also The Linux Link Tech Show; it's great, and just so simple and fast.  Check out the home page at http://www.lincgeek.org/bashpodder/

In short, it is a bash script with a single conf file in which you put your RSS feed URLs; there is also an XSL file you will need.  I think this could be used to read an RSS feed (set off via a cron job each day) and then download the torrent file to a specified folder.  This folder can be one that rTorrent keeps an eye on, so the torrent will start downloading automatically.
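
The conf file itself is nothing fancier than one feed URL per line, so it would just look something like this (made-up placeholder URLs):

Code:

http://example.com/podcasts/some_show/rss.xml
http://example.org/feeds/another_show.rss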

I have performed some preliminary tests on my home laptop running Arch; over the next few days I will trial Bashpodder and let you all know how it goes.

#2 2009-02-20 10:16:05

JediSthlm
Member
Registered: 2009-01-17
Posts: 19

Re: rTorrent and RSS feeds - a solution and it's light!

Sounds great. How are the tests going?

#3 2009-03-09 05:06:28

lambretta
Member
Registered: 2009-01-07
Posts: 27

Re: rTorrent and RSS feeds - a solution and it's light!

Hello all.

Apologies for the announcement and then nothing - life gets in the way with twins.

Anyhow, I have messed about with a couple of Bashpodder scripts, but there have been some issues.

At first I used an old BP script, which downloads the .torrent file in the following format:

torrent_name.torrent

This is what we need.  The issue, however, is that this script doesn't seem to be working properly: it only downloads some of the old .torrent files and not the new ones.

I then tried the current BP script.

This was successful in getting all the torrents (i.e. not missing any); however, the file it downloads is not in the required format.  It labels the .torrent file with the torrent number as found on mininova, i.e. the file will just be called 123456789.  My rTorrent client will only look for files ending in .torrent in the watched directory.

So, the issue is that I do not understand bash scripting well enough to modify it, and I am lost as to which bit of the new script I need to change so that the .torrent file it downloads is in the required format.

Do any of you smart folk understand bash scripting (and in particular the sed calls) well enough to get the new script working the way required?

Here is a link to the Bashpodder forum with the files required (there are three, including the actual script):

http://lincgeek.org/lincware/viewtopic. … af4fda3d0e
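
The only idea I have had (completely untested, and I am not sure the syntax is even right) is to replace the single wget line in the new script with something that tacks .torrent onto the name it builds, along these lines:

Code:

# Untested idea: build the filename the same way the script already does, but make
# sure it ends in .torrent before handing it to wget.  $url and $datadir are the
# variables already used in the download loop.
name=$(echo "$url" | awk -F'/' '{print $NF}' | awk -F'=' '{print $NF}' | awk -F'?' '{print $1}')
case "$name" in
    *.torrent) ;;                       # already has the extension
    *)         name="$name.torrent" ;;  # otherwise add it
esac
wget -t 10 -U BashPodder -c -q -O "$datadir/$name" "$url"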

Here is the old BP script I used, which outputs the correct .torrent name format:

Code:

#!/bin/bash
# By Linc 10/1/2004
# Find the latest script at http://linc.homeunix.org:8080/scripts/bashpodder
# If you use this and have made improvements or have comments
# drop me an email at linc dot fessenden at gmail dot com
# I'd appreciate it!
#
# This revision by Brian Hefferan 2004/02/06, adding configuration options.
# No warranty. It seems to work for me, I hope it works for you.
# Questions /corrections on the additions by Brian Hefferan can be sent to
# brian at heftone  dot  com

#default values can be set here. Command-line flags override theses.
verbose=
wget_quiet='-q'  #default is -q
wget_continue=
catchup_all=
first_only=
unix2dos=
usetorrents=
sync_disks=
fetchlist='bp.conf'

function usage
{
  echo "
Usage: $0 [OPTIONS]
Options are:
-v, --verbose          display verbose messages. Also enables wget's continue
                      option.
--catchup_all          write all urls to the log file without downloading the
                      actual podcasts. This is useful if you want to subscribe
                      to some podcasts but don't want to download all the back
                      issues. You can edit the podcast.log file afterwards to
                      delete any url you still wish to download next time
                      bashpodder is run.
--first_only           grab only the first new enclosed file found in each feed.
                      The --catchup_all flag won't work with this option. If
                      you want to download the first file and also permanently
                      ignore the other files, run bashpodder with this option,
                      and then run it again with --catchup_all.
-bt --bittorrent       launch bittorrent for any .torrent files downloaded.
                      Bittorrent must be installed for this to work. The
                      the script and bittorrent process will continue running
                      in the foreground indefinitely. You can use ctr-c to
                      kill it when you want to stop participating in the
                      torrent.
--sync_disks           run the \"sync\" command twice when finished. This helps
                      makes sure all data is written to disk. Recommended if
                      data is being written directly to a portable player or
                      other removable media.
-u, --url_list         ignore bp.conf, instead use url(s) provided on the
                      command line. The urls should point to rss feeds.
                      If used, this needs to be the last option on the
                      command line. This can be used to quickly download just
                      a favorite podcast, or to take a few new podcasts for a
                      trial spin.
-h, --help             display this help message

"
}

if [ -n "$verbose" ]; then wget_quiet='';wget_continue='-c';fi
if test -f urls.temp;then rm urls.temp;fi

# Make script crontab friendly:
cd $(dirname $0)

while [ "$1" != "" ];do
   case $1 in
             -v|--verbose ) verbose=1
                            wget_continue='-c'
                            wget_quiet=''
                         ;;
            -u|--url_list ) shift
                            while [ "$1" != "" ];do
                               echo "$1" >> urls.temp
                               shift
                            done
                            if test ! -f urls.temp
                               then
                                   echo "Error: -u or --url_list option specified, but no urls given on command line. quitting."
                                   exit 1;
                            fi
                            fetchlist='urls.temp'
                         ;;
            --catchup_all ) catchup_all=1
                         ;;
             --first_only ) first_only=1
                         ;;
             --bittorrent ) usetorrents=1
                         ;;
             --sync_disks ) sync_disks=1
                         ;;
                -h|--help ) usage
                            exit
                         ;;
   esac
   shift
done

# datadir is the directory you want podcasts saved to:
datadir=$(date +%Y-%m-%d)

# Check for and create datadir if necessary:
if test ! -d $datadir
      then
      mkdir $datadir
fi

if test ! -f bp.conf && test ! -f urls.temp;
then
   echo "Sorry no bp.conf found, and no urls in command line. Run $0 -h for usage."
   exit
fi

# Read the bp.conf file and wget any url not already in the podcast.log file:
while read podcast
      do
      seenfirst=
      if [ -n "$verbose" ]; then echo "fetching rss $podcast...";fi;
      for url in $(wget -q "$podcast" -O - | tr '\r' '\n' | tr \' \" | \
                   sed -n 's/.*url *= *"\([^"]*\)".*/\1/p' )
              do
          if [ -n "$first_only" ] && [ -n "$seenfirst" ]; then break;fi
          echo $url >> temp.log
          if [ -n "$catchup_all" ];
          then
              if [ -n "$verbose" ]; then echo " catching up $url...";fi
          elif   ! grep "$url" podcast.log > /dev/null ;
          then
             if [ -n "$verbose" ]; then echo "  downloading $url...";fi
             wget $wget_continue $wget_quiet -P $datadir "$url"
          fi
          seenfirst=1
      done
done < $fetchlist

if test ! -f temp.log && [ -n "$verbose" ];then echo "nothing to download.";fi

if test -f urls.temp; then rm urls.temp;fi

# Move dynamically created log file to permanent log file:
cat podcast.log >> temp.log
sort temp.log | uniq > podcast.log
rm temp.log

# Use bittorrent to download any files pointed from bittorrent files:
if [ "$usetorrents" ]
then
    if ls $datadir/*.torrent 2> /dev/null
    then
          btlaunchmany.py $datadir
    fi
fi

# Create an m3u playlist:
ls -1rc $datadir | grep -v m3u > $datadir/podcast${datadir}.m3u
if [ -n "$unix2dos" ];then unix2dos $datadir/podcast${datadir}.m3u;fi;

if [ -n "$sync_disks" ]
then
    if [ -n "$verbose" ]; then echo "running sync..";fi;
    sync
    if [ -n "$verbose" ]; then echo "running sync again..";fi;
    sync
fi

if [ -n "$verbose" ]; then echo "done.";fi;

Here is the new BP script, which gets all the .torrent files but does not name them in the correct format:

Code:

#!/bin/bash
# By Linc 10/1/2004
# Find the latest script at http://lincgeek.org/bashpodder
# Revision 1.21 12/04/2008 - Many Contributers!
# If you use this and have made improvements or have comments
# drop me an email at linc dot fessenden at gmail dot com
# I'd appreciate it!

# Make script crontab friendly:
cd $(dirname $0)

# datadir is the directory you want podcasts saved to:
datadir=$(date +%Y-%m-%d)

# create datadir if necessary:
mkdir -p $datadir

# Delete any temp file:
rm -f temp.log

# Read the bp.conf file and wget any url not already in the podcast.log file:
while read podcast
    do
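    # Pull the enclosure URLs out of the feed: try xsltproc with parse_enclosure.xsl
    # first, then fall back to a wget | tr | sed pipeline that grabs every url="..." value.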
    file=$(xsltproc parse_enclosure.xsl $podcast 2> /dev/null || wget -q $podcast -O - | tr '\r' '\n' | tr \' \" | sed -n 's/.*url="\([^"]*\)".*/\1/p')
    for url in $file
        do
        echo $url >> temp.log
        if ! grep "$url" podcast.log > /dev/null
            then
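            # Name the file after the last chunk of the URL (after the final "/" and "=",
            # before any "?"), which on mininova is just the bare numeric torrent id.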
            wget -t 10 -U BashPodder -c -q -O $datadir/$(echo "$url" | awk -F'/' {'print $NF'} | awk -F'=' {'print $NF'} | awk -F'?' {'print $1'}) "$url"
        fi
        done
    done < bp.conf
# Move dynamically created log file to permanent log file:
cat podcast.log >> temp.log
sort temp.log | uniq > podcast.log
rm temp.log
# Create an m3u playlist:
ls $datadir | grep -v m3u > $datadir/podcast.m3u

Last edited by lambretta (2009-03-09 05:08:21)

#4 2009-03-09 09:54:06

JediSthlm
Member
Registered: 2009-01-17
Posts: 19

Re: rTorrent and RSS feeds - a solution and it's light!

@lambretta: no problemo, I just got a daughter so I might know how you feel. :)  I'm not a bash scripter, so I hope someone else can help us out here.

#5 2009-05-23 18:52:46

lambretta
Member
Registered: 2009-01-07
Posts: 27

Re: rTorrent and RSS feeds - a solution and it's light!

SUCCESS!

I have been able to use Bashpodder as an RSS feed reader for a favourite RSS feed, in my case from mininova (yes I know, boooo); however, it can be customised any way you wish.

A quick run-down of how this works:

1. cron is required to call the Bashpodder script.
2. The Bashpodder script follows the URLs that you have in the conf file and downloads the most recent .torrent files to the directory of your choice.
3. rTorrent sees a new file in the watched directory and starts downloading automagically.

Firstly, the script I am using is as follows...

Code:

#!/ffp/bin/bash
# By Linc 10/1/2004
# Find the latest script at http://lincgeek.org/bashpodder
# Revision 1.21 12/04/2008 - Many Contributers!
# modified by LAMBRETTA for use with rTorrent on the DNS-323 on 230509
# If you use this and have made improvements or have comments
# drop me an email at linc dot fessenden at gmail dot com
# I'd appreciate it!

# Make script crontab friendly:
cd $(dirname $0)

# datadir is the directory you want the .torrent files saved to, change it here:
datadir=/path/to/rtorrent/watched/directory

# create datadir if necessary:
#mkdir -p $datadir

# Delete any temp file:
rm -f temp.log

# Read the bp.conf file and wget any url not already in the podcast.log file:
while read podcast
        do
        file=$(xsltproc parse_enclosure.xsl $podcast 2> /dev/null || wget -q $podcast -O - | tr '\r' '\n' | tr \' \" | sed -n 's/.*url="\([^"]*\)".*/\1/p')
        for url in $file
                do
                echo $url >> temp.log
                if ! grep "$url" podcast.log > /dev/null
                        then
                        wget -t 10 -U BashPodder -c -q -O $datadir/$(echo "$url" | awk -F'/' {'print $NF'} | awk -F'=' {'print $NF'} | awk -F'?' {'print $1'}) "$url"
                fi
                done
        done < bp.conf
# Move dynamically created log file to permanent log file:
cat podcast.log >> temp.log
sort temp.log | uniq > podcast.log
rm temp.log

Nice and small and light huh!

In short, it's been slightly modified from the original.  You must also grab the bp.conf file and the third file, parse_enclosure.xsl (which the script calls via xsltproc), from here... http://lincgeek.org/bashpodder/

Put these files in a directory together with the script and modify the bp.conf file to contain your favourite RSS feed; in my case I used the following feed because I like Top Gear:

Code:

http://www.mininova.org/rss.xml?user=FinalGear&num=1

What's important about the above is the last bit of it, "num=1".  This makes it download only the most recently posted item; it will not get any back issues, so to speak.  If you were to put the feed in without the num=1 it would download a stack of torrent files and rTorrent would start downloading them all.  If your feed does not give you the option of limiting it to the most recent torrent, I would suggest you shut down rTorrent, run the Bashpodder script once, and then delete the files it downloads from the watched directory before you restart rTorrent.  This populates the podcast.log file, so those back issues will not be downloaded again.
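
If you do need that one-off catch-up, it boils down to something like this (the paths are only examples, and stop/start rTorrent however you normally do on your box):

Code:

# One-off catch-up so the back issues get logged but never fed to rTorrent.
# Stop rTorrent first, however you normally do that.
cd /path/to/bashpodder
/ffp/bin/bash ./bashpodder.shell            # fills podcast.log and downloads the backlog
rm /path/to/rtorrent/watched/directory/*    # bin the back issues before rTorrent is restarted
# ...now start rTorrent again.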

Things you need to do to get it working:

1  You need to be using bash as your shell, so you need to install bash on your NAS.
2  You need to change the directory the script saves the torrent files into to suit your setup.
3  rTorrent must be watching a directory for new torrents, and the watch pattern must match anything, i.e. * as opposed to *.torrent.  This is because the script doesn't append .torrent to the file it downloads; it comes down as a bare number (at least in my example).  The rtorrent.rc will need to be altered just a little (see the sketch after this list).
4  Set up a cron job on your NAS to kick the script off as often as you would like (I have not yet done this bit, but assume it's fairly simple).
5  Sit back and enjoy your automated rTorrent, RSS, NAS goodness.
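
For point 3, the watch-directory line in rtorrent.rc ends up looking something like the sketch below (the path is only an example, use whatever directory you pointed datadir at; note the plain * rather than *.torrent):

Code:

# rtorrent.rc: check the Bashpodder output directory every 5 seconds and
# load/start anything that appears there (no .torrent extension required).
schedule = watch_directory,5,5,load_start=/path/to/rtorrent/watched/directory/*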

Let me know if you have any questions.

http://depannone.com/wordpress/?p=44

Lambretta

Last edited by lambretta (2009-05-26 11:27:13)

#6 2009-05-26 06:28:36

lambretta
Member
Registered: 2009-01-07
Posts: 27

Re: rTorrent and RSS feeds - a solution and it's light!

lambretta wrote:

4  Set up a cron job on your NAS to kick the script off as often as you would like (I have not yet done this bit, but assume it's fairly simple).

Pfft - famous last words indeed.  It turned out to be a real pain to figure out (at least for me) why this script just wouldn't run from a cron job.

I could run the script with no problems from the command line; however, cron would just not run it.  It turns out cron uses its own set of environment variables when it runs things, and they are different from the ones my interactive shell now has; in particular, the shell it tries to run the script with is still /bin/sh.  So you need a workaround.

The way I got around this was by putting my environment variables in a file in my home directory, which I called .profile.  I then added the following line to the start of the bashpodder.shell script:

Code:

. /path/to/your/.profile

Obviously you will need to change the path to your .profile file.  NOTE WELL that there is a "." and then a <space> before the path to your .profile.
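
For what it's worth, the .profile doesn't need much in it; something along these lines should be enough (the exact PATH is just an example, use whatever echo $PATH shows in your interactive shell):

Code:

# Minimal environment for cron: make sure the ffp binaries are found and bash is the shell.
export PATH=/ffp/sbin:/ffp/bin:/usr/sbin:/usr/bin:/sbin:/bin
export SHELL=/ffp/bin/bash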

It seems to run very nicely.  Now I just need to figure out how the heck to overcome the problems that will arise if FinalGear stop posting their torrents to mininova one day; let's hope they don't.

Next on my list is implementing a script to automatically write this job to the crontab when I reboot; I guess it will be similar to the rsync disk A to disk B once-a-night solution.
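
I imagine it will end up as a little /ffp/start/ style script along these lines - completely untested, and the schedule and paths are only examples:

Code:

#!/ffp/bin/sh

# Re-add the Bashpodder job after a reboot (the stock firmware rebuilds the crontab).
CRONTXT=/tmp/crontab.txt

crontab -l > $CRONTXT
if ! grep -q bashpodder.shell $CRONTXT ; then
    echo "30 4 * * * /ffp/bin/bash /mnt/HD_a2/bashpodder/bashpodder.shell" >> $CRONTXT
    crontab $CRONTXT
fi
rm -f $CRONTXT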

#7 2009-05-27 22:16:20

zimer
Member
Registered: 2008-09-06
Posts: 16

Re: rTorrent and RSS feeds - a solution and it's light!

Did you look at rssdler?

#8 2009-05-28 04:48:21

lambretta
Member
Registered: 2009-01-07
Posts: 27

Re: rTorrent and RSS feeds - a solution and it's light!

zimer wrote:

Did you look at rssdler?

From memory I did look at rssdler, but it required Python and another thing called feedparser.  I thought that was a fair bit to install just to get an RSS feed downloaded; Bashpodder, on the other hand, is just a simple script needing nothing that isn't already installed with ffp (except the bash shell).

#9 2009-05-28 17:10:13

zimer
Member
Registered: 2008-09-06
Posts: 16

Re: rTorrent and RSS feeds - a solution and it's light!

Surprising that Python isn't an ffp package.  Feedparser is just another Python file the author of rssdler uses to pull the feeds, so it's really only Python you'd need.

#10 2009-05-29 06:54:52

lambretta
Member
Registered: 2009-01-07
Posts: 27

Re: rTorrent and RSS feeds - a solution and it's light!

zimer wrote:

Surprising that Python isn't an ffp package.  Feedparser is just another Python file the author of rssdler uses to pull the feeds, so it's really only Python you'd need.

If you get rssdler going let us know.
