KyleK wrote:
How exactly have you set Transmission to not start downloading new torrents?
...
This means that even if you set the limit, adding a torrent, whether via the web interface or via Automatic, will ignore your "do not start" preference.
There is a preference in the Clutch interface, "Start transfers when added"; I simply leave it unchecked. When I upload via Clutch, the torrents are added in paused mode, and I have to click Start on them before they begin downloading data. Perhaps the Clutch source code would be more revealing.
Offline
You're right, Clutch has that preference as well. It basically does what I mentioned before: if that option is not checked, Clutch sends the argument "paused: true" together with the torrent when you upload one.
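For the curious, in Transmission's RPC protocol the request carries that flag alongside the torrent; it looks roughly like this (the URL is just a placeholder):
Code:
{
    "method": "torrent-add",
    "arguments": {
        "filename": "http://example.com/some.torrent",
        "paused": true
    }
}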
I can do the same with Automatic, but that means I'd have to add another option to the config file. Not that this is a big deal, but it somewhat deviates from the intended purpose of the tool.
There is already a global preference, "start-added-torrents", that is used by the GTK build (and probably the Mac version). If you set this in one of those builds, the setting is written to settings.json.
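In settings.json it's just a boolean entry (a fragment; the surrounding keys are omitted):
Code:
"start-added-torrents": false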
I'm going to propose a patch to the Transmission team to add that functionality to the daemon and CLI as well.
If that is rejected, I'll add an option to Automatic.
All I ask is that you be patient.
Offline
Don't worry, it's a minor thing. It only came up because the very first video Automatic downloaded for me was nuked a few hours later, so it downloaded the fixed version as well; I have bandwidth limits, so I want to avoid that sort of thing as much as possible. Thanks again for your efforts in porting Transmission to the NAS, they are much appreciated.
Offline
KyleK wrote:
secrice wrote:
Hi,
Please check this URL. http://rss.thepiratebay.org/101
Automatic 0.3 displays, "Is this really a torrent feed?"
Can't I use this URL?
The feed should work just fine with Automatic. That message is a debug message I forgot to remove.
I made a mistake setting the TRANSMISSION_HOME directory in /ffp/start/automatic.sh.
After I fixed it, Automatic is working fine! Thank you very much.
Offline
I have this pattern:
patterns = { "(?!.*.mkv).*"
".*the it crowd*"
".*the soup*"
".*south park*"
}
Will this make sure that from my RSS feed it downloads every .mkv file, as well as the shows for which there is no HR version, like The IT Crowd, The Soup and South Park?
Offline
The regular expression to match any .mkv file would be ".*\.mkv"
For the other shows I would use these expressions:
"The.IT.Crowd"
"The.Soup"
"South.Park"
(The filter is case-insensitive, so you may as well use all lowercase letters.)
The reason I use the dot ('.') instead of a space is that spaces don't work very well in regular expressions.
If you know for certain that there is a space between two words, you can use '\s' in its place.
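Put together in the config file's format, the whole block might look like this (a sketch, using the expressions above):
Code:
patterns = { ".*\.mkv"
"the.it.crowd"
"the.soup"
"south.park"
}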
Regarding the mkv filter: Automatic only checks against the <name> tag of an RSS item, so this filter will only work if the actual filename of an episode is placed in that tag.
You can easily check this: download the RSS feed to your hard drive, rename the file to feed.xml, and open it.
It will show you a list of <item> tags, and each item has subtags like <name>, <link> and <pubDate>.
Automatic only looks at the <name> tag.
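A stripped-down item would look something like this (a sketch using the tag names above; note that standard RSS feeds usually call the first one <title> rather than <name>):
Code:
<item>
    <name>The.IT.Crowd.S03E01.PDTV.XviD-GROUP</name>
    <link>http://example.com/12345.torrent</link>
    <pubDate>Mon, 29 Dec 2008 09:01:19 +0000</pubDate>
</item>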
Offline
bearxor wrote:
I have this pattern:
patterns = { "(?!.*.mkv).*"
".*the it crowd*"
".*the soup*"
".*south park*"
}
Will this make sure that from my RSS feed it downloads every .mkv file, as well as the shows for which there is no HR version, like The IT Crowd, The Soup and South Park?
Similar question as bearxor's.
Would something like this:
patterns = { "(?!.*(AC3|HR|264|720|French|Hebrew|Portuguese|Brazilian|vostfr))" "the.it.crowd" "the.soup" "south.park" }
download the episodes for The IT Crowd, The Soup, and South Park, but at the same time block all AC3, HR, 264, 720, etc. releases from being downloaded? Or would I have to prefix each show's pattern with the exclusion?
patterns = { "(?!.*(AC3|HR|264|720|French|Hebrew|Portuguese|Brazilian|vostfr))the.it.crowd" "(?!.*(AC3|HR|264|720|French|Hebrew|Portuguese|Brazilian|vostfr))the.soup" "(?!.*(AC3|HR|264|720|French|Hebrew|Portuguese|Brazilian|vostfr))south.park" }
Thanks
Update:
Just tried it, and it downloads everything that doesn't match the exclusions. Is there a way to do this without prefixing everything, or a way to put the global exclusions into a variable and prefix it to all the TV shows?
Last edited by ZeroFill (2008-12-29 09:01:19)
Offline
Hi,
I'm trying to use the Automatic binary to download .nzb files from a Yahoo Pipe.
I get the following message:
Found new download: the usual suspects (http://www.binsearch.info/?action=nzb&33455941=1)
[getHTTPData] Failed to download 'http://www.binsearch.info/?action=nzb&33455941=1' [response: 302]
I'm not sure what the problem is here. Could it be that the URL first redirects to an https address and Automatic has some difficulties with it?
Regards,
Maarten
Offline
ZeroFill wrote:
bearxor wrote:
I have this pattern:
patterns = { "(?!.*.mkv).*"
".*the it crowd*"
".*the soup*"
".*south park*"
}
Will this make sure that from my RSS feed it downloads every .mkv file, as well as the shows for which there is no HR version, like The IT Crowd, The Soup and South Park?
Similar question as bearxor's.
Would something like this:
patterns = { "(?!.*(AC3|HR|264|720|French|Hebrew|Portuguese|Brazilian|vostfr))" "the.it.crowd" "the.soup" "south.park" }
download the episodes for The IT Crowd, The Soup, and South Park, but at the same time block all AC3, HR, 264, 720, etc. releases from being downloaded? Or would I have to prefix each show's pattern with the exclusion?
patterns = { "(?!.*(AC3|HR|264|720|French|Hebrew|Portuguese|Brazilian|vostfr))the.it.crowd" "(?!.*(AC3|HR|264|720|French|Hebrew|Portuguese|Brazilian|vostfr))the.soup" "(?!.*(AC3|HR|264|720|French|Hebrew|Portuguese|Brazilian|vostfr))south.park" }
Thanks
Update:
Just tried it, and it downloads everything that doesn't match the exclusions. Is there a way to do this without prefixing everything, or a way to put the global exclusions into a variable and prefix it to all the TV shows?
Nope, there is no global exclusion or inclusion prefix. If your feed contains that many variations of a release, and you only want one specific kind, you will have to prefix every item in your patterns list with the above exclusion string.
Rather than extending the exclusion string, why don't you try making the pattern more specific? Since you're excluding "264" and "720", I assume you want plain XviD releases. Use that after the episode title and you can strike two items from the exclusion pattern.
You should also see if you can use a different feed for Automatic, one that doesn't contain ALL the releases.
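For example, the whole list could shrink to something like this (a sketch; it relies on the unwanted releases not also being tagged XviD):
Code:
patterns = { "the.it.crowd.*xvid" "the.soup.*xvid" "south.park.*xvid" }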
Offline
Maartenvdv wrote:
Hi,
I'm trying to use the Automatic binary to download .nzb files from a Yahoo Pipe.
I get the following message:
Found new download: the usual suspects (http://www.binsearch.info/?action=nzb&33455941=1)
[getHTTPData] Failed to download 'http://www.binsearch.info/?action=nzb&33455941=1' [response: 302]
I'm not sure what the problem is here. Could it be that the URL first redirects to an https address and Automatic has some difficulties with it?
Regards,
Maarten
I haven't put any code into Automatic that deals with redirection, so it is not surprising that this fails. I'll try to come up with a solution and will let you know.
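Since Automatic fetches everything through libcurl (the configure script requires libcurl >= 7.15.0), the fix will probably boil down to two options on the transfer handle. A sketch, not Automatic's actual code:
Code:
#include <curl/curl.h>

/* sketch: enable redirect following on an existing easy handle */
static void enable_redirects(CURL *curl_handle)
{
    /* follow HTTP 3xx redirects instead of stopping at the 302 */
    curl_easy_setopt(curl_handle, CURLOPT_FOLLOWLOCATION, 1L);
    /* guard against redirect loops */
    curl_easy_setopt(curl_handle, CURLOPT_MAXREDIRS, 5L);
}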
Offline
I have to say, I love where you're going with Automatic. It has greatly simplified how I grab my TV shows on the DNS-323.
I was wondering if you could add some code to facilitate sorting? I think I have a way to make this happen with little trouble.
Once a torrent has been started, it can be paused, its data moved to a new location, and a symbolic link pointing to the file (now in the desired folder) created in the default torrent folder. Once this is done, the torrent can be restarted, and from then on it will be saved in the desired folder. This method works for certain. It may also be possible to simply create an empty file in the desired final location and place a symbolic link, named after the file/root folder being downloaded, in the default torrent folder.
This should be relatively easy to implement and could pave the way for some more complex sorting.
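In shell terms, the first variant would be roughly this (the transmission-remote usage, torrent ID and paths are just examples):
Code:
# stop the torrent so its data is not being written to
transmission-remote -t 1 --stop
# move the finished download to where it should live
mv /mnt/HD_a2/torrents/Some.Show.S01E01 /mnt/HD_a2/tv/
# leave a symlink behind so Transmission still finds the data
ln -s /mnt/HD_a2/tv/Some.Show.S01E01 /mnt/HD_a2/torrents/Some.Show.S01E01
# resume; Transmission follows the link and keeps seeding
transmission-remote -t 1 --start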
Cheers.
Offline
Updated Automatic to v0.3.1. This version fixes the issue with URL redirects that Maartenvdv reported (see below).
As always, I've updated the links to the new package in the first post of this thread.
Maartenvdv wrote:
Hi,
I'm trying to use the Automatic binary to download .nzb files from a Yahoo Pipe.
I get the following message:
Found new download: the usual suspects (http://www.binsearch.info/?action=nzb&33455941=1)
[getHTTPData] Failed to download 'http://www.binsearch.info/?action=nzb&33455941=1' [response: 302]
I'm not sure what the problem is here. Could it be that the URL first redirects to an https address and Automatic has some difficulties with it?
Regards,
Maarten
Offline
KyleK wrote:
The history list is supposed to grow as large as there are items in your feed(s). 10 is just the default value. It's hard to say why it doesn't grow larger.
Try running with argument "-v 2" for more verbose output.
lividhatter wrote:
When I run Automatic in verbose mode, it eventually says [add_to_bucket] bucket gets too large (10), deleting head item...
and then nothing.
It seems like it's not detecting duplicates for some reason. What details do you need to help me get this working?
This happens to me too. I've run with -v 2, but all it says is that the bucket gets too large and it's deleting the head item. My automatic.state only stores 10 entries, so I keep downloading the same torrents over and over. Please help. Is there any way to recompile with a larger default bucket size, say 100?
Offline
How many items does the feed you download from contain? If you have multiple feeds, add the item counts together.
Your bucket should grow to that size.
I'll look into this when I'm back at home.
I don't have the source code in front of me, but this workaround should help you for now:
Close Automatic and add as many lines to 'automatic.state' as you want history items; the content of the lines does not matter.
Don't go overboard, though, because all those lines are kept in memory while the program is running. (I have a bucket of 40.)
Automatic should use the size of the history file as the new bucket size (I think, but I'm not sure).
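From the shell, that's a one-liner (run it in the directory where automatic.state lives; 40 is just my own bucket size):
Code:
i=0; while [ $i -lt 40 ]; do echo "placeholder-$i" >> automatic.state; i=$((i+1)); done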
Also, please send me your logfile, preferably from a run with the command-line parameter '-v 3'.
You can delete/obfuscate all URLs and logins/passwords.
Offline
@lividhatter and @oxygenoxy
I just uploaded the new version 0.3.2, which should fix the problem with duplicate downloads you were experiencing. Let me know if any issues occur.
The new version also brings an additional option, "start-torrents".
Details & download can be found, as always, in the updated first post of this thread.
Offline
Nice work
If I had seen this some weeks ago, I would have saved myself from writing my own little automatic downloader. It started with a lame shell script that just grabbed all the torrent links from a private tracker that didn't have RSS feeds and 'echo'ed out an RSS feed, saved on the DNS-323 so uTorrent could access it from the DNS's web server. But the features grew, and the more you work on something, the more complex it gets; the shortcuts and stupid ways of doing things in the beginning, when there were no plans, get you in the end. Now I'm pretty fed up with complex grep and sed lines to filter stuff in and out and get things done, and I'm using three computers to get everything working: uTorrent on a Vista machine for the torrenting, the DNS-323 for the scripting, and MorphOS running a little ARexx-scripted web server that sends me an SMS with info about which torrents have been added. (I tried mailx/sendmail but never got it working; my email provider can turn emails into SMS messages.)
I would like to use Automatic instead of rewriting my approach, but there are some things I'd like added, such as downloading torrents from private trackers with cookie support. One site I use has three login options (uid, passwd and pincode), so I'm using wget to download the .torrents locally; I made a cookies.txt (exported from Firefox's cookies) and just add the --load-cookies option. A similar approach/solution in Automatic would be helpful.
I also added a check for how long a torrent has been seeding, getting the date from the locally saved .torrent file. If it's older than, say, 7 days, the script looks in uTorrent for a matching filename, gets the hash ID from uTorrent, and removes the torrent plus its data using uTorrent WebUI commands, like webuicmd="http://192.168.0.66:8080/gui/?action=removedata&hash="$hash and then wget -q $webuicmd, and it's gone. That keeps the torrent client neat and clean, no maintenance. There are options in uTorrent to keep seeding until a certain ratio has been reached, but no option to auto-remove the torrent from the client. I have no clue whether Transmission supports this...
And in case some filter fails, or some show starts before I've added a filter for it, the script makes two new RSS feeds stored on the web server: one lists all the torrents that were downloaded automatically, and the other lists all the releases (no filters) with a new link inside. Loading that link from the RSS feed starts a script on the DNS-323's web server that downloads the torrent locally and adds it to uTorrent. I have these feeds on my Sony Ericsson, so I can start downloading torrents or movies remotely from work.
But mainly, downloading torrents from private trackers with cookie support or similar would be great. I could write a new script that just adds and removes torrents in uTorrent via WebUI commands, using torrents fetched by Automatic; many trackers usually ban these small torrent clients.
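By the way, the wget invocation I use is along these lines (the URL is a placeholder):
Code:
wget --load-cookies cookies.txt -O release.torrent 'http://tracker.example.com/download.php?id=12345'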
Last edited by catohagen (2009-02-12 02:44:05)
Offline
Wow, that's quite a post you wrote there. You should consider using proper punctuation though; it was quite hard to read.
The cookie thing has been sitting on my to-do list for a while; unfortunately, I haven't had time for any more research yet. I still intend to implement it though, so just be patient.
The other stuff seems fairly specific to your setup. Not everyone has uTorrent running (why would they?), so interaction between Automatic/Transmission and uTorrent would be a lot of effort for the benefit of maybe one person. I think you'll have more luck writing a script for that.
I want to keep Automatic lean and mean, sticking to its single assigned task: download RSS feeds, check their contents against the provided filters, and add torrents to Transmission for matching patterns.
Resources are scarce on the NAS (recent versions of Transmission use a lot of memory), so I want to keep Automatic's footprint low.
Anyways, if you really can't live without some of that stuff, the source code is available, so you can have a go at enhancing the tool yourself.
Offline
Thanks for the reply. I'll watch this thread patiently for when you implement the cookie feature some time.
The uTorrent stuff is of course not needed; I was hoping for an easier way to replace my current three-computer torrent setup. I saw this thread and your kickass utility and thought 'DOH', after I'd spent weeks trying to write a script to do the same thing.
Email notification would be a nice feature (at least for me), if it's not too big a problem: it could send an email once a day or so with info on which torrents matched the filters. If anyone else agrees with me, raise your hand.
Anyhow, it's been a good lesson in shell scripting, and I'm looking forward to replacing my torrent fetching with Automatic. Scripts for adding the torrents to uTorrent or other clients are the easier part; my problems are with fetching the torrent files using filters and keeping track of what's been downloaded and what hasn't.
Last edited by catohagen (2009-02-13 01:14:39)
Offline
Hi
I'm using a chrooted Debian and I'm trying to compile the source for Automatic. When I execute the configure script, I get an error:
checking for LIBCURL... configure: error: Package requirements (libcurl >= 7.15.0) were not met:
No package 'libcurl' found
Consider adjusting the PKG_CONFIG_PATH environment variable if you installed software in a non-standard prefix.
Alternatively, you may set the environment variables LIBCURL_CFLAGS and LIBCURL_LIBS to avoid the need to call pkg-config.
See the pkg-config man page for more details.
I have the libcurl3 7.17.1-1 package installed, so I'm not sure why it gives this error. Can anyone help?
Offline
Check if there's a file /usr/lib/pkgconfig/libcurl.pc (it might be under /usr/local, I'm not sure).
That's how the configure script tries to find libcurl. If the file doesn't exist, the above message is shown.
The file is only there if you have installed the dev package of libcurl (which includes the header files needed for compilation).
If you're certain that you have the dev package but configure still won't find it, you can use the environment variables mentioned in the error message:
./configure LIBCURL_CFLAGS=-I/usr/include/curl LIBCURL_LIBS="-L/usr/lib -lcurl"
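A quick way to check from inside the chroot (the dev package name is an assumption; on Debian releases of that era it is libcurl3-dev or libcurl4-openssl-dev):
Code:
# does pkg-config know about libcurl?
pkg-config --modversion libcurl || echo 'libcurl.pc not found'
# install the development package that ships libcurl.pc and the headers
apt-get install libcurl4-openssl-dev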
Offline
Hi KyleK!
I have a question about your great script.
I would like to download torrents the following way:
I have a directory called "torrent", and when I put a new torrent file into it, Transmission automatically starts downloading it.
Can I do this with your script?
Best regards!
Offline
Hi Cirip,
Automatic actually isn't a script, it's a native application written in C.
The functionality you describe was requested before (it's called "watchfolder" functionality), and I once tried to implement it. At the time, the NAS didn't support the proper features on the OS level, so I abandoned it.
With later firmware updates, the necessary OS features are now available, but I'm still missing some development tools to support this.
It's definitely on the To-Do list, but as of now I can't implement it.
On another note, this feature might soon be implemented directly in Transmission. But for this to work on the NAS, I'm still going to need the development files, and I hope the NAS people will provide them shortly.
Offline
Thanks for your quick and professional answer!
And what do you think: is there any other way to get the "watchfolder" functionality in the meantime?
Offline
I believe there are scripts floating around that provide watchfolder functionality.
Basically, they look at a given directory at set intervals, and if any (torrent) files are there, they add them to Transmission; see the sketch below.
Check the forums, I'm sure I've seen this stuff (the Transmission thread has some mentioned).
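A minimal sketch of such a script (the paths and the transmission-remote call are assumptions; run it from cron every few minutes):
Code:
#!/bin/sh
# add every .torrent found in the watchfolder to Transmission,
# then move it away so it isn't added twice ('added/' must exist)
WATCHDIR=/mnt/HD_a2/torrent
for f in "$WATCHDIR"/*.torrent; do
    [ -e "$f" ] || continue       # the glob didn't match anything
    transmission-remote -a "$f" && mv "$f" "$WATCHDIR/added/"
done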
Offline
To Cirip:
http://www.horto.ca/?p=10 , look for "6. AUTO-TORRENT-DOWNLOADING!"
I have done this, and it works just fine; it will add the torrents automatically every hour.
Offline