A manual is not really required. The posts here are great, and with a bit of research on regular expressions (easily found via Google for those not familiar), you can be up and running in minutes thanks to KyleK's great work.
A few modifications to the conf file now and then, to reduce duplicates and/or add new entries, are all that's required.
I LOVE IT!!!
Offline
Thanks, FunFiler.
That said, if you have any questions regarding the usage, or run into trouble, feel free to post to this thread. I'll help where I can.
Offline
KyleK - I don't suppose you'd be interested in compiling a version of Automatic that appends to the log file rather than replacing it? Although some built-in config options would be nice, this would be an easy option, since logrotate can take care of clearing the logs.
I'd love to set up an environment to compile it myself, but I haven't had a chance to figure out all the packages required. Do you need a full Linux environment, or can it be done on the DNS-323?
Offline
try
./ffp/bin/automatic -c /ffp/etc/automatic.conf 2>> /mnt/HD_a2/logs/automatic.log
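(Note that 2>> only appends stderr; if you want Automatic's regular output in the same file as well, redirecting stdout there too should do it, same paths as above:)
./ffp/bin/automatic -c /ffp/etc/automatic.conf >> /mnt/HD_a2/logs/automatic.log 2>&1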
..my 2 cents...
Offline
Good point. I guess I'm just used to programs that write directly to a log when one is specified on the command line or in a configuration file, rather than simply redirecting stdout and stderr to the log. I'll give it a try. Thanks.
Offline
I'm sorry, I forgot about the initial request, that's why I hadn't responded yet.
I can see why it would be useful (I could've used it myself a couple of times now), but changing the default behaviour to always append instead of overwrite is not the right choice, in my opinion. Most people don't care about the log file (they might not even know one exists), and changing it to append all the time would grow a ginormous log over time.
I'll see if I can provide a command-line switch for this kind of behaviour.
Offline
True enough; I just thought it might be faster and easier to change the behaviour and redistribute it to those who want it, rather than implement a new command-line or config-file option. It was meant as a short-term solution only.
Obviously the ideal would be rollover by size, date, and/or other factors, but that can really be left to a tool such as logrotate.
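As an example, a minimal logrotate stanza along these lines should do the trick (the log path matches the redirect above, the size/rotate values are just placeholders, and copytruncate keeps rotation safe while Automatic still has the file open):
/mnt/HD_a2/logs/automatic.log {
    size 1M
    rotate 4
    compress
    missingok
    copytruncate
}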
Last edited by FunFiler (2010-10-17 17:46:07)
Offline
FunFiler wrote:
KyleK - I don't suppose you'd be interested in compiling a version of Automatic that appends to the log file rather than replacing it? Although some built-in config options would be nice, this would be an easy option, since logrotate can take care of clearing the logs.
I'd love to set up an environment to compile it myself, but I haven't had a chance to figure out all the packages required. Do you need a full Linux environment, or can it be done on the DNS-323?
I compile both Automatic and Transmission directly on the NAS. All the necessary dev tools are available in fonz' package directory. I did have to build some dependencies myself, but I can provide those if you need them.
Offline
I'd love to get any information you have in this regard. List of files, packages, scripts, compiler options, anything you wish to share. I don't want to take anything away from the great work you have been doing, it would just be nice to get an environment running where I could tweak a few things for my own use.
Offline
I updated Automatic to v0.6.4. It fixes a minor issue and also adds a command-line option for appending to the log file.
Download link is as always in the first post.
Let me know if you find any issues.
Offline
FunFiler wrote:
I'd love to get any information you have in this regard. List of files, packages, scripts, compiler options, anything you wish to share. I don't want to take anything away from the great work you have been doing, it would just be nice to get an environment running where I could tweak a few things for my own use.
Well, let me see. How good is your knowledge of compiling source on a Linux machine? Do you already know about configure/Makefiles, etc.?
The essential packages you need for compiling source (not only Automatic) would be:
* autoconf-2.61-2.tgz
* automake-1.10.1-2.tgz
* binutils-2.18.50.0.1-4.tgz
* gawk-3.1.6-3.tgz
* gcc-4.1-2.tgz
* kernel-headers-2.6.9.1-2.tgz
* libtool-1.5.24-1.tgz
* m4-1.4.10-2.tgz
* make-3.81-3.tgz
* pkg-config-0.23-1.tgz
I would also install these; they may come in handy now and then:
* flex-2.5.33-5.tgz
* bison-2.3-3.tgz
* neon-0.25.5-1.tgz
* patch-2.5.9-3.tgz
* subversion-1.5.2-1.tgz
* zlib-1.2.3-3.tgz
* tar-1.19-2.tgz
* sed-4.1.5-3.tgz
* gzip-1.3.12-3.tgz
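(If I remember right, packages from fonz' directory install with funpkg; adjust the path to wherever you downloaded them, e.g.:)
$ funpkg -i /mnt/HD_a2/packages/gcc-4.1-2.tgz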
Automatic specifically depends on three libraries; see the first post for download links.
And then you're good to go: Download the source and unpack it.
Most source packages come with a configure script which automatically checks the build environment and sets some variables.
For everything you compile on the NAS, I suggest you use at least this command:
$ ./configure --prefix=/ffp
This ensures that later on, when installing, the files are copied to the right folder.
Next:
$ make
And you're done :)
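(Well, almost: if you want the freshly built binary copied into /ffp rather than run from the build directory, there's one more step:)
$ make install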
Offline
Awesome, thanks! My knowledge is "reasonable"; I have about 30 years of programming experience in general. I'll give this a shot based on the great info you provided. I doubt I'll have any questions, but I know where to post if I do.
Thanks also for the v0.6.4 update. I'll drop that in right away!
Offline
A couple of minor additions: in order to compile successfully, I had to install two more packages:
grep-2.5.3-2.tgz
libiconv-1.12-3.tgz
grep-2.5.3-2.tgz corrected the following error:
configure: error: no acceptable grep could be found in /ffp/sbin:/ffp/bin:/usr/bin:/sbin:/usr/bin:/bin:/usr/xpg4/bin
and libiconv-1.12-3.tgz corrected:
/ffp/include/libxml2/libxml/encoding.h:28:19: error: iconv.h: No such file or directory
and
/ffp/include/libxml2/libxml/encoding.h:136: error: expected specifier-qualifier-list before 'iconv_t'
Next, off to do the same for Transmission. Thanks again for the pointers; it was a big help.
Offline
KyleK - In a future release you may want to consider adding
dbg_printf(P_INFO, "Automatic version: %s", LONG_VERSION_STRING);
to main. Having the version info dumped to the log may come in handy if you start rolling out updates.
Having fun playing around with it. My default initial build (without any code changes) is slightly larger than yours, though (by ~10k). Are you compressing the binary or using some other options to make it smaller?
Offline
I just strip the binary of any debug symbols by using the 'strip' command.
(Or rather, I use the command "make DESTDIR=$HOME/devel/releases/Automatic install-strip").
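(Done by hand, that's simply the following, assuming the binary ended up in /ffp/bin:)
$ strip /ffp/bin/automatic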
Offline
Hi Kyle:
My NAS recently went through a reboot, and now I'm encountering an error when checking for new episodes. I turned verbosity up to level 3, and here's what got kicked out during the "Checking for new episodes" step...
[10/11/03 21:32:27] ../src/automatic.c, 560: ------ Checking for new episodes ------
../src/automatic.c, 573: Checking feed 1 ...
../src/web.c, 401: [getHTTPData] url=http://ezrss.it/search/index.php?show_name=Weeds&date=&quality=&release_group=fqm&episode_title=&season=6&episode=&video_format=&audio_format=&modifier=&mode=rss, curl_session=(nil)
../src/web.c, 366: [am_curl_init] Created new curl session 0x312e0
../src/web.c, 431: [getHTTPData] 'http://ezrss.it/search/index.php?show_name=Weeds&date=&quality=&release_group=fqm&episode_title=&season=6&episode=&video_format=&audio_format=&modifier=&mode=rss': Error
../src/automatic.c, 428: [processFeed] curl_session=0x312e0
../src/automatic.c, 573: Checking feed 2 ...
All I know is that there is an error, but I cannot troubleshoot it since the log does not describe what type of error it is. The URL works and the NAS can access the Internet unencumbered, so I do not know what the issue could be. Any suggestions?
Thanks!
Offline
That URL is to a standard HTML page, not an RSS feed. You should be using http://ezrss.it/feed/ instead.
Offline
Thanks for the suggestion, but that URL is an RSS feed: a search-based RSS feed provided by EZRSS.it (if you try to view the URL in a browser, you will see that it's a feed). It has worked for months, until my NAS rebooted. I'm fairly certain the issue is local to my environment, but without knowing what type of error Automatic throws, my investigation is hampered.
Offline
I guess there is a flag missing then; when I copy and paste it into a browser, it isn't an RSS feed, based on what you posted.
I'd suggest trying the feed I posted anyway, just to see if Automatic will parse it properly.
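(It might also be worth fetching the feed with curl directly from the NAS shell to rule out DNS or connection problems, assuming curl is installed with your ffp setup; -v prints the name resolution, connection, and HTTP status details:)
$ curl -v 'http://ezrss.it/feed/' -o /dev/null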
Offline
Darn. I tried it, but with no success...
[10/11/04 14:42:15] ../src/automatic.c, 560: ------ Checking for new episodes ------
../src/automatic.c, 573: Checking feed 1 ...
../src/web.c, 401: [getHTTPData] url=http://ezrss.it/feed/, curl_session=(nil)
../src/web.c, 366: [am_curl_init] Created new curl session 0x31328
../src/web.c, 431: [getHTTPData] 'http://ezrss.it/feed/': Error
../src/automatic.c, 428: [processFeed] curl_session=0x31328
../src/automatic.c, 573: Checking feed 2 ...
Very strange.
Offline
Can I request an upgrade that isolates the feeds, so each can have its own options?
Feed 1 would have options and a filter dedicated to feed 1.
Feed 2 would have options and a filter dedicated to feed 2.
E.g., as below:
feed = { url => "http://your_rss_feed1"
         cookies => ""
         filter => "Name.of.Something.I.Want.to.Download.Automatically.*HDTV"
         folder => "FromFeed1"
         upload-limit => "10"
         start-torrents => "yes"
       }
feed = { url => "http://your_rss_feed2"
         cookies => ""
         filter => "Name.of.Something.I.Want.to.Download.Automatically.*HDTV"
         folder => "fromFeed2"
         upload-limit => "5"
         start-torrents => "no"
       }
Offline
KyleK's site was not working earlier, so I've uploaded his great work to an alternative location in case anyone has trouble accessing it.
Here is a link to Automatic v0.6.4 by KyleK
http://www.mediafire.com/file/lcb1ed2rf … .6.4-1.tgz
Offline
First off, thanks KyleK for a great program... it's been working like a charm for me for a while now... but with the last filter I added, I seem to be having some trouble.
I recently tried to add a filter for The Walking Dead, using the exact same format as all my other filters...
filter = { pattern => "(?!.*720p)(?!.*season)The.Walking.Dead" folder => "/mnt/HD_a2/Media/Television/The.Walking.Dead" }
For weeks now it hasn't picked up any shows, although I know they appear in my RSS feed... I checked the log file and, sure enough, it's loading 15 filters, not the 16 I have there.
Any ideas?
Offline
Not sure if it really matters or not, but I would simplify the statement by combining the exclusions. You can also enable debug logging to see the parsing; perhaps it really isn't pattern matching because the strings are invalid.
Try this, it works for me (although I've modified it slightly to match your requirements):
filter = { pattern => "(?!.*(720p|season))^The.Walking.Dead" folder => "/mnt/HD_a2/Media/Television/The.Walking.Dead" }
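(If your grep was built with PCRE support, you can sanity-check the pattern against a sample title right from the shell; the title below is just made up, and grep echoes it back only if the pattern matches:)
$ echo 'The.Walking.Dead.S01E04.HDTV.XviD-FQM' | grep -P '(?!.*(720p|season))^The.Walking.Dead'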
If it isn't loading all your filters, there must be a syntax error somewhere; specifically, check the statements before and after the one for The Walking Dead, if indeed that is the one that is not loading. Post your entire file. It shouldn't matter (and I haven't verified this), but make sure there is a blank line at the end of the file; perhaps it is necessary. Debug logging should show which filters are loading, IIRC. Did you use a Unix-friendly editor to make the changes?
Last edited by FunFiler (2010-11-24 14:38:54)
Offline
Thanks - made that update, but still no go. How do you enable debugging?
My file is attached.
Offline