Hi guys,
I understand that the post-processing script runs in the download directory, after the files have been downloaded, repaired, unrared, and then moved to the download directory.
I am having a problem with the way the scripts currently run.
I use xbmc as well as a 3rd party program to create nfo, jpg, & tbn files for each of my tv show episodes scraped from thetvdb.
I also am using a download directory cleanup script that deletes the extra nzb, nfo, jpg, and other files that usually aren't deleted automatically by sabnzbd after the show has been unpacked to the download folder.
The problem I am having is that when the download directory cleanup runs, it deletes not only the nfo and jpg files that I don't want, but also the nfo and jpg files that are scraped by the xbmc tool.
Is there a way to have the post processing script run after the show is downloaded and unpacked, but still in the unprocessed downloads folder?
Question on when a post processing script will run
-
feerlessleadr
- Jr. Member

- Posts: 64
- Joined: April 30th, 2009, 12:09 pm
Re: Question on when a post processing script will run
Files are unpacked directly to the complete drive, so your request would probably not work.
I would use a .cmd script to do the file cleanup instead of the sabnzbd cleanup (there should be an example on this forum), then run whatever code/program you would normally run in that .cmd script.
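As a rough sketch of that suggestion (the thread is on Windows, so this would be called from a .cmd wrapper or directly as the post-processing script): SABnzbd passes the job's completed folder as the first argument to a post-processing script, so the cleanup can be limited to the folder that was just downloaded, and the scraper run afterwards. The scraper path and the exact extensions to delete below are placeholders, not anything from the thread:

```python
import os
import subprocess
import sys

# "Scene" extras bundled with the download that we never want to keep.
# Adjust this set to taste -- it is an assumption, not SABnzbd's list.
JUNK_EXTS = {".nzb", ".nfo", ".jpg", ".sfv"}

def clean_job_folder(job_dir):
    """Delete junk files from the just-completed job folder only."""
    removed = []
    for name in os.listdir(job_dir):
        path = os.path.join(job_dir, name)
        if os.path.isfile(path) and os.path.splitext(name)[1].lower() in JUNK_EXTS:
            os.remove(path)
            removed.append(name)
    return removed

if __name__ == "__main__" and len(sys.argv) > 1:
    job_dir = sys.argv[1]  # SABnzbd passes the job's final folder as %1
    clean_job_folder(job_dir)
    # Hypothetical scraper invocation -- replace with the real command.
    subprocess.call([r"C:\tools\scraper.exe", job_dir])
```

Because the scraper runs after the cleanup, any nfo/jpg it creates is never seen by the delete step.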
-
feerlessleadr
- Jr. Member

- Posts: 64
- Joined: April 30th, 2009, 12:09 pm
Re: Question on when a post processing script will run
I'm not quite sure how to accomplish your suggestion.
I am running the download directory cleaner located in this forum as a post processing script and I am not using the cleanup built into sabnzbd.
I believe you are suggesting that I also run my xbmc scraper program after the cleanup script has been run.
That is basically what I am doing now, which is the problem.
All of the nfo files that the program has already created for each of my video files are being deleted and need to be re-downloaded by my scraper program.
Is there any way to tell sabnzbd not to download certain files for certain categories?
-
doubledrat
- Release Testers

- Posts: 180
- Joined: February 20th, 2008, 3:16 pm
Re: Question on when a post processing script will run
does the scraper monitor the download folder and do its stuff automatically? If not, I can't understand why you would be having a problem, i.e. if your postproc job does
cleanup directory
then run scraper
how can the cleanup step delete files from the scrape step?
-
feerlessleadr
- Jr. Member

- Posts: 64
- Joined: April 30th, 2009, 12:09 pm
Re: Question on when a post processing script will run
Well, I was able to come up with a workaround last night.
Basically, I am using the default sabnzbd cleanup to delete the nfo & jpg files that are downloaded with the tv show, which also seems to leave the nfo and jpg files created by the scraper alone.
After sabnzbd downloads the tv show, I manually run the scraper on the download folder to download the associated episode information and episode thumb in the form of an nfo file and a jpg file.
The problem I ran into is that the scraper program won't pick up the new episode if an nfo already exists (which is the case when I didn't run the cleanup script).
The other problem is that when I did run the cleanup script, it would delete not only the nfo and jpg file that was included with the tv show when I downloaded it, but also the previously scraped nfo and jpg files for the other tv episodes.
So basically I have a tv show folder setup similar to this:
TV\
----Season 1\
---------------House - 1x01.mkv
---------------House - 1x01.nfo
---------------House - 1x01.jpg
The nfo and jpg files above are the files created by the scraper program during a previous run (so let's say I ran the program yesterday; that is when those files were created).
However, when I download episode 2 of House and run the cleanup directory script, the script deletes not only the scene nfos and jpgs, but also the previously scraped nfo and jpg.
Hopefully that made some sort of sense.
I guess another potential solution is to have the option to run a post processing script in the UNPACK folder and then move the contents of the unpack folder to the download directory. Just a thought.
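One way a cleanup script could spare the scraped metadata is a heuristic sketch like the following. It assumes, as in the House example above, that the scraper names its .nfo/.jpg after the video file, while scene extras use the release name: only delete an .nfo or .jpg when no video file with the same basename sits next to it. The extension lists are assumptions to adjust:

```python
import os

VIDEO_EXTS = {".mkv", ".avi", ".mp4"}
META_EXTS = {".nfo", ".jpg", ".tbn"}

def clean_sparing_scraped(folder):
    """Delete metadata files that do NOT sit next to a same-named video.

    Scraped files ("House - 1x01.nfo" next to "House - 1x01.mkv") are
    kept; extras whose basename matches no video file are removed.
    """
    names = os.listdir(folder)
    video_stems = {os.path.splitext(n)[0] for n in names
                   if os.path.splitext(n)[1].lower() in VIDEO_EXTS}
    removed = []
    for n in names:
        stem, ext = os.path.splitext(n)
        if ext.lower() in META_EXTS and stem not in video_stems:
            os.remove(os.path.join(folder, n))
            removed.append(n)
    return removed
```

Run against the Season 1 folder above, this would remove a release-named scene nfo but leave "House - 1x01.nfo" and "House - 1x01.jpg" untouched, since "House - 1x01.mkv" exists.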
-
doubledrat
- Release Testers

- Posts: 180
- Joined: February 20th, 2008, 3:16 pm
Re: Question on when a post processing script will run
it sounds like the cleanup script is not just cleaning the directory of the just-downloaded show, but all directories in your download area. At least you have a working solution now: "if it ain't broke, don't fix it".
-
feerlessleadr
- Jr. Member

- Posts: 64
- Joined: April 30th, 2009, 12:09 pm
Re: Question on when a post processing script will run
well, it is cleaning the download directory, because the season subfolder is the download directory.
but I agree. It ain't broke now, so I'm not going to worry about fixing it.
:-)
