When a download is being verified, repaired, or unpacked, it generates an enormous amount of disk I/O, and any disk-intensive application running alongside it is slowed down significantly.
This is an issue for me. I do, however, have loads of RAM in my rig, so I thought you could process smaller downloads (say, less than 10 GiB) entirely in RAM to speed things up, and only write the files to disk when they are unpacked.
Larger downloads could be kept partially in RAM. I thought this was what the “Article Cache Limit” option was for, but even after setting it to 4G (I figured a 32-bit application couldn't use more anyway), the SABnzbd process never uses more than 200 MiB of RAM (that is the entire working set, shareable and private).
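For reference, here is how I have it set in my sabnzbd.ini. I am assuming the “Article Cache Limit” field in the web interface maps to the cache_limit key under [misc], and that size suffixes like 4G are accepted:

    [misc]
    # Assumed ini key behind the "Article Cache Limit" UI option.
    cache_limit = 4G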
I realize only a few people would use this option, but the basic implementation might be quite easy. I haven't reviewed the source code, but I imagine you use some kind of buffer for writing to files. All you would need to do is enlarge that buffer and only flush it to disk when it is full (as opposed to writing each article as soon as its download completes, which I assume is what happens now).
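To make the idea concrete, here is a rough sketch of the kind of buffering I mean. It is not based on SABnzbd's actual code; the class and parameter names are made up for illustration, and it glosses over out-of-order articles and multiple simultaneous files:

    import io

    class BufferedArticleWriter:
        # Hypothetical sketch: collect decoded articles in RAM and only
        # touch the disk once the buffer passes a configurable threshold.
        def __init__(self, path, flush_threshold=10 * 1024**3):
            self.path = path                        # final file on disk
            self.flush_threshold = flush_threshold  # e.g. 10 GiB
            self.buffer = io.BytesIO()              # in-RAM staging area

        def write_article(self, data):
            # Append the decoded article to the in-memory buffer.
            self.buffer.write(data)
            # Spill to disk only when the threshold is exceeded, instead
            # of writing every article as soon as it is downloaded.
            if self.buffer.getbuffer().nbytes >= self.flush_threshold:
                self.flush()

        def flush(self):
            # Append the buffered bytes to the file and reset the buffer.
            with open(self.path, "ab") as f:
                f.write(self.buffer.getvalue())
            self.buffer = io.BytesIO()

        def close(self):
            # Write out whatever is left, e.g. right before unpacking.
            self.flush()

With a threshold of 10 GiB, a download below that size would never touch the disk until it is handed over for unpacking.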
Thanks in advance for considering this.
Handle smaller downloads in RAM
Re: Handle smaller downloads in RAM
Not useful.
The external programs par2cmdline and unrar can only handle disk files (we didn't write those).
A RAM disk might be a better solution, but that would be problematic for big posts.
On the other hand, the bigger the post, the bigger the issue you describe.
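If you want to experiment with the RAM disk route on Linux, something like the following would do it; the size and mount point are only examples, and anything bigger than the tmpfs will simply fail:

    # Create a 12 GiB tmpfs and point SABnzbd's temporary download
    # folder at it (Config -> Folders).
    sudo mkdir -p /mnt/sab-tmpfs
    sudo mount -t tmpfs -o size=12g tmpfs /mnt/sab-tmpfs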