Deleted the contents of my Completed Download folder because I was having issues importing files due to the 260-character path limit, and then I noticed this...
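For anyone hitting the same wall: the 260-character limit is Windows' legacy MAX_PATH restriction. A quick sketch for finding the offending files before they break an import (the folder path is a placeholder; point it at your own completed-downloads directory):

```python
import os

MAX_PATH = 260                      # Windows legacy path-length limit
folder = r"C:\Downloads\Complete"   # hypothetical path; adjust to your setup

# Walk the folder tree and flag any file whose absolute path is too long
for root, _dirs, files in os.walk(folder):
    for name in files:
        full = os.path.join(root, name)
        if len(full) >= MAX_PATH:
            print(len(full), full)
```

Renaming or flattening the flagged entries (or enabling Windows' long-path opt-in) avoids having to wipe the whole folder.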
Thanks again, Puzzled; I'll watch my queue for a bit and report back.
Installed and tested. 200MB => 100MB => 2MB, with a restart after each change, produced no noticeable change in behavior. I quit the first three tries after 1000MB of Missing Articles; the fourth attempt (an 8000MB NZB at the 2MB threshold) I let run to completion, and it failed after nearly 10 minutes. Can we make the ratio 5% ...
I was using the default UsenetServer @ 20 connections; I tried the US-UsenetServer @ 20 connections with no change. I am only able to download approximately 1TB per day on average; I should be able to get closer to 4TB. Is it possible to add a "Missing Article Limit" to the Switches page? It...
Any insights into my issues? Here's a screenshot of my queue after it encounters a run of consecutive "hanging" NZBs; note the severe droughts of data between attempts.
I'm currently having to stalk my queue, removing the failing downloads while sending the good ones to the bottom. I'm averaging 80MB/s of throughput, so every minute I'm stuck on a bad NZB I'm losing roughly 5GB of potential data. The pattern I'm noticing with the NZBs that are "hanging" is tha...
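For scale, the numbers above check out. A quick back-of-the-envelope sketch (assuming decimal units, i.e. 1 GB = 1000 MB, which is how most clients report speed):

```python
# Observed average throughput from the post above
speed_mb_s = 80  # MB/s

# Data "lost" per minute spent stuck on a dead/hanging NZB
loss_per_min_gb = speed_mb_s * 60 / 1000

# What that rate would yield if sustained around the clock
per_day_tb = speed_mb_s * 86400 / 1_000_000

print(f"~{loss_per_min_gb:.1f} GB per idle minute")   # ~4.8 GB
print(f"~{per_day_tb:.2f} TB/day if sustained")       # ~6.91 TB
```

So at 80MB/s a fully busy queue could in theory move close to 7TB/day; averaging only 1TB/day means the line is idle the large majority of the time, which matches the "droughts" in the screenshot.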
Using 3.1.1, with Fast_Fail ON, Abort if cannot be completed ON, and Req_Completion on DEFAULT. The 2600MB download below took roughly 3 minutes to nearly complete and then fail. https://i.ibb.co/qxfRnt4/Sab2.png https://i.ibb.co/MVSgjdG/Sab.png Multiply that timeframe for larger files (26GB => 30min) an...
Can we get a new optional switch to "Abort jobs that cannot be completed" that isn't based on the percentage of missing articles? My queue stacks up so quickly that reasonable data capture in large quantities isn't possible; jobs that are very large (20GB+) suck up too much time and data i...