I and other users are seeing the same issue. I've discovered a few issues that seem to be related.

First is that the par check/repair stage seems to fail randomly. Sometimes nzbget reports 'PAR Success', but no matter how many times I try to re-postprocess the download, the unpack fails or gets stuck. If I run QuickPAR from Windows using the same PAR set, it often finds 1 or 2 files that have all blocks present but need to be re-joined. Once QuickPAR has re-joined these blocks/files, nzbget can successfully unpack.

The other issue is that some PAR repairs leave the renamed damaged files in the source folder. I find this confuses nzbget's unpack processing, especially when the first file in the archive set has a renamed copy. For example, if nzbget's PAR does a repair/rejoin, it sometimes seems to create a file with one more leading '0' in the filename: xxxxxxxxxxxxxxxxx.7z.001 is repaired/rejoined, but there is a copy of the bad file named xxxxxxxxxxxxxxxxx.7z.0001. The same can happen with rar archives - after the repair/rejoin there's a 2nd file under a renamed variant of the original filename. If you look in the source folder (the 'intermediate' folder for most, depending on how you have nzbget configured), delete all the 'bad' files that have been renamed, and then do a re-postprocess, the unpack will usually succeed.

The 3rd case of failure I've found is a complete 'halt' of the extract/unpack process, which seems to be a bug in the way 7zip is called. The logs show the unpack request is calling 7zip, but the unpack hangs for some reason that the logs don't identify.

Hope these findings might help others, and maybe even help the nzbget team further refine their post-processing routines. Note that I've also found these same issues when using the Linuxserver.io build of the nzbget Docker container. This means the issues are likely inherent to the nzbget app and/or the par/unrar/7zip extensions.

Note: This is a work in progress - I'm not sure I have everything working properly yet.

This image is basically the same as the linuxserver image; it's a fork from the main development stream. At some point I may make this image build from theirs so that any updates are easier to roll forward. The differences are two files:

- init: renaming the stock init to "init2" in the image and adding this new file allows a simple script to run before nzbget starts. My init simply writes adapter.id to /config and then calls the init2 script to start nzbget as per normal. It also copies limit_bw to /config so that it can be executed by the docker host directly from the mounted volume while the container is running.
- limit_bw: a script that needs to be run by a cron job on the docker host. It reads adapter.id and then applies the requested bandwidth limit to the docker virtual ethernet adapter for the container in which nzbget is running. The resulting bandwidth limits are rock solid, instead of simply varying the amount of time nzbget uses bandwidth to create the impression of using less bandwidth and to make bandwidth available for other applications.

To get the bandwidth limiting working to your requirements, you'll need to do a few things on your docker host. The magic happens when you add a cron job on the docker host to run limit_bw at regular intervals. I set mine up to run every 60 seconds, so you only have a maximum of that long to wait before any changes to the allocated bandwidth are implemented. To change the allocated bandwidth, just alter the parameters in the limit_bw script; any changes will be applied the next time the cron job executes. You can reach the limit_bw script for editing either by exec-ing into the container or by simply editing it in the mounted volume directly from the docker host.

See the 'tc' man page for which suffixes you can use if you want to limit to kbps or Mbps etc. Here's an extract of the relevant part to save you the bother of finding it:

> Bandwidths or rates. These parameters accept a floating point number, possibly followed by either a unit (both SI and IEC units supported), or a float followed by a '%' character to specify the rate as a percentage of the device's speed (e.g. 5%, 99.5%). Specifying the rate as a percentage means a fraction of the current speed; if the speed changes, the value will not be recalculated. To specify in IEC units, replace the SI prefix (k-, m-, g-, t-) with the IEC prefix (ki-, mi-, gi- and ti-).
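The cron side can be a single root crontab entry on the docker host. Cron's finest scheduling granularity is one minute, which matches the 60-second interval described above; the script path here is an assumption about where the mounted /config volume lives:

```
# /etc/crontab-style entry: minute hour day month weekday user command
* * * * * root /path/to/nzbget/config/limit_bw
```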
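The post doesn't include the scripts themselves. As a rough sketch of the init side, one common way to record which host-side veth belongs to a container is the iflink trick; everything here (the lookup method, the paths, the init2 hand-off) is an assumption for illustration, not the author's actual code:

```shell
#!/bin/sh
# Hypothetical sketch of the replacement init script. The iflink lookup and
# all paths are assumptions; the post doesn't show what adapter.id contains.

get_peer_ifindex() {
    # Inside a container, an interface's iflink is the ifindex of its
    # host-side veth peer - enough for the host to identify the adapter.
    cat "/sys/class/net/$1/iflink"
}

# Only meaningful inside the running container, so guard the side effects:
if [ -d /config ] && [ -e /sys/class/net/eth0/iflink ]; then
    get_peer_ifindex eth0 > /config/adapter.id   # share via the mounted volume
    cp /usr/local/bin/limit_bw /config/          # expose limit_bw to the host
    exec /init2                                  # start nzbget as per normal
fi
```

On the docker host, the recorded ifindex can then be matched to a veth name in the output of `ip link`.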
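For the limit_bw side, a minimal sketch using tc's token-bucket filter (tbf) on the container's veth; the device name, rate, burst, and latency values are placeholders, and the command is echoed rather than executed because tc needs root:

```shell
#!/bin/sh
# Hypothetical limit_bw-style sketch - not the author's actual script.

limit_bw() {
    dev=$1    # host-side veth of the nzbget container (derived from adapter.id)
    rate=$2   # tc rate string: SI (kbit, mbit) or IEC (kibit, mibit) suffixes

    # tbf (token-bucket filter) caps the device at the requested rate;
    # 'qdisc replace' updates any existing qdisc, so repeated cron runs are safe.
    # Echoed here for illustration - drop the echo to apply it for real.
    echo tc qdisc replace dev "$dev" root tbf rate "$rate" burst 32kbit latency 400ms
}

limit_bw veth1a2b3c4 5mbit   # prints the tc command for a 5 Mbit/s cap
```

Because each run simply re-applies whatever rate the script currently specifies, edits to the script take effect on the next cron invocation with no restart needed.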