Mylar CPU usage

Post any problems / bugs / issues that are Mylar-related in here.
Aimhere
Posts: 98
Joined: Mon Apr 06, 2015 2:32 pm

Mylar CPU usage

Post by Aimhere »

Hi,

I'm running Mylar on a Synology DiskStation, and Mylar seems to be continually using most of the CPU. CPU usage for the Mylar process runs between 30% and 50%, often peaking as high as 60%. If I stop other running services (e.g. NZBGet, SickRage), Mylar will go as high as 80% at times. Needless to say, this is really slowing everything else down.

I have 141 comics in my library, with 362 issues currently "wanted" (and none "snatched"). Just as an exercise, I marked all the series which have ended (and for which I have a complete set of issues) as "paused", but I don't think this was necessary (and it didn't make any difference anyway).

Is there anything I can do to reduce the CPU hit caused by Mylar? Or any tweaking I can do elsewhere to speed things up in general? I realize this Diskstation is underpowered when it comes to the CPU, but there must be something...

Mylar version is "72065be604bc167d56ce5cb28f902626f2f0f754 (development)", Diskstation OS is DSM 4.3.

Aimhere

P.S.:

Code: Select all

# of Series you're watching: 141
# of Issues you're watching: 3300
# of Issues you actually have: 2938
evilhero
Site Admin
Posts: 2883
Joined: Sat Apr 20, 2013 3:43 pm

Re: Mylar CPU usage

Post by evilhero »

TBH the CPU usage aspect is a bit outside of my wheelhouse, as I haven't had any CPU issues when running Mylar (and I have over 400 series being watched).

All I can suggest is to turn off the folder monitor (if you don't use it), and possibly the RSS feeds as well - although if you disable the RSS feeds you'll be restricted to a 6-hour search interval and might miss some issues due to stagnation.

I'm guessing what's happening is that you have multiple RSS searches firing off while a background search is ongoing. Since you have 300+ wanted issues, that background search has to work through something like:

(300 wanted issues * # of indexers * issue-number variations (1, 01, 001)) + (# of alternate search names * # of indexers)
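
To put rough numbers on that formula, here is a quick back-of-the-envelope sketch. The indexer and alternate-name counts below are illustrative assumptions, not values read from Mylar; the point is just that a single backlog cycle can easily reach thousands of queries.

Code: Select all

# Rough estimate of how many indexer queries one backlog pass can generate.
# The counts below are illustrative assumptions, not Mylar internals.
wanted_issues = 300       # issues currently marked "Wanted"
indexers = 5              # e.g. 3 Newznab providers + OMGWTFNZBs + Experimental
issue_variations = 3      # "1", "01", "001"
alternate_names = 2       # alternate search names per series (assumed)

queries = wanted_issues * indexers * issue_variations + alternate_names * indexers
print(queries)            # -> 4510 queries before the cycle finishes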

This causes some really bad locking and might account for your CPU usage. My goal is to switch Mylar over to using the RSS feeds for constant monitoring and do away with the timed API searches entirely, except when an issue is manually searched (the way Sonarr currently does it). Of course, if you don't run Mylar 24/7 then you're pretty much screwed with that approach, but I have an idea on how to resolve that portion. I just haven't implemented either method yet, as it requires a fair bit of backend code change and a lot of it is very intertwined.
Aimhere
Posts: 98
Joined: Mon Apr 06, 2015 2:32 pm

Re: Mylar CPU usage

Post by Aimhere »

Where in the settings is this "folder monitor" you speak of?

Would increasing the RSS Interval Feed Check setting (from 15 minutes to, say, 120) have any effect on the CPU usage?

Also, I currently have three Newznab providers configured, and have OMGWTFNZBs and Experimental Search selected as well. Is this too many? I could cut it back, if you think it would help.
evilhero
Site Admin
Posts: 2883
Joined: Sat Apr 20, 2013 3:43 pm

Re: Mylar CPU usage

Post by evilhero »

Aimhere wrote: Where in the settings is this "folder monitor" you speak of?
[screenshot pointing out the folder monitor setting]
Aimhere wrote: Would increasing the RSS Interval Feed Check setting (from 15 minutes to, say, 120) have any effect on the CPU usage?
Doubtful - the RSS check wouldn't be what's doing the damage. It just grabs the RSS feed from the site, chucks it into the db, and then searches against the db for the given titles. Increasing/decreasing the setting would have a very minimal effect, if any.
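
For anyone curious why the interval barely matters, here's a minimal sketch of that flow. This is not Mylar's actual code - the feed URL and table layout are made up for the example - but it shows that the only network cost is the one feed fetch, and every wanted-issue lookup afterwards is just a cheap local database query.

Code: Select all

# Illustrative sketch of an RSS check cycle (not Mylar's actual code).
# The feed URL and table layout are assumptions made for the example.
import sqlite3
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://indexer.example/rss?t=7030"  # hypothetical comics feed

db = sqlite3.connect("rss_cache.db")
db.execute("CREATE TABLE IF NOT EXISTS rss_cache (title TEXT UNIQUE, link TEXT)")

# 1) Grab the feed once and chuck the entries into the db.
feed = urllib.request.urlopen(FEED_URL, timeout=30).read()
items = [(i.findtext("title"), i.findtext("link"))
         for i in ET.fromstring(feed).iter("item")]
db.executemany("INSERT OR IGNORE INTO rss_cache (title, link) VALUES (?, ?)", items)
db.commit()

# 2) Search against the local db for each wanted title - no extra API hits.
for row in db.execute("SELECT title, link FROM rss_cache WHERE title LIKE ?",
                      ("%Saga%029%",)):
    print(row)
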
Aimhere wrote: Also, I currently have three Newznab providers configured, and have OMGWTFNZBs and Experimental Search selected as well. Is this too many? I could cut it back, if you think it would help.
I don't think that would help, although it would definitely lessen the amount of time it takes to do an API (backlog) search, since it wouldn't have to cycle through so many providers. Honestly, Experimental and one good Newznab provider is all you need, since all the comics get indexed from the same two groups - it just depends on how a given provider returns the results via its API.

OMGWTFNZBs will be removed soon - I was under the impression that they indexed comics, and they don't, at all. So having it included is doing absolutely nothing but increasing the cycling time when searching for issues and possibly increasing the load.

If I were to suggest anything (and it's just a suggestion), it would be to have Experimental and one good indexer (dog, nzbsu, nzbgeek, usenet-crawler, pfmonkey, oz). Of course it's up to you - but having multiple indexers with exactly the same content (Experimental indexes the raw headers, so it is a bit different) seems a bit wasteful to me, unless of course you're limited by API hits on those indexers.
Aimhere
Posts: 98
Joined: Mon Apr 06, 2015 2:32 pm

Re: Mylar CPU usage

Post by Aimhere »

Thanks for pointing out the folder monitor setting (turns out it wasn't even enabled anyway). And I'll leave the RSS stuff alone.

As for the search providers, I will cut it back to just Experimental and Usenet-crawler. I didn't realize having more than one Newznab could be a bad thing. :?
Aimhere
Posts: 98
Joined: Mon Apr 06, 2015 2:32 pm

Re: Mylar CPU usage

Post by Aimhere »

Well, I've disabled all search providers except Usenet-crawler and Experimental, yet Mylar's process still seems to be using too much CPU. With Mylar enabled, my downloading in NZBGet (as seen in the Synology's data-transfer monitor) is very "spiky", with frequent severe dips in speed; if I disable Mylar, the downloads are consistently faster (with far fewer dips).

I dunno, maybe it's too much to expect a Python-based application to work as quickly or efficiently as one made in a fully compiled language. Or maybe I'm simply asking too much of my poor Synology's low-end CPU... :(
capGrundy
Posts: 51
Joined: Thu Nov 09, 2017 12:25 am

Re: Mylar CPU usage

Post by capGrundy »

I know this is super old, but in case someone else still has this issue: I found that changing the download_scan_interval from 5 to 15 dropped the CPU back down to nominal levels.
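
For reference, a hedged example of making that change outside the web UI. It assumes the option lives under [General] in Mylar's config.ini, and the path below is a hypothetical Synology location - both are assumptions, so adjust for your own install.

Code: Select all

# Bump the folder-monitor/post-processing scan interval from 5 to 15.
# Assumes "download_scan_interval" sits under [General] in config.ini;
# the path below is a hypothetical Synology location - adjust as needed.
import configparser

CONFIG_PATH = "/volume1/@appstore/mylar/config.ini"

cfg = configparser.ConfigParser()
cfg.read(CONFIG_PATH)
print("current:", cfg.get("General", "download_scan_interval", fallback="5"))

cfg.set("General", "download_scan_interval", "15")
with open(CONFIG_PATH, "w") as fh:
    cfg.write(fh)
# Restart Mylar (or stop/start the package in DSM) so the change takes effect.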