Memory Leak again

leaderdog
Posts: 377
Joined: Sun Apr 26, 2015 1:52 pm

Post by leaderdog »

Hi Evilhero,

I've been watching my RAM usage get up to 1.7GB again. That's more than my video security camera software. It takes about a day to get to that level. I just hit carepackage so you can take a look. I don't think the update you pushed today addressed it, otherwise I'd have waited another day or so to send this along.

Mylar Version: python3-dev
-- git (build 77bf272b0d7ccf94ef927ebd0c2c52cfcc5fe467)
Python Version : 3.7.7
Windows 10

Here's the carepackage:
https://www.dropbox.com/s/74j6v2n5shuir ... e.zip?dl=0

I think the previous update was pushed yesterday; I believe that's the last time I restarted Mylar.

If the choice is between RSS working correctly and high memory usage, I'll deal with it ;) I can always restart it daily to compensate. But it would be nice if it weren't chewing up so much RAM. It really slows down manual post-processing and adding new titles.

Thanks for any assistance.
evilhero
Site Admin
Posts: 2883
Joined: Sat Apr 20, 2013 3:43 pm

Post by evilhero »

Nothing has changed in any of the items that would cause a search queue to blow up or fluctuate to a high memory level, so at the moment at least, I'm not quite sure what would be causing that.

Your RSS is taking ~19-23 minutes to complete a scan, which fires off every 60 minutes. Mylar logs the time taken per scan as well:

Code:

[RSS-FEEDS] RSS dbsearch/matching took: 0:23:35.513179
I've been working on an RSS improvement that still needs some additional tweaks and fixes before it can be merged into a branch, but it's coming along quite nicely. Compare the time above with what the new code does using the db you included in your carepackage (with the same providers, obviously):

Code:

[RSS-QUERY] Searched through RSSDB looking for 380 Wanted items in 25468 RSS entries. Rough matching to 0 items.
[RSS-FEEDS] RSS dbsearch/matching took: 0:00:04.460595
So, yeah, it's a bit of an improvement over the existing method. The timing obviously changes depending on the number of Wanted items, etc. - if it had actually rough-matched a few items in your RSS, the time would increase - but I've tested it against an rssdb of over 100,000 entries and it clocks in between 5s and 30s with several rough matches, as opposed to upwards of 40-60 minutes (with several items being roughly matched therein).
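A speedup of that magnitude is consistent with building an index over the RSS entries once and doing a cheap lookup per Wanted item, instead of rescanning every entry for every item. This is only a minimal sketch of that general idea, with hypothetical names - it is not Mylar's actual code:

```python
import re
from collections import defaultdict

def normalize(title):
    """Lowercase and collapse non-alphanumerics so titles compare loosely."""
    return re.sub(r'[^a-z0-9]+', ' ', title.lower()).strip()

def build_rss_index(rss_entries):
    """Index all RSS entries by normalized title once, up front."""
    index = defaultdict(list)
    for entry in rss_entries:
        index[normalize(entry['title'])].append(entry)
    return index

def rough_match(wanted_items, index):
    """Return the Wanted items that have at least one candidate RSS entry."""
    matches = {}
    for item in wanted_items:
        key = normalize(item)
        if key in index:  # O(1) dict lookup instead of a full scan
            matches[item] = index[key]
    return matches
```

With 380 Wanted items against 25,468 entries, each item costs one hash lookup rather than a pass over the whole feed database, which is the kind of change that can turn minutes into seconds.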

The only downside to the new method is that it has to do a one-time conversion of your existing rssdb. For a large dataset this can take upwards of a normal RSS scan time during startup, with no indication to the user since it all happens in the background - something I'll need to address beforehand, since the GUI literally won't load while the conversion is running at startup.
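One common way around that kind of startup stall is to run the one-time conversion in a background thread so the GUI can come up immediately, with RSS scans waiting on a flag until the conversion finishes. A rough sketch of the pattern, entirely hypothetical and not Mylar's code:

```python
import threading

class RssDbMigrator:
    """Run a one-time rssdb conversion in a background thread so the
    GUI can load immediately (hypothetical sketch, not Mylar's code)."""

    def __init__(self, convert_fn):
        self._convert_fn = convert_fn
        self.done = threading.Event()  # RSS scans check this before querying

    def start(self):
        t = threading.Thread(target=self._run, daemon=True)
        t.start()
        return t

    def _run(self):
        self._convert_fn()  # the long-running conversion
        self.done.set()     # signal that the new rssdb is ready
```

The GUI thread starts the migrator and carries on; anything that needs the converted rssdb calls `migrator.done.wait()` (or skips the scan) until the flag is set.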

Post by leaderdog »

That does look like a nice improvement.

I'm not sure when the high memory usage started; I believe it was two or so weeks ago.

Right now it's sitting at 657.7MB and climbing. I haven't done anything with Mylar today other than update it.

When I was manually adding and scanning in books (the import doesn't work well for me), post-processing was taking so long that I realized the memory usage was high. After a restart it was snappy and responsive again.

I'll just keep restarting for now. Or should I delete all the logs, let it do its thing, and send another carepackage when it gets over a gig? Maybe something will be more obvious - or do you think it's the RSS that's accumulating memory?

I'll delete my logs anyways. ;)
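For what it's worth, one generic way to narrow down where a long-running Python process is accumulating memory is the standard-library tracemalloc module: snapshot before and after a suspect workload and diff the allocation sites. This is just a diagnostic sketch, not a Mylar feature:

```python
import tracemalloc

tracemalloc.start()
before = tracemalloc.take_snapshot()

# Stand-in for the suspect workload (e.g. one RSS scan or a refresh):
suspect = [bytearray(1024) for _ in range(1000)]

after = tracemalloc.take_snapshot()
# Top allocation sites that grew between the two snapshots:
for stat in after.compare_to(before, 'lineno')[:5]:
    print(stat)
```

Run around whichever operation you suspect (an RSS scan, a refresh), and the lines that keep growing between snapshots point at the leak.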

Post by leaderdog »

Hi Evilhero,

It's up to a gig of RAM usage again; here's the new carepackage. I deleted everything in the logs and restarted Mylar yesterday. It did its job for what was available via Usenet and CV. Maybe something will stand out as to why Mylar is accumulating RAM.

Is this just me again, or are you or anyone else experiencing high RAM usage? Or is it a Windows 10 issue? I'm on Win10 Ver: 20H2 Build 19042.746.

Here's the link:
https://www.dropbox.com/s/mrtedrrb3lfgk ... e.zip?dl=0

Post by leaderdog »

I noticed by accident this morning that I had over 100 files sitting in a loading state in Manage Comics.

Maybe that's why the Mem usage is so high?

I've no idea why any of them would have been stuck. I selected them all and did a refresh, so we'll see if that has any effect on how much RAM gets used.

Post by leaderdog »

OK, looks like none of them are refreshing. And something threw an error that threw another error. So I'm passing this on to you. :)

Carepackage:
https://www.dropbox.com/s/maxkhe3jh9v42 ... e.zip?dl=0

Post by leaderdog »

update:

I've killed Mylar for now. It just keeps adding more titles that get stuck refreshing. My first report was slightly off: it was just onto the 4th page of 25; now it's on the 5th page of 25.

Hopefully something stands out. Normally I could find an offending title that just breaks the program, but I didn't notice one in the logs after a restart, so I don't understand why it's hanging on the refresh. And as far as I know there's no way to disable that, is there? I still kind of wish the older series would refresh only once a year or once a quarter; they don't change often. These are all API hits on ComicVine, from my understanding, so with a huge library it's just a constant round of hits to ComicVine.
zeus163
Posts: 15
Joined: Mon Feb 01, 2021 5:18 am

Post by zeus163 »

I'm having, I believe, the same issues as well. It started yesterday when I was either adding something to my Wanted list or working on cataloging 6 years of 0-days/series.

When I do a search in Mylar, nothing shows up - I think that's the import process - and adding a book using the ID found on CV just seems to be stuck in a loop of (Comic information is currently being loaded), with nothing ever showing up. I've restarted, shut down, and even restarted my computer.

I'm only replying here because I think what I'm experiencing is the same or similar to leaderdog.

Post by evilhero »

ComicVine has been having problems since early today. The API isn't responding, so you really can't do anything until they fix it. You'll get errors in weird places, searches not happening, etc., until they get it back up and running.

Post by leaderdog »

Ah yes, good old ComicVine. ;) That was the reason for the 5 pages of comics stuck in a "loading" pattern.

But I guess that should be classified as a bug or broken feature?

I'm not sure if this is possible, but if a refresh fails because the API is dead, shouldn't Mylar halt refresh operations for, say, 5 minutes and then retry the one it failed on? I'm assuming that if CV changed a book so it no longer existed, it would throw a different error than the API simply not responding. If that's the case, Mylar should temporarily stop refreshing books to avoid this situation.

It just seems counterproductive to have pages upon pages of titles sitting in refresh.
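The halt-and-retry idea could look something like the sketch below: treat an unreachable API as a signal to pause the whole refresh queue rather than letting every book pile up in a stuck state. This is entirely hypothetical, with made-up function names - not Mylar's actual refresh code:

```python
import time

def refresh_with_backoff(refresh_one, comic_ids, pause=300, max_retries=3):
    """Refresh comics, pausing the whole queue when the API looks down.

    `refresh_one(cid)` is a hypothetical callable that returns True on
    success and False when the API itself is unreachable (a deleted book
    would presumably raise a different, per-book error instead).
    Returns the list of comics still unrefreshed if we give up.
    """
    remaining = list(comic_ids)
    while remaining:
        cid = remaining[0]
        for attempt in range(max_retries):
            if refresh_one(cid):
                remaining.pop(0)  # success: move on to the next book
                break
            # API looks dead: halt ALL refreshing for a while, then
            # retry the same book instead of failing down the list.
            time.sleep(pause)
        else:
            # Still failing after max_retries pauses: give up on the run.
            return remaining
    return []
```

The key difference from a naive loop is that a dead API stalls the queue on one item instead of marking page after page of titles as stuck.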

And whatever is causing the memory leak is still there; it's at 1.1GB from overnight. I had shut Mylar down to keep the refresh list from getting bigger until CV fixed their API issue.

This memory leak isn't something you're seeing with your Windows 10 installation?
Or anyone else's, for that matter? Could it be related to my version of Python (3.7.7),
or does a requirement possibly need updating?

I don't understand how no one else is having this problem but me. It can't be my hardware, as nothing else is running amok.

Or is it my largish collection?

Bragging Rights
# of Series you're watching: 7172
# of Series you're watching that are continuing: 350
# of Issues you're watching: 48724
# of Issues you actually have: 48301
... total HD-space being used: 1.96 TB