Nothing happening

Post any problems / bugs / issues that are Mylar-related in here.
maglorsmith
Posts: 3
Joined: Sat Nov 21, 2020 11:54 am

Nothing happening

Post by maglorsmith »

Hello,

I am using version 0.4.9 (master) in a Docker container provided by linuxserver.io. I was originally able to set up and download around 20 issues of a series; however, I had to restart a few times to get it to pull the next issue. Sometimes it would pull 1 issue and then need a restart; other times it would do 5 and then need one. I currently have issues 1-19, 21, and 22 of a series. I have confirmed that issues 20 and 23 are available at nzb.su, and that the API tests to both nzb.su and SAB are working.

The log has the following message every 5 minutes:

Code:

WARNING :: mylar.Process.423 : ThreadPoolExecutor-0_0 : There were no files located - check the debugging logs if you think this is in error.
From a quick Google search this seems to be an old error related to permissions; however, all the permissions seem fine as far as I can tell. The fact that it did sort of work for a little while also makes me think it's not a permissions fault (happy to be wrong).

I also have specific_1 to specific_53 log files that just seem to repeat:

Code:

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/mylar3/mylar/search.py", line 3093, in searchforissue
    foundNZB, prov = search_init(
  File "/app/mylar3/mylar/search.py", line 515, in search_init
    findit = NZB_SEARCH(
  File "/app/mylar3/mylar/search.py", line 1254, in NZB_SEARCH
    bb = findcomicfeed.Startit(
  File "/app/mylar3/mylar/findcomicfeed.py", line 101, in Startit
    feeds.append(feedparser.parse(mylar.EXPURL + "search/rss?q=%s&max=50&minage=0%s&hidespam=1&hidepassword=1&sort=agedesc%s&complete=0&hidecross=0&hasNFO=0&poster=&g[]=85" % (joinSearch, max_age, size_constraints)))
  File "/usr/lib/python3.8/site-packages/feedparser/api.py", line 214, in parse
    data = _open_resource(url_file_stream_or_string, etag, modified, agent, referrer, handlers, request_headers, result)
  File "/usr/lib/python3.8/site-packages/feedparser/api.py", line 114, in _open_resource
    return http.get(url_file_stream_or_string, etag, modified, agent, referrer, handlers, request_headers, result)
  File "/usr/lib/python3.8/site-packages/feedparser/http.py", line 158, in get
    f = opener.open(request)
  File "/usr/lib/python3.8/urllib/request.py", line 525, in open
    response = self._open(req, data)
  File "/usr/lib/python3.8/urllib/request.py", line 542, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.8/urllib/request.py", line 1393, in https_open
    return self.do_open(http.client.HTTPSConnection, req,
  File "/usr/lib/python3.8/urllib/request.py", line 1353, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno -3] Try again>
Traceback (most recent call last):
  File "/usr/lib/python3.8/urllib/request.py", line 1350, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.8/http/client.py", line 1255, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1301, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1250, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1010, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 950, in send
    self.connect()
  File "/usr/lib/python3.8/http/client.py", line 1417, in connect
    super().connect()
  File "/usr/lib/python3.8/http/client.py", line 921, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.8/socket.py", line 787, in create_connection
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
  File "/usr/lib/python3.8/socket.py", line 918, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -3] Try again
These logs seem to have stopped around 24 hours ago after running for about 18 hours.

The Mylar log has basically been just those 2 things for around the last 15 days, other than the initial configuration and a few connection errors to api.nzb.su.

If you need anything else, please let me know, but hopefully there is enough info there for you to point me in the right direction.

Thanks for your time
Cheers
evilhero
Site Admin
Posts: 2883
Joined: Sat Apr 20, 2013 3:43 pm

Re: Nothing happening

Post by evilhero »

The traceback errors in the specific logs you posted pertain to the experimental search option, which should be working fine - but if you're having problems with it, you might want to disable it for the time being.
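As a side note on decoding that traceback: the "[Errno -3] Try again" comes from getaddrinfo and is EAI_AGAIN, a temporary DNS resolution failure (common when a Docker container briefly loses contact with its DNS server) rather than anything Mylar-specific. A minimal sketch to confirm, assuming Python 3 on Linux (the host name is just a placeholder):

```python
import socket

# "[Errno -3] Try again" in the traceback is getaddrinfo's EAI_AGAIN:
# a *temporary* failure in DNS name resolution, not a permissions issue.
print(socket.EAI_AGAIN)  # -3 on Linux

def resolves(host: str) -> bool:
    """Return True if `host` currently resolves from this container."""
    try:
        socket.getaddrinfo(host, 443, 0, socket.SOCK_STREAM)
        return True
    except socket.gaierror:
        return False
```

If `resolves()` intermittently returns False from inside the container for hosts that resolve fine on the Docker host, the problem is the container's DNS setup rather than Mylar.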

The "There were no files located" message is just your Folder Monitor firing off every 5 minutes to check for new files. It's normal, so you can ignore that aspect.

Let's try and figure out what's happening, but first we need to change some settings:

Ok, so first restart Mylar.

Put Mylar into debug/verbose mode - go into the History tab / View Logs / Turn Verbose Logging ON. You'll get a quick blurb in the logs that verbose mode has been enabled.

Once you have proper logging enabled, try to do a few searches.

Once you have a few searches done, and hopefully some that didn't complete for w/e reason, go into the Configuration section of Mylar and click on the CarePackage button in the upper right. It will generate a zip file containing your mylar.db, cleaned logs, and a cleaned config.ini, and download it in your browser window.

Then just find a place where you can upload the carepackage (mega, dropbox, filelocker, w/e) and paste the link here so I can check it out.
maglorsmith
Posts: 3
Joined: Sat Nov 21, 2020 11:54 am

Re: Nothing happening

Post by maglorsmith »

Hi evilhero,

Thanks for getting back to me. I have disabled the experimental search feature as per your suggestion and ran a search for 14 issues from 2 different series. I have been able to find some of these manually when searching, so I believe they should pull fine; however, I have not checked every release on the list.

Care Package

Above is the care package link for your viewing pleasure.

Thanks again for your time, and if you're in the States I hope you have a good Thanksgiving. If you're not in the States, then I hope you have a good rest of the week.

Cheers
Mag
evilhero
Site Admin
Posts: 2883
Joined: Sat Apr 20, 2013 3:43 pm

Re: Nothing happening

Post by evilhero »

Sorry for the delay in response, the BF sales kinda took me out of things...

Checked out the logs and didn't notice anything problematic. But then I saw what you were searching for - you have your Usenet Retention set too low within Mylar. You have it set to 1500 days, whereas the items you were searching for are around the 2600-day mark - so because of that, Mylar's not getting back the results that it needs.

So yeah, change the retention to something like 3500, and hopefully your usenet provider has the retention to download them fully.
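If you prefer editing the file directly while Mylar is stopped, the same setting lives in config.ini - a sketch only, with the section/key names assumed from a typical Mylar3 install (verify them against your own config.ini before editing):

```
[General]
# Maximum age (in days) of posts Mylar will accept from search results.
# Set this above the age of the oldest issues you expect to grab.
usenet_retention = 3500
```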

After doing so, restart Mylar so that you can come at it from a clean perspective and everything will be fresh (queues won't be locked, possibly errored, etc.).

Also, it looks like you're having problems getting the pull from the weekly pull site. When you go to the Weekly page, does it throw any errors? And if you do a Recreate Pull-list from that page, does it work? Does it update the last-updated time at the bottom to the current time?

Thanks for the well wishes. I'm not in the States, so my weekend consisted of scouting online deals to get a jumpstart on Christmas shopping ;)

Cheers,
maglorsmith
Posts: 3
Joined: Sat Nov 21, 2020 11:54 am

Re: Nothing happening

Post by maglorsmith »

All good, I'm in the same boat - hectic week with the kids finishing school this week, so we've been doing Xmas cards for the classes, presents for the teachers, etc.

I didn't even see that retention setting. I've adjusted it, and some of the old comics are now coming in, which is great. I'm just not sure about that 1 issue that was released in October this year and keeps being missed.

Also, I use NZBHydra2 for all my indexers - is there support for this somewhere? Using just nzb.su is working, but for older stuff I would like to be able to search a variety of locations to see if it can be found. Or would it be better to use torrents?

Not seeing any errors on the Weekly page; I did the re-create and the time updated down the bottom. I'll see what happens when I start adding more stuff in, now that I can get the downloads happening.

Do you have any recommendations for a viewer that works well? Preferably offline / Android compatible.

Thanks again for your time, really appreciate it. Hope you got some good specials :-)

Cheers
evilhero
Site Admin
Posts: 2883
Joined: Sat Apr 20, 2013 3:43 pm

Re: Nothing happening

Post by evilhero »

Late reply, but here's the reply at least:

- You can use NZBHydra2 by adding it as a newznab indexer within Mylar. You just need to make sure that you add the [nzbhydra] flag to the end of the Newznab Name field (i.e. NZBHydra2 [nzbhydra]). Without the [nzbhydra] flag added to the Newznab Name (not the Newznab Host field), Mylar might not get the responses back correctly.
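As a sketch of what those newznab fields would look like (the host/port assumes NZBHydra2's default of 5076 on the same machine, and the API key is a placeholder - substitute your own values):

```
Newznab Name : NZBHydra2 [nzbhydra]
Newznab Host : http://localhost:5076
Newznab API  : <your-hydra-api-key>
```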

As far as viewers go, there are a few, but it depends on how you're using them:
server-side (runs on a server, and you connect in with a client)
- Codex
- Komga
- Ubooquity
- Gazee (no longer maintained) :(

Mylar has a small ported version of Gazee cooked in (taken with permission, and much love to hubbcaps & barbequesauce). It's a very basic reader, though, and is a bit glitchy due to how we have Mylar using it - but it might work for users with basic needs who don't want to install a separate application.

I can't speak for the clients themselves, unfortunately, as I don't really use any - honestly, I don't even have time to read comics anymore, and haven't for years. Ironic, I know. But there might be someone else who can chime in if they see this thread.