New issues not downloading automatically

Post any problems / bugs / issues that are Mylar-related in here.
leaderdog
Posts: 377
Joined: Sun Apr 26, 2015 1:52 pm

Re: New issues not downloading automatically

Post by leaderdog »

Restarted, no change.

Still downloads great when I hit manual search, but it won't download anything off the new releases by itself. Is it too late now? As in, a couple of days past, will it not search through the new releases until it's finished with the huuuuge wanted list?
evilhero
Site Admin
Posts: 2887
Joined: Sat Apr 20, 2013 3:43 pm
Contact:

Re: New issues not downloading automatically

Post by evilhero »

It might be because your rss cache / feeds don't hold those issues anymore, so polling against them isn't going to change anything (since, I'm assuming, you already grabbed the ones you didn't have).

You would have to try and re-download one of the issues you already grabbed to see if the send-to-sab is indeed working correctly.

I'm concurrently running against sab and torrents, and both are working for me - so atm I hate to say it's a localized issue, but it appears to be. Is it possible you're starting Mylar on the wrong IP interface, since your server has 2 IPs? I can see why you'd need to use the host_return field then, to ensure that when it sends the request back it hits the correct IP. Although it shouldn't matter if you have it set to 0.0.0.0 - you'd be able to hit it regardless of the network.
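For reference, the relevant keys in Mylar's config.ini look like this (values shown are illustrative; 0.0.0.0 binds the web server to all interfaces, and host_return only matters when the callback has to be forced to one specific address):

```ini
[General]
http_port = 8090
; bind to all interfaces so the UI and callbacks work on either IP
http_host = 0.0.0.0
; optional: force the return address used in callbacks (empty = default)
host_return = ""
```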
leaderdog
Posts: 377
Joined: Sun Apr 26, 2015 1:52 pm

Re: New issues not downloading automatically

Post by leaderdog »

I haven't mass downloaded the files yet via newsleecher. I was just playing with Mylar to see if it would work.

If they've dropped out of the rss feed now, then Mylar won't grab them automatically?

When you say send-to-sab may not be working correctly, what do you mean? If I do a manual search, it finds the file, sends it to sab, downloads it, and then renames and moves it to the correct location. Everything seems to be working correctly on that front.

I've tried several files that I know are on the usenet groups and they've all downloaded with manual search. Mylar just isn't triggering automatically.

It probably is something local. How do I know which IP Mylar loaded up under? (I had something like this before, but we resolved it by putting 0.0.0.0 in the Mylar host field. That is still the same, and I also checked the config.ini to make sure it was there as well.)
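One generic way to check which interface Mylar ended up listening on (this is not a Mylar feature, just a quick sketch) is to probe its port from each of the server's addresses; the 192.168.x.x address below is a placeholder for your actual second IP:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe Mylar's port (8090 in this config) on each address, e.g.:
# port_open("127.0.0.1", 8090), port_open("192.168.1.10", 8090)
```

If the port only answers on one address, Mylar bound to that interface rather than 0.0.0.0.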

Mighty peculiar, this ;) I just updated to the latest build as well. Still behaves the same.
evilhero
Site Admin
Posts: 2887
Joined: Sat Apr 20, 2013 3:43 pm
Contact:

Re: New issues not downloading automatically

Post by evilhero »

Well, the problem might not be the automated firing so much as the feeds you're polling no longer containing the given issues.

When you do a manual search, Mylar first fires off an rss request against the local dB to see if the issue was previously located in a past rss feed. If it can't find a match, it then does an api search against the given indexer to locate the issue. From what I'm gathering, the rss isn't matching, but you're getting correct matches on the api.
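That two-stage lookup can be sketched roughly like this (function and field names here are illustrative, not Mylar's actual internals):

```python
def find_issue(issue, rss_cache, api_search):
    """Two-stage lookup: check the locally cached rss entries first,
    then fall back to a live api query against the indexer."""
    # Stage 1: match against entries previously harvested from rss feeds
    for entry in rss_cache:
        if entry["series"] == issue["series"] and entry["number"] == issue["number"]:
            return entry, "rss"
    # Stage 2: query the indexer's api directly
    result = api_search(issue)
    if result is not None:
        return result, "api"
    return None, "none"
```

In the logs above, the manual search reports `[RSS]`, meaning stage 1 already matched and the api fallback was never needed.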

The rss check fires off every 20 mins by default, unless it's been changed. It adds all the rss feed data to the local rss dB so that the next query against it is up to date.

Do you have Rss enabled on the provider tab? And is it set to 20 minutes?

By default, unless it's been changed, Mylar will do a force search against your current wanted list every 6 hrs (360 minutes in the GUI). This force search performs an rss check and then an api check for the issues.
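Those two schedules correspond to these config.ini keys (the values below are the defaults described above):

```ini
enable_rss = 1
rss_checkinterval = 20
search_interval = 360
```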
leaderdog
Posts: 377
Joined: Sun Apr 26, 2015 1:52 pm

Re: New issues not downloading automatically

Post by leaderdog »

Hi Evilhero,

Yep, rss is enabled and left at 20 mins.

Should I delete the rss database (or is that even possible or helpful)?

Here's the config.ini in case something looks amiss to you.

Code:

[General]
config_version = 6
dbchoice = sqlite3
dbuser = ""
dbpass = ""
dbname = ""
dynamic_update = 4
comicvine_api = ************************************
cvapi_rate = 2
http_port = 8090
http_host = 0.0.0.0
http_username = ""
http_password = ""
http_root = /
enable_https = 0
https_cert = ""
https_key = ""
https_chain = ""
https_force_on = 0
host_return = ""
api_enabled = 0
api_key = ""
launch_browser = 1
auto_update = 0
log_dir = C:\Mylar\logs
max_logsize = 100000000
git_path = ""
cache_dir = C:\Mylar\cache/
annuals_on = 1
cv_only = 1
cv_onetimer = 1
check_github = 1
check_github_on_startup = 1
check_github_interval = 360
git_user = evilhero
git_branch = development
destination_dir = H:\
multiple_dest_dirs = ""
create_folders = 1
delete_remove_dir = 0
enforce_perms = 1
chmod_dir = 0777
chmod_file = 0660
chowner = None
chgroup = None
usenet_retention = 1500
alt_pull = 2
search_interval = 360
nzb_startup_search = 0
add_comics = 0
comic_dir = ""
blacklisted_publishers = None
imp_move = 0
imp_rename = 0
imp_metadata = 0
enable_check_folder = 1
download_scan_interval = 5
folder_scan_log_verbose = 1
check_folder = L:\Complete
interface = default
dupeconstraint = filesize
ddump = 1
duplicate_dump = L:\_comic dups
pull_refresh = 2017-01-06 13:27:00
autowant_all = 1
autowant_upcoming = 1
preferred_quality = 0
comic_cover_local = 0
correct_metadata = 0
move_files = 0
rename_files = 1
folder_format = $Publisher\$Series ($Year)
file_format = $Series $VolumeN - $Issue
blackhole_dir = ""
replace_spaces = 0
replace_char = .
zero_level = 1
zero_level_n = 00x
lowercase_filenames = 0
ignore_havetotal = 0
snatched_havetotal = 0
syno_fix = 1
allow_packs = 0
search_delay = 1
grabbag_dir = H:\
highcount = 0
read2filename = 0
send2read = 0
maintainseriesfolder = 0
tab_enable = 0
tab_host = ""
tab_user = ""
tab_pass = ""
tab_directory = ""
storyarcdir = 0
copy2arcdir = 0
arc_folderformat = $arc ($spanyears)
arc_fileops = copy
use_minsize = 0
minsize = ""
use_maxsize = 0
maxsize = ""
add_to_csv = 1
cvinfo = 0
log_level = 0
enable_extra_scripts = 0
extra_scripts = ""
enable_pre_scripts = 0
pre_scripts = ""
post_processing = 1
post_processing_script = ""
file_opts = move
weekfolder = 0
weekfolder_loc = ""
weekfolder_format = 0
locmove = 0
newcom_dir = ""
fftonewcom_dir = 0
enable_meta = 0
cbr2cbz_only = 0
ct_tag_cr = 1
ct_tag_cbl = 1
ct_cbz_overwrite = 0
unrar_cmd = None
cmtag_start_year_as_volume = 0
update_ended = 0
indie_pub = 75
biggie_pub = 55
upcoming_snatched = 1
enable_rss = 1
rss_checkinterval = 20
rss_lastrun = 2017-01-06 13:48:04
failed_download_handling = 1
failed_auto = 1
provider_order = 0, nzbfinder.ws, 1, www.usenet-crawler.com
nzb_downloader = 0
torrent_downloader = 0
[Torrents]
enable_torrents = 0
minseeds = 0
torrent_local = 0
local_watchdir = ""
torrent_seedbox = 0
seedbox_host = ""
seedbox_port = ""
seedbox_user = ""
seedbox_pass = ""
seedbox_watchdir = ""
enable_torrent_search = 0
enable_tpse = 0
tpse_proxy = ""
tpse_verify = True
enable_32p = 0
search_32p = 0
mode_32p = 0
passkey_32p = ""
rssfeed_32p = ""
username_32p = ""
password_32p = ""
verify_32p = 1
snatchedtorrent_notify = 0
rtorrent_host = ""
rtorrent_authentication = basic
rtorrent_rpc_url = ""
rtorrent_ssl = 0
rtorrent_verify = 0
rtorrent_ca_bundle = ""
rtorrent_username = ""
rtorrent_password = ""
rtorrent_startonload = 0
rtorrent_label = ""
rtorrent_directory = ""
[SABnzbd]
sab_host = http://localhost:8080
sab_username = ""
sab_password = ""
sab_apikey = d***********************8
sab_category = comics
sab_priority = Default
sab_to_mylar = 0
sab_directory = ""
[NZBGet]
nzbget_host = ""
nzbget_port = ""
nzbget_username = ""
nzbget_password = ""
nzbget_category = ""
nzbget_priority = Default
nzbget_directory = ""
[NZBsu]
nzbsu = 0
nzbsu_uid = ""
nzbsu_apikey = ""
nzbsu_verify = True
[DOGnzb]
dognzb = 0
dognzb_apikey = ""
dognzb_verify = True
[Experimental]
experimental = 0
altexperimental = 1
[Torznab]
enable_torznab = 0
torznab_name = ""
torznab_host = ""
torznab_apikey = ""
torznab_category = ""
torznab_verify = False
[Newznab]
newznab = 1
extra_newznabs = nzbfinder.ws, https://nzbfinder.ws, 1,*************************************, 4***1, 1, www.usenet-crawler.com, https://www.usenet-crawler.com, 1, ****************************************, 1****3, 1
[uTorrent]
utorrent_host = ""
utorrent_username = ""
utorrent_password = ""
utorrent_label = ""
[Transmission]
transmission_host = None
transmission_username = None
transmission_password = None
transmission_directory = ""
[Deluge]
deluge_host = ""
deluge_username = ""
deluge_password = ""
deluge_label = ""
[Prowl]
prowl_enabled = 0
prowl_keys = ""
prowl_onsnatch = 0
prowl_priority = 0
[NMA]
nma_enabled = 0
nma_apikey = ""
nma_priority = 0
nma_onsnatch = 0
[PUSHOVER]
pushover_enabled = 0
pushover_apikey = ""
pushover_userkey = ""
pushover_priority = 0
pushover_onsnatch = 0
[BOXCAR]
boxcar_enabled = 0
boxcar_onsnatch = 0
boxcar_token = ""
[PUSHBULLET]
pushbullet_enabled = 0
pushbullet_apikey = ""
pushbullet_deviceid = None
pushbullet_onsnatch = 0
[TELEGRAM]
telegram_enabled = 0
telegram_token = ""
telegram_userid = ""
telegram_onsnatch = 0
leaderdog
Posts: 377
Joined: Sun Apr 26, 2015 1:52 pm

Re: New issues not downloading automatically

Post by leaderdog »

Ok, I cleared the Mylar logs and did another manual search. Here are the results (the download did start almost immediately):

Code:

06-Jan-2017 14:20:30 - INFO    :: Thread-13 : Initiating manual search for Avengers issue: 3
06-Jan-2017 14:20:30 - INFO    :: Thread-13 : Issue Title given as : None
06-Jan-2017 14:20:30 - INFO    :: Thread-13 : search provider order is ['newznab:nzbfinder.ws', 'newznab:www.usenet-crawler.com']
06-Jan-2017 14:20:30 - INFO    :: Thread-13 : allow_packs set to :None
06-Jan-2017 14:20:30 - INFO    :: Thread-13 : Shhh be very quiet...I'm looking for Avengers issue: 3 (2017) using nzbfinder.ws(newznab) [RSS]
06-Jan-2017 14:20:30 - INFO    :: Thread-13 : newznab
06-Jan-2017 14:20:30 - INFO    :: Thread-13 : rss:yes
06-Jan-2017 14:20:30 - INFO    :: Thread-13 : allow_packs:None
06-Jan-2017 14:20:30 - INFO    :: Thread-13 : Found Avengers (2017) issue: 3 using nzbfinder.ws(newznab) [RSS]
06-Jan-2017 14:20:30 - INFO    :: Thread-13 : prov  : nzbfinder.ws(newznab) [RSS][******************************8]
06-Jan-2017 14:20:30 - INFO    :: Thread-13 : [FAILED_DOWNLOAD_CHECKER] Successfully marked this download as Good for downloadable content
06-Jan-2017 14:20:30 - INFO    :: Thread-13 : Download URL: https://nzbfinder.ws/api?apikey=e******************************&t=get&id=7****************************************8 [VerifySSL:True]
06-Jan-2017 14:20:35 - INFO    :: Thread-13 : filen: _Avengers_003_.2017._.Digital._.Zone-Empire -- nzbname: Avengers.003.2017.Digital.Zone-Empire are not identical. Storing extra value as : .Avengers.003..2017...Digital...Zone-Empire
06-Jan-2017 14:20:36 - INFO    :: Thread-13 : Successfully sent nzb file to SABnzbd
06-Jan-2017 14:20:36 - INFO    :: Thread-13 : setting the nzbid for this download grabbed by nzbfinder.ws(newznab) in the nzblog to : ****************************************8
06-Jan-2017 14:20:36 - INFO    :: Thread-13 : setting the alternate nzbname for this download grabbed by nzbfinder.ws(newznab) in the nzblog to : .Avengers.003..2017...Digital...Zone-Empire
06-Jan-2017 14:20:36 - INFO    :: Thread-13 : passing to updater.
06-Jan-2017 14:20:36 - INFO    :: Thread-13 : [UPDATER] Updating status to snatched
06-Jan-2017 14:20:36 - INFO    :: Thread-13 : [UPDATER] Updated the status (Snatched) complete for Avengers Issue: 3
06-Jan-2017 14:21:32 - INFO    :: CP Server Thread-2 : ComicRN.py version: 1.01 -- autoProcessComics.py version: 1.0
06-Jan-2017 14:21:32 - INFO    :: CP Server Thread-2 : Starting postprocessing for : Avengers.003.2017.Digital.Zone-Empire
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [POST-PROCESSING] issuenzb found.
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [DUPECHECK] Duplicate check for L:\Complete\Avengers.003.2017.Digital.Zone-Empire
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [DUPECHECK] Duplication detection returned no hits. This is not a duplicate of anything that I have scanned in as of yet.
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [POST-PROCESSING] [1/1] Starting Post-Processing for Avengers issue: 3
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [POST-PROCESSING][DIRECTORY-CHECK] Found comic directory: H:\Marvel\Avengers (2016)
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [POST-PROCESSING][UPDATER] Setting status to Downloaded in history.
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [POST-PROCESSING][UPDATER] Updating Status (Downloaded) now complete for Avengers issue: 3
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [POST-PROCESSING][FILE-RESCAN] Now checking files for Avengers (2016) in H:\Marvel\Avengers (2016)
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : there are 5 files.
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [FILENAME]: Avengers v6 - 001.1.cbr
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : Series_Name: Avengers --- WatchComic: Avengers
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [FILENAME]: Avengers v6 - 001.cbr
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : Series_Name: Avengers --- WatchComic: Avengers
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [FILENAME]: Avengers v6 - 002.1.cbr
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : Series_Name: Avengers --- WatchComic: Avengers
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [FILENAME]: Avengers v6 - 002.cbr
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : Series_Name: Avengers --- WatchComic: Avengers
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [FILENAME]: Avengers v6 - 003.cbr
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : Series_Name: Avengers --- WatchComic: Avengers
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [POST-PROCESSING][FILE-RESCAN] Total files located: 5
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [POST-PROCESSING][FILE-RESCAN] I have physically found 5 issues, ignored 0 issues, snatched 0 issues, and accounted for 0 in an Archived state [ Total Issue Count: 5 / 5 ]
06-Jan-2017 14:21:32 - INFO    :: Post-Processing : [POST-PROCESSING] Post-Processing completed for: Avengers issue: 3
06-Jan-2017 14:22:39 - INFO    :: FOLDERMONITOR : delaying thread for 60 seconds to avoid locks.

Downloaded fine, and moved over to its correct folder.
evilhero
Site Admin
Posts: 2887
Joined: Sat Apr 20, 2013 3:43 pm
Contact:

Re: New issues not downloading automatically

Post by evilhero »

Ok, so it's polling against the rss fine - but those are issues that you've already grabbed as well. So it's a bit difficult atm to figure out whether the backlog automated search is stalling for some other reason (perhaps it's encountering an error it can't recover from).

There's no point in wiping out the rss db, as there seems to be nothing wrong with the entries themselves - a normal search does the rss query first, and according to the logs it's getting a match on that just fine.

How many issues do you have on your wanted list ?

I had thought that Mylar takes the most recent issues first when it does its automated search and works from there - but then it would be picking up all the required issues. If you don't have that many issues on your wanted list, you can try a Force Search from the Wanted tab and see if that picks up any issues, although it might not, since you probably already grabbed most of them today / yesterday at some point.
leaderdog
Posts: 377
Joined: Sun Apr 26, 2015 1:52 pm

Re: New issues not downloading automatically

Post by leaderdog »

Ouch, I have 959 items on the wanted list. Lots of unfindables. Maybe I should just delete a lot of those series.

Ya, I thought the upcoming issues were to be targeted first, then the rest of the wanted list.

This is a real head scratcher. I'm guessing it happened after one of the updates. It downloaded some new issues as of Jan 04 at 7:17pm. After that I've had to manually trigger the downloads. Mind you, I've only done that 10 times; I haven't grabbed any of the other new releases.

Should I reinstall Mylar? Ugh, I don't want to lose settings or locations etc.
Or would installing the latest Python version make any difference?

I just decided to go through the series and download the files manually. I noticed a couple hadn't updated yet even though the issues have been released. When I refresh a series, it used to automatically grab the missing file if it was available (if I recall correctly), but I still had to hit manual download to trigger it.
leaderdog
Posts: 377
Joined: Sun Apr 26, 2015 1:52 pm

Re: New issues not downloading automatically

Post by leaderdog »

Hey hey!! I think I figured it out.

When I was going through the series to add them, I noticed that Doctor Strange/The Punisher: Magic Bullet was trapped in an endless refresh cycle. I deleted that comic and then added a new series that I didn't have and that was available on the usenet groups. It downloaded the file automatically after Mylar added the series.

Hunted through the newsgroups and found some other series that I didn't have. Added those and they all downloaded on their own. So maybe it was just that one stuck series causing the chaos?

Will have to wait until next week to find out for sure I suppose.

Thanks for all your help!!
TheBigZig
Posts: 3
Joined: Sat Feb 18, 2017 1:21 pm

Re: New issues not downloading automatically

Post by TheBigZig »

I'm seeing the same refresh loop on Doctor Strange 2015. I've tried deleting the comic and re-adding it, but it keeps looping. Any help would be appreciated.