06:04:43 !a https://transfer.archivete.am/YaCtN/robots_files.txt
06:04:49 datechnoman: Deduplicating and queuing 987 items. (for 'https://transfer.archivete.am/YaCtN/robots_files.txt')
06:04:51 datechnoman: Deduplicated and queued 987 items. (for 'https://transfer.archivete.am/YaCtN/robots_files.txt')
06:52:01 !a https://transfer.archivete.am/148LAa/robots_files_batch2.txt
06:52:04 datechnoman: Skipped 1 invalid URLs: https://transfer.archivete.am/JCOg9/robots_files_batch2.txt.bad-urls.txt (for 'https://transfer.archivete.am/148LAa/robots_files_batch2.txt')
06:52:05 datechnoman: Deduplicating and queuing 9000 items. (for 'https://transfer.archivete.am/148LAa/robots_files_batch2.txt')
06:52:06 datechnoman: Deduplicated and queued 9000 items. (for 'https://transfer.archivete.am/148LAa/robots_files_batch2.txt')
08:40:51 !a https://transfer.archivete.am/dVeTh/robots_files_batch3.txt
08:40:53 datechnoman: Deduplicating and queuing 15211 items. (for 'https://transfer.archivete.am/dVeTh/robots_files_batch3.txt')
08:40:54 datechnoman: Deduplicated and queued 15211 items. (for 'https://transfer.archivete.am/dVeTh/robots_files_batch3.txt')
10:01:32 We are back to 0 queue :D arkiver
10:01:50 Todo**
10:54:28 !a https://transfer.archivete.am/7aUgF/robots_files_batch4.txt
10:54:37 datechnoman: Skipped 5 invalid URLs: https://transfer.archivete.am/OmP6M/robots_files_batch4.txt.bad-urls.txt (for 'https://transfer.archivete.am/7aUgF/robots_files_batch4.txt')
10:54:38 datechnoman: Deduplicating and queuing 199995 items. (for 'https://transfer.archivete.am/7aUgF/robots_files_batch4.txt')
10:54:46 datechnoman: Deduplicated and queued 199995 items. (for 'https://transfer.archivete.am/7aUgF/robots_files_batch4.txt')
16:44:02 url backfeed channel looks unhappy, seeing "Failed to submit discovered URLs.timeoutnil" across a few projects