2017-06-18 06:41:55 INFO Log opened. 2017-06-18 06:41:55 INFO [scrapy.log] Scrapy 1.4.0 started 2017-06-18 06:41:56 INFO [scrapy.utils.log] Scrapy 1.4.0 started (bot: fashionsites) 2017-06-18 06:41:56 INFO [scrapy.utils.log] Overridden settings: {'AUTOTHROTTLE_ENABLED': True, 'AUTOTHROTTLE_START_DELAY': 2.0, 'BOT_NAME': 'fashionsites', 'LOG_ENABLED': False, 'LOG_LEVEL': 'INFO', 'MEMUSAGE_LIMIT_MB': 950, 'NEWSPIDER_MODULE': 'fashionsites.spiders', 'SPIDER_MODULES': ['fashionsites.spiders'], 'STATS_CLASS': 'sh_scrapy.stats.HubStorageStatsCollector', 'TELNETCONSOLE_HOST': '0.0.0.0', 'USER_AGENT': 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36'} 2017-06-18 06:41:56 INFO [scrapy.middleware] Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.memusage.MemoryUsage', 'scrapy.extensions.logstats.LogStats', 'scrapy.extensions.spiderstate.SpiderState', 'scrapy.extensions.throttle.AutoThrottle', 'scrapy.extensions.debug.StackTraceDump', 'sh_scrapy.extension.HubstorageExtension'] 2017-06-18 06:41:56 INFO [scrapy.middleware] Enabled downloader middlewares: ['sh_scrapy.diskquota.DiskQuotaDownloaderMiddleware', 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats', 'sh_scrapy.middlewares.HubstorageDownloaderMiddleware'] 2017-06-18 06:41:56 INFO [scrapy.middleware] Enabled spider middlewares: ['sh_scrapy.diskquota.DiskQuotaSpiderMiddleware', 'sh_scrapy.middlewares.HubstorageSpiderMiddleware', 'scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware'] 2017-06-18 06:41:56 INFO [scrapy.middleware] Enabled item pipelines: [] 2017-06-18 06:41:56 INFO [scrapy.core.engine] Spider opened 2017-06-18 06:41:56 INFO [scrapy.extensions.logstats] Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min) 2017-06-18 06:41:56 INFO TelnetConsole starting on 6023 2017-06-18 06:42:50 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:42:50 ERROR Unhandled error in Deferred: 2017-06-18 06:42:50 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:42:50 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:42:50 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:42:56 INFO [scrapy.extensions.logstats] Crawled 53 pages (at 53 pages/min), scraped 0 items (at 0 items/min) 2017-06-18 06:43:00 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:00 ERROR Unhandled error in Deferred: 2017-06-18 06:43:00 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:00 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:00 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:02 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:02 ERROR Unhandled error in Deferred: 2017-06-18 06:43:02 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:02 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:02 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:03 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:03 ERROR Unhandled error in Deferred: 2017-06-18 06:43:03 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:03 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:03 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:04 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:04 ERROR Unhandled error in Deferred: 2017-06-18 06:43:04 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:04 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:04 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:05 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:05 ERROR Unhandled error in Deferred: 2017-06-18 06:43:05 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:05 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:05 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:06 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:06 ERROR Unhandled error in Deferred: 2017-06-18 06:43:06 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:06 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:06 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:06 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:06 ERROR Unhandled error in Deferred: 2017-06-18 06:43:06 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:06 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:06 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:06 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:06 ERROR Unhandled error in Deferred: 2017-06-18 06:43:06 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:06 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:06 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:08 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:08 ERROR Unhandled error in Deferred: 2017-06-18 06:43:08 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:08 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:08 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:13 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:13 ERROR Unhandled error in Deferred: 2017-06-18 06:43:13 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:13 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:13 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:14 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:14 ERROR Unhandled error in Deferred: 2017-06-18 06:43:14 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:14 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:15 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:18 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:18 ERROR Unhandled error in Deferred: 2017-06-18 06:43:18 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:18 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:18 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:19 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:19 ERROR Unhandled error in Deferred: 2017-06-18 06:43:19 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:19 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:19 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:20 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:20 ERROR Unhandled error in Deferred: 2017-06-18 06:43:20 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:20 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:20 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:21 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:21 ERROR Unhandled error in Deferred: 2017-06-18 06:43:21 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:21 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:21 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:22 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:22 ERROR Unhandled error in Deferred: 2017-06-18 06:43:22 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:22 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:22 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:24 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:24 ERROR Unhandled error in Deferred: 2017-06-18 06:43:24 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:24 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:24 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:27 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:27 ERROR Unhandled error in Deferred: 2017-06-18 06:43:27 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:27 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:27 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:29 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:29 ERROR Unhandled error in Deferred: 2017-06-18 06:43:29 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:29 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:29 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:29 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:29 ERROR Unhandled error in Deferred: 2017-06-18 06:43:29 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:29 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:29 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:31 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:31 ERROR Unhandled error in Deferred: 2017-06-18 06:43:31 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:31 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:31 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:31 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:31 ERROR Unhandled error in Deferred: 2017-06-18 06:43:31 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:31 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:31 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:35 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:35 ERROR Unhandled error in Deferred: 2017-06-18 06:43:35 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:35 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:35 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:36 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:36 ERROR Unhandled error in Deferred: 2017-06-18 06:43:36 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:36 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:36 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:40 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:40 ERROR Unhandled error in Deferred: 2017-06-18 06:43:40 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:40 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:40 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:42 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:42 ERROR Unhandled error in Deferred: 2017-06-18 06:43:42 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:42 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:42 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:45 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:45 ERROR Unhandled error in Deferred: 2017-06-18 06:43:45 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:45 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:45 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:47 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:47 ERROR Unhandled error in Deferred: 2017-06-18 06:43:47 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:47 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:47 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:48 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:48 ERROR Unhandled error in Deferred: 2017-06-18 06:43:48 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:48 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:48 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:48 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:48 ERROR Unhandled error in Deferred: 2017-06-18 06:43:48 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:48 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:48 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:49 INFO [scrapy.crawler] Received SIGTERM, shutting down gracefully. Send again to force 2017-06-18 06:43:49 INFO [scrapy.core.engine] Closing spider (shutdown) 2017-06-18 06:43:49 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:49 ERROR Unhandled error in Deferred: 2017-06-18 06:43:49 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:49 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:49 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:51 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:51 ERROR Unhandled error in Deferred: 2017-06-18 06:43:51 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:51 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:51 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:52 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:52 ERROR Unhandled error in Deferred: 2017-06-18 06:43:52 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:52 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:52 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:53 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:53 ERROR Unhandled error in Deferred: 2017-06-18 06:43:53 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:53 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:53 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:55 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:55 ERROR Unhandled error in Deferred: 2017-06-18 06:43:55 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:55 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:55 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:55 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:55 ERROR Unhandled error in Deferred: 2017-06-18 06:43:55 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:55 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:55 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:56 INFO [scrapy.extensions.logstats] Crawled 96 pages (at 43 pages/min), scraped 0 items (at 0 items/min) 2017-06-18 06:43:58 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:43:58 ERROR Unhandled error in Deferred: 2017-06-18 06:43:58 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:43:58 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:43:58 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:44:00 ERROR [scrapy.core.scraper] Spider error processing (referer: None) Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 102, in iter_errback yield next(it) GeneratorExit 2017-06-18 06:44:00 ERROR Unhandled error in Deferred: 2017-06-18 06:44:00 CRITICAL [twisted] Unhandled error in Deferred: 2017-06-18 06:44:00 ERROR Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1199, in run self.mainLoop() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 1208, in mainLoop self.runUntilCurrent() File "/usr/local/lib/python3.6/site-packages/twisted/internet/base.py", line 828, in runUntilCurrent call.func(*call.args, **call.kw) File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 671, in _tick taskObj._oneWorkUnit() --- --- File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) builtins.TypeError: can't pickle HtmlElement objects 2017-06-18 06:44:00 CRITICAL [twisted] Traceback (most recent call last): File "/usr/local/lib/python3.6/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit result = next(self._iterator) File "/usr/local/lib/python3.6/site-packages/scrapy/utils/defer.py", line 63, in work = (callable(elem, *args, **named) for elem in iterable) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output self.crawler.engine.crawl(request=output, spider=spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 210, in crawl self.schedule(request, spider) File "/usr/local/lib/python3.6/site-packages/scrapy/core/engine.py", line 216, in schedule if not self.slot.scheduler.enqueue_request(request): File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 57, in enqueue_request dqok = self._dqpush(request) File "/usr/local/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 86, in _dqpush self.dqs.push(reqd, -request.priority) File "/usr/local/lib/python3.6/site-packages/queuelib/pqueue.py", line 35, in push q.push(obj) # this may fail (eg. serialization error) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 15, in push s = serialize(obj) File "/usr/local/lib/python3.6/site-packages/scrapy/squeues.py", line 27, in _pickle_serialize return pickle.dumps(obj, protocol=2) TypeError: can't pickle HtmlElement objects 2017-06-18 06:44:02 INFO [scrapy.statscollectors] Dumping Scrapy stats: {'downloader/request_bytes': 257099, 'downloader/request_count': 164, 'downloader/request_method_count/GET': 164, 'downloader/response_bytes': 1752550, 'downloader/response_count': 164, 'downloader/response_status_count/200': 98, 'downloader/response_status_count/301': 65, 'downloader/response_status_count/302': 1, 'dupefilter/filtered': 65, 'finish_reason': 'shutdown', 'finish_time': datetime.datetime(2017, 6, 18, 6, 44, 2, 254), 'log_count/CRITICAL': 78, 'log_count/ERROR': 39, 'log_count/INFO': 10, 'memusage/max': 81223680, 'memusage/startup': 53161984, 'request_depth_max': 7, 'response_received_count': 98, 'scheduler/dequeued': 164, 'scheduler/dequeued/disk': 164, 'scheduler/enqueued': 1064, 'scheduler/enqueued/disk': 1064, 'spider_exceptions/GeneratorExit': 39, 'start_time': datetime.datetime(2017, 6, 18, 6, 41, 56, 217224)} 2017-06-18 06:44:02 INFO [scrapy.core.engine] Spider closed (shutdown) 2017-06-18 06:44:02 INFO (TCP Port 6023 Closed) 2017-06-18 06:44:02 INFO Main loop terminated.