
builtins.ValueError: Method 'f1' not found in: <amazon_spider 'amazon' at 0x7f82b45d2f60>

I have a decorator function:

def check_response(f):
    '''decorator for the parse functions. Checks if we're banned or not.'''
    def f1(self, response):
        ...
    return f1

And I use it for the parse callbacks:

    @check_response
    def parse_qa(self, response):
        ...
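
For reference, here is a standalone illustration (my own, not from the project) of what the decorator does to the method name. Because check_response returns the inner function unchanged, the decorated method's __name__ becomes the wrapper's name, which is the 'f1' that shows up in the error below:

def check_response(f):
    def f1(self, response):
        return f(self, response)
    return f1

class Demo:
    @check_response
    def parse_qa(self, response):
        ...

print(Demo.parse_qa.__name__)   # prints 'f1', not 'parse_qa'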

It works fine locally, but raises the following error on ScrapingHub:

Traceback (most recent call last):
  File "/app/python/lib/python3.6/site-packages/scrapy/commands/crawl.py", line 58, in run
    self.crawler_process.start()
  File "/app/python/lib/python3.6/site-packages/scrapy/crawler.py", line 285, in start
    reactor.run(installSignalHandlers=False)  # blocking call
  File "/app/python/lib/python3.6/site-packages/twisted/internet/base.py", line 1243, in run
    self.mainLoop()
  File "/app/python/lib/python3.6/site-packages/twisted/internet/base.py", line 1252, in mainLoop
    self.runUntilCurrent()
--- <exception caught here> ---
  File "/app/python/lib/python3.6/site-packages/twisted/internet/base.py", line 878, in runUntilCurrent
    call.func(*call.args, **call.kw)
  File "/app/python/lib/python3.6/site-packages/scrapy/utils/reactor.py", line 41, in __call__
    return self._func(*self._a, **self._kw)
  File "/app/python/lib/python3.6/site-packages/scrapy/core/engine.py", line 122, in _next_request
    if not self._next_request_from_scheduler(spider):
  File "/app/python/lib/python3.6/site-packages/scrapy/core/engine.py", line 149, in _next_request_from_scheduler
    request = slot.scheduler.next_request()
  File "/app/python/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 71, in next_request
    request = self._dqpop()
  File "/app/python/lib/python3.6/site-packages/scrapy/core/scheduler.py", line 108, in _dqpop
    return request_from_dict(d, self.spider)
  File "/app/python/lib/python3.6/site-packages/scrapy/utils/reqser.py", line 50, in request_from_dict
    cb = _get_method(spider, cb)
  File "/app/python/lib/python3.6/site-packages/scrapy/utils/reqser.py", line 87, in _get_method
    raise ValueError("Method %r not found in: %s" % (name, obj))
builtins.ValueError: Method 'f1' not found in: <amazon_spider 'amazon' at 0x7f82b45d2f60>
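
Reading the traceback: the request is popped from the disk queue (_dqpop), rebuilt by request_from_dict, and the callback is resolved on the spider by name in _get_method. A rough sketch of that name round-trip (my paraphrase of what the trace implies; the helper names below are mine, not Scrapy's):

# Hypothetical paraphrase of the (de)serialization step in the traceback:
# the callback is stored by its function __name__ and later resolved with
# getattr on the spider, so a wrapper named 'f1' cannot be found again.

def callback_to_name(callback):
    # serialization: bound method -> its function name ('f1' here)
    return callback.__func__.__name__

def name_to_callback(spider, name):
    # deserialization: name -> attribute looked up on the spider
    try:
        return getattr(spider, name)
    except AttributeError:
        raise ValueError("Method %r not found in: %s" % (name, spider))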


I've created a minimal example: returning a list from start_requests works, while yielding requests doesn't. In the callback function, neither approach works.

Again, it's only on ScrapyCloud; locally everything works.
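
If the wrapper name is what breaks the lookup, a minimal sketch of the decorator using functools.wraps (an assumption on my part, not something verified on ScrapyCloud) would keep the original method name so it can be resolved on the spider again:

import functools

def check_response(f):
    '''decorator for the parse functions. Checks if we're banned or not.'''
    @functools.wraps(f)  # copies f.__name__ ('parse_qa') onto the wrapper
    def f1(self, response):
        # ... ban check as before; wraps only changes the wrapper's
        # metadata (__name__, __doc__ etc.), not its behaviour
        return f(self, response)
    return f1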

 

(Attachment: minimal example, .py, 1.2 KB)

Hello Pablo.


What are you talking about? What projects? What support portal?
