twisted.internet.defer._DefGen_Return

Posted over 5 years ago by Artem M

Answered

I'm getting this strange error message.

I assume it has something to do with the payload, though everything works fine locally on my Windows machine.

[scrapy.core.scraper] Spider error processing <POST http://oris.co.palm-beach.fl.us/or_web1/new_sch.asp> (referer: http://oris.co.palm-beach.fl.us/or_web1/)

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python3.6/site-packages/scrapy/core/downloader/middleware.py", line 43, in process_request
    defer.returnValue((yield download_func(request=request,spider=spider)))
  File "/usr/local/lib/python3.6/site-packages/twisted/internet/defer.py", line 1276, in returnValue
    raise _DefGen_Return(val)
twisted.internet.defer._DefGen_Return: <200 http://oris.co.palm-beach.fl.us/or_web1/new_sch.asp>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/scrapy/core/spidermw.py", line 42, in process_spider_input
    result = method(response=response, spider=spider)
  File "/usr/local/lib/python3.6/site-packages/scrapy_pagestorage.py", line 68, in process_spider_input
    self.save_response(response, spider)
  File "/usr/local/lib/python3.6/site-packages/scrapy_pagestorage.py", line 102, in save_response
    self._writer.write(payload)
  File "/usr/local/lib/python3.6/site-packages/scrapinghub/hubstorage/batchuploader.py", line 224, in write
    data = jsonencode(item)
  File "/usr/local/lib/python3.6/site-packages/scrapinghub/hubstorage/serialization.py", line 38, in jsonencode
    return dumps(o, default=jsondefault)
  File "/usr/local/lib/python3.6/json/__init__.py", line 238, in dumps
    **kw).encode(obj)
  File "/usr/local/lib/python3.6/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/usr/local/lib/python3.6/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
TypeError: keys must be a string
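
(For context: the final TypeError comes from Python's json encoder, which only accepts str, int, float, bool or None as dict keys. A minimal sketch that reproduces the same error on Python 3.6; the payload here is made up, not the one scrapy-pagestorage actually builds:)

import json

# Bytes keys -- e.g. a raw response dict serialized without MessagePack --
# cannot be JSON-encoded.
payload = {b'body': b'<html>...</html>'}  # hypothetical payload

try:
    json.dumps(payload)
except TypeError as exc:
    print(exc)  # Python 3.6 prints "keys must be a string"; later versions word it differently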


0 Votes

Artem M posted over 5 years ago Best Answer

OK, the issue was this: MessagePack is not available, and page storage was enabled for this project.

I have disabled page storage and it works fine now.

I wish the error messages were more readable.
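
For anyone hitting the same error, a minimal settings.py sketch of that workaround, assuming page storage was enabled the usual scrapy-pagestorage way (setting and middleware names are from that package's docs; adjust to your project):

# settings.py
PAGE_STORAGE_ENABLED = False  # assumption: this was True before

SPIDER_MIDDLEWARES = {
    # 'scrapy_pagestorage.PageStorageMiddleware': 50,  # commented out to disable
}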

0 Votes


10 Comments

aurish_hammad_hafeez posted over 5 years ago Admin

Great to see that you got it working and thanks for sharing the solution.

0 Votes

Artem M posted over 5 years ago

Nope. It had nothing to do with __init__.

0 Votes

Artem M posted over 5 years ago

@aurish_hammad_hafeez

My local Scrapy version is 1.5.1, so I updated the stack to 1.5. I'm getting the same result.

I do not have an __init__ method in my spider and pass the input variables directly. Could this be a problem in the cloud?
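
(For reference: Scrapy's default __init__ turns -a command-line arguments into spider attributes automatically, so a spider without its own __init__ is fine. A minimal sketch, with an illustrative spider name:)

import scrapy

class ExampleSpider(scrapy.Spider):
    name = 'example'
    start_urls = ['http://oris.co.palm-beach.fl.us/or_web1/']

    def parse(self, response):
        # Started with: scrapy crawl example -a FromDate=... -a ToDate=...
        # Scrapy sets self.FromDate and self.ToDate itself; no __init__ needed.
        self.logger.info('Date range: %s to %s', self.FromDate, self.ToDate)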

0 Votes

aurish_hammad_hafeez posted over 5 years ago Admin

I checked your project and it is set to

scrapy:1.3-py3

I think the difference in versions is causing the problem. Kindly check the version used locally and then set the stack accordingly, following https://support.scrapinghub.com/support/solutions/articles/22000200402-changing-the-deploy-environment-with-scrapy-cloud-stacks
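
For example, to match a local Scrapy 1.5.1, the scrapinghub.yml stack entry would look roughly like this (project ID is hypothetical; see the linked article for the exact format):

# scrapinghub.yml
projects:
  default: 12345  # hypothetical project ID
stacks:
  default: scrapy:1.5-py3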


0 Votes

aurish_hammad_hafeez posted over 5 years ago Admin

Sorry for the delay. I don't see any problem with the code. Since it's working correctly on your local machine, can you share which Python and Scrapy versions you are using locally?

0 Votes

grajagopalan posted over 5 years ago Admin

Sorry about the delay in response. We are not able to provide a whole lot of assistance with coding. We will get back to you this week if we have a solution for you. 

0 Votes

Artem M posted over 5 years ago

@aurish_hammad_hafeez

The From and To dates are passed as arguments when starting the spider.

# Imports needed by this snippet (at module level):
from urllib.parse import urlencode

import scrapy


def parse(self, response):
    # inspect_response(response, self)
    url = 'http://oris.co.palm-beach.fl.us/or_web1/new_sch.asp'
    headers = {
        'upgrade-insecure-requests': "1",
        'user-agent': "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36",
        'origin': "http://oris.co.palm-beach.fl.us",
        'content-type': "application/x-www-form-urlencoded",
        'dnt': "1",
        'accept': "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3",
        'cache-control': "no-cache",
    }
    # Date range should be within 90 days
    data = {
        'FromDate': self.FromDate,
        'PageSize': '500',
        'RecSetSize': '500',
        'ToDate': self.ToDate,
        'consideration': '',
        'search_by': 'DocType',
        'search_entry': 'LP',
    }
    body = urlencode(data)
    yield scrapy.Request(url, method="POST", headers=headers, body=body, callback=self.parsed)
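
As a side note, the manual urlencode could be dropped: scrapy.FormRequest encodes formdata and sets the content-type header itself. The last two lines above could equivalently be (a sketch, not from the original post):

yield scrapy.FormRequest(url, formdata=data, headers=headers, callback=self.parsed)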

0 Votes

aurish_hammad_hafeez posted over 5 years ago Admin

Can you share some more detail regarding it, e.g. a code snippet?

1 Vote
