Scrapy FormRequest not working when I use Crawlera, but works fine otherwise

Posted about 7 years ago by howard1

Answered

Hi everyone. I just started using Crawlera.

I have a scraper written in Python with Scrapy: it goes to a URL, logs in, fills in search forms, and scrapes data. It works fine without Crawlera, but when I enable the Crawlera middleware it can't log in to the website.

Please help me with this.

0 Votes

nestor

nestor posted almost 7 years ago Admin Best Answer

You'll probably need to make use of Crawlera Sessions if you need to retain the same IP after logging in; by default Crawlera routes every request through a different IP, so the login cookie is probably lost. For more on Crawlera Sessions, please see: https://doc.scrapinghub.com/crawlera.html#sessions.


Also, if you wish to handle cookies on your side, you can use the "X-Crawlera-Cookies: disable" header (https://doc.scrapinghub.com/crawlera.html#x-crawlera-cookies).

0 Votes


