
Callback is not working.

# -*- coding: utf-8 -*-
import scrapy


class QuotesSpider(scrapy.Spider):
    name = 'quotes'
    allowed_domains = ['toscrape.com']
    start_urls = ['http://quotes.toscrape.com/']

    def parse(self, response):
        self.log("I just visited " + response.url)
        for quotes in response.css('div.quote'):
            item = {
                'authorname': quotes.css('small.author::text').extract_first(),
                'text': quotes.css('span.text::text').extract_first(),
                'tags': quotes.css('a.tag::text').extract(),
            }
            yield item

        # now the paging-forward link starts
        next_page_url = response.css('li.next > a::attr(href)').extract_first()
        if next_page_url:
            next_page_url = response.urljoin(next_page_url)
            yield scrapy.Request(url=next_page_url, callback=self.parse)

The code above shows no error, but the callback in the yield is not being repeated. I have checked that next_page_url is working fine and gives the correct URL for the next page, yet after scraping a single page the spider stops scraping.
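One common reason the callback never fires again is Scrapy's OffsiteMiddleware: if the host of the absolute next-page URL is not covered by `allowed_domains`, the request is silently dropped (a "Filtered offsite request" line appears only at DEBUG log level). The check can be sketched roughly as follows; this is a simplified approximation using only the standard library, not Scrapy's actual implementation, and `quotes.toscrape.com` is assumed as the target site:

```python
from urllib.parse import urlparse

def is_offsite(url, allowed_domains):
    """Rough sketch of the allowed_domains check: a request is kept only
    if its host equals an allowed domain or is a subdomain of one."""
    host = urlparse(url).netloc.lower()
    return not any(
        host == d or host.endswith('.' + d)
        for d in allowed_domains if d
    )

# With a matching entry the request survives and parse() runs again:
print(is_offsite('http://quotes.toscrape.com/page/2/', ['toscrape.com']))  # False: kept
# With a mismatched entry it is dropped, so the callback never repeats:
print(is_offsite('http://quotes.toscrape.com/page/2/', ['example.com']))   # True: filtered
```

So if the spider stops after one page even though `next_page_url` is correct, it is worth checking that `allowed_domains` actually matches the domain of the joined URL, or temporarily removing `allowed_domains` to rule this out.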