Hello Team,
We are using custom Python scripts for web scraping in our project, and they are currently running successfully in production. We would like to migrate these scripts to Scrapy Cloud and use it as the permanent platform for our projects. We have purchased one Scrapy Cloud unit to test its features, and in the future we plan to purchase additional units, and possibly Crawlera, depending on our requirements.

Right now we would like to test a couple of our scripts on the Scrapy Cloud platform, and we are planning to deploy them using the Docker support that Scrapy Cloud provides. From the documentation, we understand that only Scrapy spiders can be deployed using Docker. Is that correct, or can we also deploy our custom scripts to Scrapy Cloud using Docker? If this is possible, could you assist us with deploying one of our custom scripts to Scrapy Cloud using Docker?
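For context, the kind of setup we are experimenting with is sketched below. This is only our guess at how a custom-image deployment should look, based on our reading of the shub documentation (as we understand it, the image needs to expose a start-crawl entrypoint and scrapinghub.yml needs to mark the project as a custom image). The file names, script name and project ID are all placeholders from our side, not taken from your docs.

    # Dockerfile (our sketch; python:3.9-slim is just an example base image)
    FROM python:3.9-slim
    WORKDIR /app
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt
    # our_scraper.py is a placeholder name for one of our existing scripts
    COPY our_scraper.py .
    # As we understand the custom-image contract, Scrapy Cloud invokes a
    # start-crawl executable for each job, so we wrap our script in one
    COPY start-crawl /usr/local/bin/start-crawl
    RUN chmod +x /usr/local/bin/start-crawl

    # start-crawl (placeholder wrapper that simply runs our script)
    #!/bin/sh
    exec python /app/our_scraper.py

    # scrapinghub.yml (sketch; 123456 is a placeholder project ID)
    projects:
      default: 123456
    image: true

If this is roughly the right direction, we assume we would then run something like "shub image upload" to build, push and deploy the image. We are not sure whether the contract also requires a shub-image-info command for non-Scrapy scripts, so any guidance or a working example would be very helpful.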