Can't deploy to Scrapy Cloud, problem with custom modules
Posted over 6 years ago by monkeybot
I am attempting to deploy a spider to Scrapy Cloud, but I am repeatedly running into requirements problems. I am using Python 3.5.

My `scrapinghub.yml` file contains these lines:

```yaml
projects:
  default: 358310
stacks:
  default: scrapy:1.3-py3
requirements:
  file: requirements.txt
```

My `requirements.txt` file contains these lines:

```text
geth_doc_miner==1.0
geth_feature_detector==1.0
geth_indexer==1.0
geth_synset==1.0
indexer==1.0
comparse==1.0
file_io==1.0
```

This is the error I keep getting:
Where am I going wrong? BTW, I have the latest version of pip installed (contrary to what the error message states).
2 Comments
monkeybot posted over 6 years ago
Those are all custom Python scripts containing methods and functions that the spider calls. For instance, if the spider comes across a PDF file, it'll call a PDF parser from a separate script to parse the file.
I read the help files about deploying custom Python scripts to Scrapinghub, but they do not explain how to call one custom script from another. For instance, suppose I had a custom Python script called `module_B` that called a method from `module_A` like this:
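(The original snippet did not survive in the post; below is a minimal sketch of the idea it describes, with `parse_pdf` as a hypothetical method name:)

```python
# module_A.py -- hypothetical helper containing the shared method
def parse_pdf(path):
    """Extract text from a PDF file (implementation omitted)."""
    raise NotImplementedError


# module_B.py -- another custom script that calls into module_A
from module_A import parse_pdf

def handle_file(path):
    # Hand PDF files off to the helper defined in module_A
    if path.endswith(".pdf"):
        return parse_pdf(path)
```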
Do I simply import modules like the above, or is there anything else I have to do?
nestor (Admin) posted over 6 years ago
I couldn't find any of those packages on PyPI; that's why pip fails trying to install them in your Scrapy Cloud project.
Note: I've trimmed your log to the relevant part, please use a code block next time.
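If those are your own modules rather than published packages, requirements.txt is the wrong place for them; it should only list packages pip can actually download. A sketch of one way to ship them, assuming they are single .py files sitting next to your project's setup.py (shub deploy builds an egg from setup.py, so anything declared there gets uploaded):

```python
# setup.py -- shub deploy builds an egg from this file, so any
# modules/packages declared here are uploaded with the deploy
from setuptools import setup, find_packages

setup(
    name="project",
    version="1.0",
    packages=find_packages(),  # your Scrapy project package
    py_modules=[
        # custom single-file modules, assumed to sit next to setup.py
        "geth_doc_miner",
        "geth_feature_detector",
        "geth_indexer",
        "geth_synset",
        "indexer",
        "comparse",
        "file_io",
    ],
    # adjust "project" to your Scrapy project's package name
    entry_points={"scrapy": ["settings = project.settings"]},
)
```

With something like that in place, drop those seven lines from requirements.txt and keep only real PyPI dependencies there.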