
Run Fail ImportError: cannot import name log

I saw a similar post to this one and for the life of me cannot figure out the issue. I set the yml file as per the shub documentation, and based on the other post I imported logging, which did not help (and I'm not sure why that would be needed). I set the Scrapy stack to match my local setup, which uses Python 3.6 and Scrapy 1.7, but the error shows Python 2.7. I am using the free version of Scrapy Cloud; I'm not sure if that has something to do with it.


Please see my yml file settings below.


yml file

projects:
    default: 403670
requirements:
    file: requirements.txt
stacks:
    scrapy: 1.7-py3


This is the error:

  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 165, in _launch
    from sh_scrapy.log import initialize_logging
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/log.py", line 7, in <module>
    from scrapy import log, __version__
ImportError: cannot import name log



Best Answer

Your project is not being deployed with 1.7 because it should be "stack", not "stacks", in your yml file.







I see, and so sorry, you are right: as per the shub documentation it is "stack". However, when I make the change and try to use Scrapy 1.6, I get the error below. I have changed my yml and requirements.txt files to use Scrapy 1.6, but no luck. I have looked over the documentation and don't see what I am doing wrong.


Error message after running shub deploy 403670

Error: Deploy failed (400):
stack: Invalid value


My yml file

projects:
    default: 403670
requirements:
    file: requirements.txt
stack:
    scrapy: 1.7-py3



Solved the issue! There was a space in "scrapy: 1.7". I changed the yml file to the below and it worked!


In case anyone else has the same issue: remove the space, and make sure the Scrapy version in your requirements.txt file matches the version in your yml file.


 

projects:
    default: 403670
requirements:
    file: requirements.txt
stack:
    scrapy:1.6-py3
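For anyone hitting the same 400 error, the shub documentation also shows "stack" as a single inline string, which avoids the space issue entirely. A sketch (the project id is taken from this thread; `Scrapy==1.6.0` is just an example pin):

```yaml
# scrapinghub.yml -- "stack" is one string; note there is no space in "scrapy:1.6-py3"
projects:
    default: 403670
requirements:
    file: requirements.txt
stack: scrapy:1.6-py3
```

Then, per the advice above, pin a matching release in requirements.txt, e.g. Scrapy==1.6.0.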

Nice, I'm glad it is working now.

Btw, the stack scrapy:1.7-py3 is now available too.

Please, #coffeeincodeout, it doesn't work for me. I tried, but nothing works. Can you give me your yml file?

akouayao2017@gmx.com

Thanks

Hey, sorry man, try posting your issue on the forum and I will do my best to help you.

My yml file:

projects:
    default: 404158
requirements:
    file: requirements.txt
stack:
    scrapy:1.7-py3

My requirements.txt:

scrapy==1.7.3
firebase_admin==2.17.0

And I got this error:

  File "/app/python/lib/python2.7/site-packages/firebase_admin/credentials.py", line 83, in __init__
    with open(cert) as json_file:
IOError: [Errno 2] No such file or directory: '../Linfodrome/lalo_firebase.json'
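One likely cause of that IOError: the relative path '../Linfodrome/lalo_firebase.json' is resolved against the process's current working directory, which on Scrapy Cloud is not your project source tree. If the json file is deployed alongside the code, one workaround is to build the path from the module's own location; a minimal sketch (the file name comes from the traceback, and placing it next to the module is an assumption):

```python
import os

# Resolve the credentials file relative to this module's directory,
# not the process's current working directory (which differs on Scrapy Cloud).
HERE = os.path.dirname(os.path.abspath(__file__))
CRED_PATH = os.path.join(HERE, "lalo_firebase.json")
```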



Thanks, I found a solution for my issue (the Firebase json file) and I'm sharing it now:

import firebase_admin
from firebase_admin import credentials

# Inline the contents of your firebase.json service account file
CREDENTIALS = {
    "type": "service_account",
    "project_id": "xxx",
    "private_key_id": "your private_key_id",
    # ...all the other items from your firebase.json
}

cred = credentials.Certificate(CREDENTIALS)
firebase_admin.initialize_app(cred)


So it was the json file settings for firebase causing the issue. Thank you for sharing.


