Posted over 4 years ago by Flaschenfinder.de
Hello,
I am trying to deploy my spiders to Scrapinghub via GitHub, but the deploy fails every time with the following error log:
Fetching changes
remote: Enumerating objects: 3, done.
remote: Counting objects: 100% (3/3), done.
remote: Compressing objects: 100% (1/1), done.
remote: Total 2 (delta 1), reused 2 (delta 1), pack-reused 0
From https://github.com/CIC3RO/Crawler
   f733910..47b6ece  master -> origin/master
Getting data for a given refname
Checking project tarball
scrapinghub.yml is not found, assume it's a Scrapy project
Setup build step for Scrapy project
setup.py is not found, creating it from template
Using default location for requirements.txt
Login succeeded
Building an image:
Step 1/12 : FROM scrapinghub/scrapinghub-stack-scrapy:2.0
# Executing 5 build triggers
 ---> Using cache
 ---> Using cache
 ---> Using cache
 ---> Using cache
 ---> Using cache
 ---> 18c33c8bcd01
Step 2/12 : ENV PYTHONUSERBASE=/app/python
 ---> Using cache
 ---> 65ee0a1ec403
Step 3/12 : ADD eggbased-entrypoint shub-build-egg shub-list-scripts /usr/local/sbin/
 ---> Using cache
 ---> e224735cc599
Step 4/12 : ADD run-pipcheck /usr/local/bin/
 ---> Using cache
 ---> 938c90ad3eb5
Step 5/12 : RUN chmod +x /usr/local/bin/run-pipcheck /usr/local/sbin/shub-build-egg /usr/local/sbin/shub-list-scripts /usr/local/sbin/eggbased-entrypoint
 ---> Using cache
 ---> 452ce24d37dc
Step 6/12 : RUN ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/start-crawl && ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/scrapy-list && ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/shub-image-info && ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/run-pipcheck
 ---> Using cache
 ---> 1b43854be3d2
Step 7/12 : ADD requirements.txt /app/requirements.txt
 ---> Using cache
 ---> 3fefed40659b
Step 8/12 : RUN mkdir $PYTHONUSERBASE && chown nobody:nogroup $PYTHONUSERBASE
 ---> Using cache
 ---> 3f5bc5b00ed5
Step 9/12 : RUN sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE -E PIP_NO_CACHE_DIR=0 pip install --user --no-cache-dir -r /app/requirements.txt
 ---> Using cache
 ---> 15d4368fb51b
Step 10/12 : ADD project /tmp/project
 ---> b371111b998b
Step 11/12 : RUN shub-build-egg /tmp/project
 ---> [Warning] Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
 ---> Running in 869bb14176a8
running bdist_egg
running egg_info
creating project.egg-info
writing project.egg-info/PKG-INFO
writing dependency_links to project.egg-info/dependency_links.txt
writing entry points to project.egg-info/entry_points.txt
writing top-level names to project.egg-info/top_level.txt
writing manifest file 'project.egg-info/SOURCES.txt'
reading manifest file 'project.egg-info/SOURCES.txt'
writing manifest file 'project.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build
creating build/lib
creating build/lib/Weinbrand_Conag
copying Weinbrand_Conag/middlewares.py -> build/lib/Weinbrand_Conag
copying Weinbrand_Conag/items.py -> build/lib/Weinbrand_Conag
copying Weinbrand_Conag/pipelines.py -> build/lib/Weinbrand_Conag
copying Weinbrand_Conag/__init__.py -> build/lib/Weinbrand_Conag
copying Weinbrand_Conag/settings.py -> build/lib/Weinbrand_Conag
creating build/lib/Weinbrand_Conag/spiders
copying Weinbrand_Conag/spiders/Bolou.py -> build/lib/Weinbrand_Conag/spiders
copying Weinbrand_Conag/spiders/Bargross.py -> build/lib/Weinbrand_Conag/spiders
copying Weinbrand_Conag/spiders/__init__.py -> build/lib/Weinbrand_Conag/spiders
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/Weinbrand_Conag
copying build/lib/Weinbrand_Conag/middlewares.py -> build/bdist.linux-x86_64/egg/Weinbrand_Conag
creating build/bdist.linux-x86_64/egg/Weinbrand_Conag/spiders
copying build/lib/Weinbrand_Conag/spiders/Bolou.py -> build/bdist.linux-x86_64/egg/Weinbrand_Conag/spiders
copying build/lib/Weinbrand_Conag/spiders/Bargross.py -> build/bdist.linux-x86_64/egg/Weinbrand_Conag/spiders
copying build/lib/Weinbrand_Conag/spiders/__init__.py -> build/bdist.linux-x86_64/egg/Weinbrand_Conag/spiders
copying build/lib/Weinbrand_Conag/items.py -> build/bdist.linux-x86_64/egg/Weinbrand_Conag
copying build/lib/Weinbrand_Conag/pipelines.py -> build/bdist.linux-x86_64/egg/Weinbrand_Conag
copying build/lib/Weinbrand_Conag/__init__.py -> build/bdist.linux-x86_64/egg/Weinbrand_Conag
copying build/lib/Weinbrand_Conag/settings.py -> build/bdist.linux-x86_64/egg/Weinbrand_Conag
byte-compiling build/bdist.linux-x86_64/egg/Weinbrand_Conag/middlewares.py to middlewares.cpython-38.pyc
byte-compiling build/bdist.linux-x86_64/egg/Weinbrand_Conag/spiders/Bolou.py to Bolou.cpython-38.pyc
byte-compiling build/bdist.linux-x86_64/egg/Weinbrand_Conag/spiders/Bargross.py to Bargross.cpython-38.pyc
byte-compiling build/bdist.linux-x86_64/egg/Weinbrand_Conag/spiders/__init__.py to __init__.cpython-38.pyc
byte-compiling build/bdist.linux-x86_64/egg/Weinbrand_Conag/items.py to items.cpython-38.pyc
byte-compiling build/bdist.linux-x86_64/egg/Weinbrand_Conag/pipelines.py to pipelines.cpython-38.pyc
byte-compiling build/bdist.linux-x86_64/egg/Weinbrand_Conag/__init__.py to __init__.cpython-38.pyc
byte-compiling build/bdist.linux-x86_64/egg/Weinbrand_Conag/settings.py to settings.cpython-38.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying project.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying project.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying project.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying project.egg-info/entry_points.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying project.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
zip_safe flag not set; analyzing archive contents...
Weinbrand_Conag.__pycache__.settings.cpython-38: module references __file__
creating '/tmp/scrapinghub/project-1.0-py3.8.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
Removing intermediate container 869bb14176a8
 ---> 5bd06aada1b4
Step 12/12 : ENV PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
 ---> [Warning] Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
 ---> Running in 655834d6180a
Removing intermediate container 655834d6180a
 ---> b6d804d3fd5a
Successfully built b6d804d3fd5a
Successfully tagged i.scrapinghub.com/kumo_project/479387:8
Step 1/3 : FROM alpine:3.5
 ---> f80194ae2e0c
Step 2/3 : ADD kumo-entrypoint /kumo-entrypoint
 ---> Using cache
 ---> 95964d5dc19c
Step 3/3 : RUN chmod +x /kumo-entrypoint
 ---> Using cache
 ---> d2b23632bddf
Successfully built d2b23632bddf
Successfully tagged kumo-entrypoint:latest
Entrypoint container is created successfully
>>> Checking python dependencies
Collecting pip<20.0,>=9.0.3
  Downloading pip-19.3.1-py2.py3-none-any.whl (1.4 MB)
Installing collected packages: pip
Successfully installed pip-19.3.1
No broken requirements found.
WARNING: There're some errors when doing pip-check:
WARNING: The scripts pip, pip3 and pip3.8 are installed in '/app/python/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
>>> Getting spiders list:
>>> Trying to get spiders from shub-image-info command
>> Found 2 spider(s): ["www.bargross.de", "www.bolou.de"]
>>> Getting scripts list: {"scripts": []}
The push refers to repository [i.scrapinghub.com/kumo_project/479387]
d8550d7e0a41: Pushed
a18a1560ecf6: Pushed
b0a66cb0d58d: Layer already exists
8806a3e19873: Layer already exists
975662b23c84: Layer already exists
09c31845dc9f: Layer already exists
b87c4b096670: Layer already exists
dc5e9c669e0d: Layer already exists
b83f3896f5db: Layer already exists
bf34ccf18e48: Layer already exists
9965d3d55939: Layer already exists
ffd580ad3e4d: Layer already exists
d73a55f69969: Layer already exists
597348be90c9: Layer already exists
07cf959c7f23: Layer already exists
fccc9f1a5e4e: Layer already exists
d032a72fa56d: Layer already exists
fe108eef54ea: Layer already exists
df4dc71f749c: Layer already exists
64b4e3ecc0d6: Layer already exists
bdc3a0723efa: Layer already exists
f2cb0ecef392: Layer already exists
8: digest: sha256:bf5680a4aa707ae0d560b0e5e202b714d9ed06d358e8a8e4ad8149821c80d776 size: 4919
{"message": "", "error": "internal_error"}
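For context: the build and image push complete successfully, and the final "internal_error" is returned only at the very end. One thing the log does flag is "scrapinghub.yml is not found, assume it's a Scrapy project", so the platform is autodetecting the project layout. Adding a minimal scrapinghub.yml to the repository root makes the project ID and stack explicit. The sketch below is an assumption pieced together from values visible in the log above (project 479387, stack scrapinghub-stack-scrapy:2.0), not a confirmed fix for the error:

```yaml
# scrapinghub.yml — hypothetical example, placed in the repository root.
# Values are taken from the build log and may need adjusting for your account.
project: 479387          # Scrapy Cloud project ID (kumo_project/479387 in the log)
stacks:
  default: scrapy:2.0    # matches FROM scrapinghub/scrapinghub-stack-scrapy:2.0
requirements:
  file: requirements.txt # the log already uses the default location
```

Since the error comes after a successful push, it may be server-side regardless of configuration.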
0 Votes
0 Comments