Hi,
I can deploy my project successfully. But when I add requirements.txt and scrapinghub.yml to my project and deploy it,
I get this error:
Error: Deploy failed: {"status": "error", "message": "Internal error"}
This is what the log shows:
Successfully built a77c03e9ec1e
Entrypoint container is created successfully
>>> Checking python dependencies
No broken requirements found.
>>> Getting spiders list:
>>> Trying to get spiders from shub-image-info command
WARNING: There're some errors on shub-image-info call:
Exceeded container timeout 60s
{"message": "shub-image-info exit code: -1", "details": null, "error": "image_info_error"}
I had deployed my project with requirements.txt and scrapinghub.yml before, and it worked.
Why can't I deploy my project again with the same scrapinghub.yml and requirements.txt?
I checked my project ID and pymongo version, and they are correct.
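For reference, the two files look roughly like this (the project ID and pymongo version below are placeholders, not my real values; the scrapy:1.4 stack matches the build log):

scrapinghub.yml:

projects:
  default: 123456
stacks:
  default: scrapy:1.4
requirements:
  file: requirements.txt

requirements.txt:

pymongo==3.6.0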
Any ideas?
Thanks in advance.
motogod19 posted over 6 years ago Best Answer
https://github.com/scrapinghub/shub/issues/273
2 Comments
motogod19 posted over 6 years ago
I ran some tests and found that my issue is the 'Exceeded container timeout 60s' error.
When I create a simple project (just 2 spiders), I can deploy it with my settings.
But when I create a project with a lot of spiders and deploy it to Scrapinghub, it shows the 'Exceeded container timeout 60s' error, and I can see that pymongo saved part of the data on the server.
Why? How can I fix this issue?
Any help would be appreciated. Thanks in advance.
To summarize: with just 2 spiders, the deploy succeeds. With a lot of spiders, the deploy saves data to my MongoDB on the server but still fails in the end.
Here is the full deploy log:
Login succeeded
Building an image:
Step 1/12 : FROM scrapinghub/scrapinghub-stack-scrapy:1.4
# Executing 2 build triggers...
Step 1/1 : ENV PIP_TRUSTED_HOST $PIP_TRUSTED_HOST PIP_INDEX_URL $PIP_INDEX_URL
 ---> Using cache
Step 1/1 : RUN test -n $APT_PROXY && echo 'Acquire::http::Proxy \"$APT_PROXY\";' >/etc/apt/apt.conf.d/proxy
 ---> Using cache
 ---> 09db15b62bc7
Step 2/12 : ENV PYTHONUSERBASE /app/python
 ---> Using cache
 ---> f930abf17209
Step 3/12 : ADD eggbased-entrypoint /usr/local/sbin/
 ---> Using cache
 ---> 81edef949918
Step 4/12 : ADD run-pipcheck /usr/local/bin/
 ---> Using cache
 ---> 7b3f6f2adddd
Step 5/12 : RUN chmod +x /usr/local/bin/run-pipcheck
 ---> Using cache
 ---> 74ea0969bfb1
Step 6/12 : RUN chmod +x /usr/local/sbin/eggbased-entrypoint && ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/start-crawl && ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/scrapy-list && ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/shub-image-info && ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/run-pipcheck
 ---> Using cache
 ---> 4363719ccc55
Step 7/12 : ADD requirements.txt /app/requirements.txt
 ---> Using cache
 ---> 73d9d085419b
Step 8/12 : RUN mkdir /app/python && chown nobody:nogroup /app/python
 ---> Using cache
 ---> fb44fb66623b
Step 9/12 : RUN sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt
 ---> Using cache
 ---> 645b05197a2b
Step 10/12 : COPY *.egg /app/
 ---> 19a74c56cac6
Removing intermediate container 8f8114d5d7f7
Step 11/12 : RUN if [ -d "/app/addons_eggs" ]; then rm -f /app/*.dash-addon.egg; fi
 ---> Running in 86e243c7409e
 ---> 6cd52cf8e0ca
Removing intermediate container 86e243c7409e
Step 12/12 : ENV PATH /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
 ---> Running in 39a7d28d6ce0
 ---> 6f678ebd9a83
Removing intermediate container 39a7d28d6ce0
Successfully built 6f678ebd9a83
Step 1/3 : FROM alpine:3.5
 ---> 6c6084ed97e5
Step 2/3 : ADD kumo-entrypoint /kumo-entrypoint
 ---> Using cache
 ---> 8191949c871c
Step 3/3 : RUN chmod +x /kumo-entrypoint
 ---> Using cache
 ---> a77c03e9ec1e
Successfully built a77c03e9ec1e
Entrypoint container is created successfully
>>> Checking python dependencies
No broken requirements found.
>>> Getting spiders list:
>>> Trying to get spiders from shub-image-info command
WARNING: There're some errors on shub-image-info call:
Exceeded container timeout 60s
{"message": "shub-image-info exit code: -1", "details": null, "error": "image_info_error"}
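My current guess: shub-image-info has to import every spider module to build the spider list (much like scrapy list does), so any module-level code in the spider files runs during deploy. That would explain both symptoms: module-level pymongo calls write data while the image is being inspected, and with many spiders those imports take longer than the 60-second limit. Below is a minimal sketch of the usual fix, moving the MongoDB work into an item pipeline so the connection is only opened at crawl time; the MONGO_URI and MONGO_DATABASE setting names are placeholders, not from my real project:

import pymongo

class MongoPipeline:
    # Nothing touches MongoDB at import time; the client is created
    # only when a crawl actually starts.

    def __init__(self, mongo_uri, mongo_db):
        self.mongo_uri = mongo_uri
        self.mongo_db = mongo_db
        self.client = None
        self.db = None

    @classmethod
    def from_crawler(cls, crawler):
        # Read connection details from the project settings.
        return cls(
            mongo_uri=crawler.settings.get("MONGO_URI"),
            mongo_db=crawler.settings.get("MONGO_DATABASE", "items"),
        )

    def open_spider(self, spider):
        # Connect here, at crawl time, not at module import time.
        self.client = pymongo.MongoClient(self.mongo_uri)
        self.db = self.client[self.mongo_db]

    def close_spider(self, spider):
        self.client.close()

    def process_item(self, item, spider):
        # One collection per spider; store the scraped item.
        self.db[spider.name].insert_one(dict(item))
        return item

With a pipeline like this, the spider files themselves import instantly, which is all shub-image-info needs to list them.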
motogod19 posted over 6 years ago Answer
https://github.com/scrapinghub/shub/issues/273