Elda Di Matteo
Hi,
I'm trying to deploy a Scrapy spider, but I get the following error when I run shub deploy:
(venv) (base) macbook@macbooks-MacBook-Pro gmaps % shub deploy
Packing version 1.0
Deploying to Scrapy Cloud project "566510"
Deploy log last 30 lines:
---> [Warning] Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
---> Running in 8f3fdff05c9e
Removing intermediate container 8f3fdff05c9e
---> 984c311877b2
Step 12/12 : ENV PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
---> [Warning] Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
---> Running in e77c21d190e7
Removing intermediate container e77c21d190e7
---> 2d8edfd8dec0
Successfully built 2d8edfd8dec0
Successfully tagged i.scrapinghub.com/kumo_project/566510:8
Step 1/3 : FROM alpine:3.5
---> f80194ae2e0c
Step 2/3 : ADD kumo-entrypoint /kumo-entrypoint
---> Using cache
---> fbd31fea70b1
Step 3/3 : RUN chmod +x /kumo-entrypoint
---> Using cache
---> 3565017f646e
Successfully built 3565017f646e
Successfully tagged kumo-entrypoint:latest
Entrypoint container is created successfully
>>> Checking python dependencies
Collecting pip<20.0,>=9.0.3
WARNING: There're some errors when doing pip-check:
Could not find a version that satisfies the requirement pip<20.0,>=9.0.3 (from versions: )
No matching distribution found for pip<20.0,>=9.0.3
{"message": "Dependencies check exit code: 1", "details": "Pip checks failed, please fix the conflicts", "error": "requirements_error"}
{"status": "error", "message": "Requirements error"}
Deploy log location: /var/folders/z0/0f3bfffx6xx146gbvw_0wcmc0000gn/T/shub_deploy_4pbnv7n0.log
Error: Deploy failed: b'{"status": "error", "message": "Requirements error"}'
I'm using:
python 3.9
scrapy 2.5.1
scrapy-splash 0.8.0
pip 19.3.1
and the scrapinghub.yml file contains this:
project: 566510
How can I fix this?
Thanks!
nestor
Hi,
Your project is running on a very old stack ("hworker:20160708"); you need to redeploy your project with an updated stack.
Follow the instructions here: https://support.zyte.com/support/solutions/articles/22000200402-changing-the-deploy-environment-with-scrapy-cloud-stacks
You'll need to modify your scrapinghub.yml file to be something like this:
project: xxxxxx
stack: scrapy:x.x
Available stacks can be found here: https://github.com/scrapinghub/scrapinghub-stack-scrapy/tags
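For reference, a filled-in scrapinghub.yml for this thread's project might look like the following (the scrapy:2.5 tag here is just an example; pick whichever tag from the list above matches the Scrapy version you need):

```yaml
# scrapinghub.yml at the root of the Scrapy project
project: 566510    # your Scrapy Cloud project ID
stack: scrapy:2.5  # a tag from the scrapinghub-stack-scrapy tag list
```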
Elda Di Matteo
Hi,
thanks for your reply!
How do you know that my project is running on a very old stack?
I added this to the yml file:
https://monosnap.com/file/Wa42eD8RfeXvWOmXO4pg1eGX9js0XK
but now I'm getting this error:
(venv) (base) macbook@macbooks-MacBook-Pro gmaps % shub deploy
Traceback (most recent call last):
File "/Users/macbook/PycharmProjects/gmaps/venv/bin/shub", line 8, in <module>
sys.exit(cli())
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/click/core.py", line 1137, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/shub/deploy.py", line 70, in cli
conf, image = load_shub_config(), None
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/shub/config.py", line 507, in load_shub_config
conf.load_file(closest_sh_yml)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/shub/config.py", line 133, in load_file
self.load(f)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/shub/config.py", line 87, in load
option_conf.update(yaml_option_conf)
ValueError: dictionary update sequence element #0 has length 1; 2 is required
Any idea?
Thanks!
nestor
In your scrapinghub.yml, change "stacks" to "stack" (no quotes).
Also, there is no 2.5.1 stack; you need to use one of the defined stack tags from this list: https://github.com/scrapinghub/scrapinghub-stack-scrapy/tags . The scrapy:2.5 stack uses Scrapy 2.5.1.
So it should be:
stack: scrapy:2.5
To see the stack used on each deploy, click on any of the deployments in the "Code & Deploys" section of your project.
Also, your organization was created before the date specified in the article I linked, so it was using that old stack by default.
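As a side note on the earlier traceback: that ValueError is standard dict.update() behavior in Python. Judging from the traceback (shub/config.py calling option_conf.update(...)), shub expects the value of that option to be a mapping, so a plain string like scrapy:2.5 under the wrong key makes update() iterate the string character by character and fail. A minimal sketch of the mechanism (the connection to shub's internals is my inference from the traceback, not confirmed):

```python
# Reproducing the ValueError from the traceback with plain dict.update().
# Passing a string where a mapping is expected makes update() iterate the
# string's characters; each single character is a "sequence element" of
# length 1, but update() needs (key, value) pairs of length 2.
conf = {}
try:
    conf.update("scrapy:2.5")  # a string where a mapping was expected
except ValueError as err:
    print(err)  # -> dictionary update sequence element #0 has length 1; 2 is required
```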
Elda Di Matteo
Oh ok, thanks again!
It is working now!
Really appreciate your help!