Login succeeded
Building an image:
Step 1/12 : FROM scrapinghub/scrapinghub-stack-scrapy:2.0
# Executing 5 build triggers
 ---> Using cache
 ---> Using cache
 ---> Using cache
 ---> Using cache
 ---> Using cache
 ---> 18c33c8bcd01
Step 2/12 : ENV PYTHONUSERBASE=/app/python
 ---> Using cache
 ---> 65ee0a1ec403
Step 3/12 : ADD eggbased-entrypoint /usr/local/sbin/
 ---> Using cache
 ---> c88e36e9f99f
Step 4/12 : ADD run-pipcheck /usr/local/bin/
 ---> Using cache
 ---> 623d7151f134
Step 5/12 : RUN chmod +x /usr/local/bin/run-pipcheck
 ---> Using cache
 ---> 5daa704cbc3a
Step 6/12 : RUN chmod +x /usr/local/sbin/eggbased-entrypoint && ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/start-crawl && ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/scrapy-list && ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/shub-image-info && ln -sf /usr/local/sbin/eggbased-entrypoint /usr/local/sbin/run-pipcheck
 ---> Using cache
 ---> cfd18c311fc0
Step 7/12 : ADD requirements.txt /app/requirements.txt
 ---> Using cache
 ---> 001d04909b43
Step 8/12 : RUN mkdir /app/python && chown nobody:nogroup /app/python
 ---> Using cache
 ---> 6d8049ac1630
Step 9/12 : RUN sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE -E PIP_NO_CACHE_DIR=0 pip install --user --no-cache-dir -r /app/requirements.txt
 ---> Using cache
 ---> 239ae4d46a5d
Step 10/12 : COPY *.egg /app/
 ---> e6c9ff4eafe9
Step 11/12 : RUN if [ -d "/app/addons_eggs" ]; then rm -f /app/*.dash-addon.egg; fi
 ---> [Warning] Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
 ---> Running in ba8db17a653c
Removing intermediate container ba8db17a653c
 ---> 90bea7fa081d
Step 12/12 : ENV PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
 ---> [Warning] Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
 ---> Running in b0293aaa00a1
Removing intermediate container b0293aaa00a1
 ---> 1758cfdccf3a
Successfully built 1758cfdccf3a
Successfully tagged i.scrapinghub.com/kumo_project/474077:22
Step 1/3 : FROM alpine:3.5
 ---> f80194ae2e0c
Step 2/3 : ADD kumo-entrypoint /kumo-entrypoint
 ---> Using cache
 ---> f752570fe999
Step 3/3 : RUN chmod +x /kumo-entrypoint
 ---> Using cache
 ---> e3a4a4d00b61
Successfully built e3a4a4d00b61
Successfully tagged kumo-entrypoint:latest
Entrypoint container is created successfully
>>> Checking python dependencies
Collecting pip<20.0,>=9.0.3
  Downloading pip-19.3.1-py2.py3-none-any.whl (1.4 MB)
Installing collected packages: pip
Successfully installed pip-19.3.1
No broken requirements found.
WARNING: There're some errors when doing pip-check:
WARNING: The scripts pip, pip3 and pip3.8 are installed in '/app/python/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
>>> Getting spiders list:
>>> Trying to get spiders from shub-image-info command
WARNING: There're some errors on shub-image-info call:
ERROR:root:Job runtime exception
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/cmdline.py", line 144, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/crawler.py", line 280, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/crawler.py", line 152, in __init__
    self.spider_loader = self._get_spider_loader(settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/crawler.py", line 146, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/app/python/lib/python3.8/site-packages/scrapy/spiderloader.py", line 68, in from_settings
    return cls(settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/spiderloader.py", line 24, in __init__
    self._load_all_spiders()
  File "/app/python/lib/python3.8/site-packages/scrapy/spiderloader.py", line 51, in _load_all_spiders
    for module in walk_modules(name):
  File "/app/python/lib/python3.8/site-packages/scrapy/utils/misc.py", line 78, in walk_modules
    submod = import_module(fullpath)
  File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 783, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/tmp/unpacked-eggs/__main__.egg/bargross/spiders/Bargross.py", line 42, in <module>
    locale.setlocale(locale.LC_ALL, 'german')
  File "/usr/local/lib/python3.8/locale.py", line 608, in setlocale
    return _setlocale(category, locale)
locale.Error: unsupported locale setting
Traceback (most recent call last):
  File "/usr/local/bin/shub-image-info", line 8, in <module>
    sys.exit(shub_image_info())
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 209, in shub_image_info
    _run_usercode(None, ['scrapy', 'shub_image_info'] + sys.argv[1:],
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/cmdline.py", line 144, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/crawler.py", line 280, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/crawler.py", line 152, in __init__
    self.spider_loader = self._get_spider_loader(settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/crawler.py", line 146, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/app/python/lib/python3.8/site-packages/scrapy/spiderloader.py", line 68, in from_settings
    return cls(settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/spiderloader.py", line 24, in __init__
    self._load_all_spiders()
  File "/app/python/lib/python3.8/site-packages/scrapy/spiderloader.py", line 51, in _load_all_spiders
    for module in walk_modules(name):
  File "/app/python/lib/python3.8/site-packages/scrapy/utils/misc.py", line 78, in walk_modules
    submod = import_module(fullpath)
  File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 783, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/tmp/unpacked-eggs/__main__.egg/bargross/spiders/Bargross.py", line 42, in <module>
    locale.setlocale(locale.LC_ALL, 'german')
  File "/usr/local/lib/python3.8/locale.py", line 608, in setlocale
    return _setlocale(category, locale)
locale.Error: unsupported locale setting
{"message": "shub-image-info exit code: 1", "details": null, "error": "image_info_error"}
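The actual failure in the log is `locale.Error: unsupported locale setting`, raised while `shub-image-info` imports the spiders: `Bargross.py` calls `locale.setlocale(locale.LC_ALL, 'german')` at module import time, and `'german'` is an alias that only resolves if a matching German locale has been generated inside the deploy image, which the stock stack image apparently lacks. One way to make the import survive (a minimal sketch, assuming no particular locale is guaranteed to exist in the container; `set_german_locale` is a hypothetical helper name, not part of the project):

```python
import locale

def set_german_locale():
    """Try common spellings of a German locale; fall back to the portable C locale.

    locale.setlocale raises locale.Error when the requested locale is not
    generated on the system, which is exactly what the traceback shows.
    """
    for name in ("de_DE.UTF-8", "de_DE.utf8", "de_DE", "german"):
        try:
            locale.setlocale(locale.LC_ALL, name)
            return name
        except locale.Error:
            continue
    # Last resort: "C" is always available, so the module import never fails.
    locale.setlocale(locale.LC_ALL, "C")
    return "C"
```

Calling this from the spider (ideally inside `__init__` or `start_requests` rather than at module top level, so a missing locale cannot break spider discovery) keeps `shub-image-info` working even on images without German locale data; number/date formatting then simply degrades to the C locale instead of aborting the deploy.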