JHonvadoe / job_statistic

job_statistic_data


Contributors: JHonvadoe

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 3 is python-3.6.2 (you are using python-3.6.4, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-3.6.2).
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-3.6.4
-----> Installing pip
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 8))
         Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 9))
         Downloading cssselect-0.9.1.tar.gz
       Collecting requests-html (from -r /tmp/build/requirements.txt (line 10))
         Downloading requests_html-0.8.2-py2.py3-none-any.whl
       Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 11))
         Downloading beautifulsoup4-4.6.0-py3-none-any.whl (86kB)
       Collecting fake-useragent (from -r /tmp/build/requirements.txt (line 12))
         Downloading fake-useragent-0.1.10.tar.gz
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting bs4 (from requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading bs4-0.0.1.tar.gz
       Collecting w3lib (from requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading w3lib-1.19.0-py2.py3-none-any.whl
       Collecting pyppeteer (from requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading pyppeteer-0.0.14.tar.gz (1.2MB)
       Collecting pyquery (from requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading pyquery-1.4.0-py2.py3-none-any.whl
       Collecting parse (from requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading parse-1.8.2.tar.gz
       Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading certifi-2018.1.18-py2.py3-none-any.whl (151kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting six>=1.4.1 (from w3lib->requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading six-1.11.0-py2.py3-none-any.whl
       Collecting pyee (from pyppeteer->requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading pyee-5.0.0-py2.py3-none-any.whl
       Collecting websockets (from pyppeteer->requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading websockets-4.0.1-cp36-cp36m-manylinux1_x86_64.whl (81kB)
       Installing collected packages: dumptruck, idna, certifi, chardet, urllib3, requests, scraperwiki, lxml, cssselect, fake-useragent, beautifulsoup4, bs4, six, w3lib, pyee, websockets, pyppeteer, pyquery, parse, requests-html
         Running setup.py install for dumptruck: started
         Running setup.py install for dumptruck: finished with status 'done'
         Running setup.py develop for scraperwiki
         Running setup.py install for lxml: started
         Running setup.py install for lxml: still running...
         Running setup.py install for lxml: finished with status 'done'
         Running setup.py install for cssselect: started
         Running setup.py install for cssselect: finished with status 'done'
         Running setup.py install for fake-useragent: started
         Running setup.py install for fake-useragent: finished with status 'done'
         Running setup.py install for bs4: started
         Running setup.py install for bs4: finished with status 'done'
         Running setup.py install for pyppeteer: started
         Running setup.py install for pyppeteer: finished with status 'done'
         Running setup.py install for parse: started
         Running setup.py install for parse: finished with status 'done'
       Successfully installed beautifulsoup4-4.6.0 bs4-0.0.1 certifi-2018.1.18 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 fake-useragent-0.1.10 idna-2.6 lxml-3.4.4 parse-1.8.2 pyee-5.0.0 pyppeteer-0.0.14 pyquery-1.4.0 requests-2.18.4 requests-html-0.8.2 scraperwiki six-1.11.0 urllib3-1.22 w3lib-1.19.0 websockets-4.0.1
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
Traceback (most recent call last):
  File "scraper.py", line 4, in <module>
    from requests_html import HTMLSession
  File "/app/.heroku/python/lib/python3.6/site-packages/requests_html.py", line 19, in <module>
    from lxml.html.soupparser import fromstring as soup_parse
  File "/app/.heroku/python/lib/python3.6/site-packages/lxml/html/soupparser.py", line 19, in <module>
    from BeautifulSoup import \
ModuleNotFoundError: No module named 'BeautifulSoup'
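The traceback shows the run fails before any scraping happens: requests_html 0.8.2 imports lxml.html.soupparser, and the pinned lxml==3.4.4 ships a soupparser that imports the legacy Python 2 BeautifulSoup (BS3) package rather than the installed beautifulsoup4, so the import chain breaks at line 4 of scraper.py. A likely fix is raising or removing the lxml pin in requirements.txt so a release whose soupparser supports bs4 is installed. Below is a minimal sketch of the pattern scraper.py appears to follow; only the HTMLSession import is visible in the log, so the target URL, CSS selector, and saved fields are illustrative assumptions, not the scraper's actual code.

import scraperwiki
from requests_html import HTMLSession

# Hypothetical source URL -- the real scraper's target is not shown in the log.
SOURCE_URL = "https://example.com/job-statistics"

session = HTMLSession()
response = session.get(SOURCE_URL)

# "a.job-listing" is an illustrative selector; save one row per matched link.
for link in response.html.find("a.job-listing"):
    scraperwiki.sqlite.save(
        unique_keys=["url"],
        data={"title": link.text, "url": link.attrs.get("href")},
    )

With lxml still pinned at 3.4.4, even this minimal version would fail at the requests_html import, matching the traceback above.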

Statistics

Total run time: 2 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 21.8 KB

History

  • Manually ran revision e39464ae and failed.
  • Created on morph.io

Scraper code: job_statistic