JHonvadoe / job_statistic

job_statistic_data


Contributors JHonvadoe

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 3 is python-3.6.2 (you are using python-3.6.4, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-3.6.2).
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-3.6.4
-----> Installing pip
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
       Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 8))
         Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 9))
         Downloading cssselect-0.9.1.tar.gz
       Collecting requests-html (from -r /tmp/build/requirements.txt (line 10))
         Downloading requests_html-0.8.2-py2.py3-none-any.whl
       Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 11))
         Downloading beautifulsoup4-4.6.0-py3-none-any.whl (86kB)
       Collecting fake-useragent (from -r /tmp/build/requirements.txt (line 12))
         Downloading fake-useragent-0.1.10.tar.gz
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting bs4 (from requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading bs4-0.0.1.tar.gz
       Collecting w3lib (from requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading w3lib-1.19.0-py2.py3-none-any.whl
       Collecting pyppeteer (from requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading pyppeteer-0.0.14.tar.gz (1.2MB)
       Collecting pyquery (from requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading pyquery-1.4.0-py2.py3-none-any.whl
       Collecting parse (from requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading parse-1.8.2.tar.gz
       Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading certifi-2018.1.18-py2.py3-none-any.whl (151kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting six>=1.4.1 (from w3lib->requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading six-1.11.0-py2.py3-none-any.whl
       Collecting pyee (from pyppeteer->requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading pyee-5.0.0-py2.py3-none-any.whl
       Collecting websockets (from pyppeteer->requests-html->-r /tmp/build/requirements.txt (line 10))
         Downloading websockets-4.0.1-cp36-cp36m-manylinux1_x86_64.whl (81kB)
       Installing collected packages: dumptruck, idna, certifi, chardet, urllib3, requests, scraperwiki, lxml, cssselect, fake-useragent, beautifulsoup4, bs4, six, w3lib, pyee, websockets, pyppeteer, pyquery, parse, requests-html
         Running setup.py install for dumptruck: started
         Running setup.py install for dumptruck: finished with status 'done'
         Running setup.py develop for scraperwiki
         Running setup.py install for lxml: started
         Running setup.py install for lxml: still running...
         Running setup.py install for lxml: finished with status 'done'
         Running setup.py install for cssselect: started
         Running setup.py install for cssselect: finished with status 'done'
         Running setup.py install for fake-useragent: started
         Running setup.py install for fake-useragent: finished with status 'done'
         Running setup.py install for bs4: started
         Running setup.py install for bs4: finished with status 'done'
         Running setup.py install for pyppeteer: started
         Running setup.py install for pyppeteer: finished with status 'done'
         Running setup.py install for parse: started
         Running setup.py install for parse: finished with status 'done'
       Successfully installed beautifulsoup4-4.6.0 bs4-0.0.1 certifi-2018.1.18 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 fake-useragent-0.1.10 idna-2.6 lxml-3.4.4 parse-1.8.2 pyee-5.0.0 pyppeteer-0.0.14 pyquery-1.4.0 requests-2.18.4 requests-html-0.8.2 scraperwiki six-1.11.0 urllib3-1.22 w3lib-1.19.0 websockets-4.0.1
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
Traceback (most recent call last):
  File "scraper.py", line 4, in <module>
    from requests_html import HTMLSession
  File "/app/.heroku/python/lib/python3.6/site-packages/requests_html.py", line 19, in <module>
    from lxml.html.soupparser import fromstring as soup_parse
  File "/app/.heroku/python/lib/python3.6/site-packages/lxml/html/soupparser.py", line 7, in <module>
    from BeautifulSoup import \
ModuleNotFoundError: No module named 'BeautifulSoup'
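
The failure is in the import chain rather than in the scraper's own logic: requests_html imports lxml.html.soupparser, and in lxml 3.4.4 that module does "from BeautifulSoup import ...", i.e. it expects the Python 2-only BeautifulSoup 3 package, which is not (and cannot be) installed on Python 3. Raising the lxml pin on line 8 of requirements.txt to a newer release, whose soupparser imports from bs4 instead, would likely resolve this; another option is to bypass the requests_html import path altogether. The sketch below is a minimal, hypothetical workaround and not the actual scraper.py: it assumes the scraper only needs to fetch one page and select elements, and it relies solely on requests and BeautifulSoup (bs4), both of which the log above shows as installed. The URL and the CSS selector are placeholders.

    import requests
    from bs4 import BeautifulSoup

    URL = "https://example.com/jobs"  # placeholder; not the scraper's real target

    def fetch_rows(url):
        # Plain requests avoids importing requests_html, so the broken
        # lxml.html.soupparser path is never touched.
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        # "html.parser" is the stdlib backend, so this works regardless of
        # which lxml version is installed.
        soup = BeautifulSoup(response.text, "html.parser")
        # Placeholder selector; the real scraper would target its own markup.
        return [row.get_text(strip=True) for row in soup.select("table tr")]

    if __name__ == "__main__":
        for row in fetch_rows(URL):
            print(row)

If requests_html itself is needed (for example for JavaScript rendering via pyppeteer), loosening the lxml==3.4.4 pin so that pip installs a current lxml release should be the more direct fix, since recent versions of lxml.html.soupparser import from bs4.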

Statistics

Total run time: 2 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 21.8 KB

History

  • Manually ran revision e39464ae and failed.
  • Created on morph.io

Scraper code

job_statistic