Contributors: blablupcom, woodbine

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Installing requirements with pip
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 1))
  Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
  Running command git clone -q http://github.com/openaustralia/scraperwiki-python.git /app/.heroku/src/scraperwiki
  Running command git checkout -b morph_defaults --track origin/morph_defaults
  Switched to a new branch 'morph_defaults'
  Branch morph_defaults set up to track remote branch morph_defaults from origin.
Collecting lxml==3.4.4
  Downloading lxml-3.4.4.tar.gz (3.5 MB)
Collecting cssselect==0.9.1
  Downloading cssselect-0.9.1.tar.gz (32 kB)
Collecting beautifulsoup4
  Downloading beautifulsoup4-4.8.2-py2-none-any.whl (106 kB)
Collecting dumptruck>=0.1.2
  Downloading dumptruck-0.1.6.tar.gz (15 kB)
Collecting requests
  Downloading requests-2.23.0-py2.py3-none-any.whl (58 kB)
Collecting soupsieve>=1.2
  Downloading soupsieve-1.9.5-py2.py3-none-any.whl (33 kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Downloading urllib3-1.25.8-py2.py3-none-any.whl (125 kB)
Collecting certifi>=2017.4.17
  Downloading certifi-2019.11.28-py2.py3-none-any.whl (156 kB)
Collecting chardet<4,>=3.0.2
  Downloading chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting idna<3,>=2.5
  Downloading idna-2.9-py2.py3-none-any.whl (58 kB)
Collecting backports.functools-lru-cache; python_version < "3"
  Downloading backports.functools_lru_cache-1.6.1-py2.py3-none-any.whl (5.7 kB)
Building wheels for collected packages: lxml, cssselect, dumptruck
  Building wheel for lxml (setup.py): started
  Building wheel for lxml (setup.py): still running...
  Building wheel for lxml (setup.py): finished with status 'done'
  Created wheel for lxml: filename=lxml-3.4.4-cp27-cp27mu-linux_x86_64.whl size=2987187 sha256=4e4d410bcd38000465286d37506659dd53fbb17e2a23ae65e7e53953190da4ea
  Stored in directory: /tmp/pip-ephem-wheel-cache-VfnD5L/wheels/d6/de/81/11ae6edd05c75aac677e67dd154c85da758ba6f3e8e80e962e
  Building wheel for cssselect (setup.py): started
  Building wheel for cssselect (setup.py): finished with status 'done'
  Created wheel for cssselect: filename=cssselect-0.9.1-py2-none-any.whl size=26993 sha256=3e3a8da81ff2292fe02e9075a3402d26782a6b8627c1549dc9511b8c3b9b43cf
  Stored in directory: /tmp/pip-ephem-wheel-cache-VfnD5L/wheels/85/fe/00/b94036d8583cec9791d8cda24c184f2d2ac1397822f7f0e8d4
  Building wheel for dumptruck (setup.py): started
  Building wheel for dumptruck (setup.py): finished with status 'done'
  Created wheel for dumptruck: filename=dumptruck-0.1.6-py2-none-any.whl size=11842 sha256=be77c9f8a078b751c1965daa95b5834aa9646e6acc4b02427965689c6a0c95aa
  Stored in directory: /tmp/pip-ephem-wheel-cache-VfnD5L/wheels/dc/75/e9/1e61c4080c73e7bda99614549591f83b53bcc2d682f26fce62
Successfully built lxml cssselect dumptruck
Installing collected packages: dumptruck, urllib3, certifi, chardet, idna, requests, scraperwiki, lxml, cssselect, backports.functools-lru-cache, soupsieve, beautifulsoup4
  Running setup.py develop for scraperwiki
Successfully installed backports.functools-lru-cache-1.6.1 beautifulsoup4-4.8.2 certifi-2019.11.28 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.9 lxml-3.4.4 requests-2.23.0 scraperwiki soupsieve-1.9.5 urllib3-1.25.8
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7.
More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support

-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
Traceback (most recent call last):
  File "scraper.py", line 94, in <module>
    html = urllib2.urlopen(url)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 154, in urlopen
    return opener.open(url, data, timeout)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 435, in open
    response = meth(req, response)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 548, in http_response
    'http', request, response, code, msg, hdrs)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 467, in error
    result = self._call_chain(*args)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 654, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 435, in open
    response = meth(req, response)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 548, in http_response
    'http', request, response, code, msg, hdrs)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 467, in error
    result = self._call_chain(*args)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 654, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 435, in open
    response = meth(req, response)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 548, in http_response
    'http', request, response, code, msg, hdrs)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 473, in error
    return self._call_chain(*args)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/app/.heroku/python/lib/python2.7/urllib2.py", line 556, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 404: Not Found
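The run fails because a single HTTP 404 (reached after two redirects) propagates out of urllib2.urlopen and aborts the whole scrape. A minimal sketch of a more defensive fetch, written for Python 3's urllib (the scraper itself runs on Python 2's urllib2), with fetch as a hypothetical helper name rather than code from this scraper:

```python
from urllib.error import HTTPError
from urllib.request import urlopen

def fetch(url, opener=urlopen):
    """Open url; on an HTTP error (e.g. a 404 behind redirects),
    log and return None instead of letting the exception kill the run."""
    try:
        return opener(url)
    except HTTPError as exc:
        print("skipping {}: HTTP {}".format(url, exc.code))
        return None
```

Wrapping each document URL this way would let the scraper record misses and keep going rather than exit with status code 1.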

Data

Downloaded 816 times by SimKennedy, MikeRalphson, and woodbine



rows 10 / 118

d                            f
2019-02-21 14:43:02.294946   NHTRX5NFT_GWASNT_gov_2012_01
2019-02-21 14:43:02.841367   NHTRX5NFT_GWASNT_gov_2012_02
2019-02-21 14:43:03.421929   NHTRX5NFT_GWASNT_gov_2012_03
2019-02-21 14:43:03.928843   NHTRX5NFT_GWASNT_gov_2012_04
2019-02-21 14:43:04.455639   NHTRX5NFT_GWASNT_gov_2012_05
2019-02-21 14:43:05.007979   NHTRX5NFT_GWASNT_gov_2012_06
2019-02-21 14:43:05.511090   NHTRX5NFT_GWASNT_gov_2012_07
2019-02-21 14:43:06.004136   NHTRX5NFT_GWASNT_gov_2012_08
2019-02-21 14:43:06.552238   NHTRX5NFT_GWASNT_gov_2012_09
2019-02-21 14:43:07.090166   NHTRX5NFT_GWASNT_gov_2012_10
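The f identifiers above follow a fixed pattern: a publisher code followed by the year and a zero-padded month number. A sketch of how such a monthly sequence could be generated (make_ids is a hypothetical helper for illustration, not code from this scraper):

```python
def make_ids(prefix, year, months):
    """Build identifiers like NHTRX5NFT_GWASNT_gov_2012_01, one per month."""
    return ["{}_{}_{:02d}".format(prefix, year, m) for m in months]

ids = make_ids("NHTRX5NFT_GWASNT_gov", 2012, range(1, 11))
# first entry: NHTRX5NFT_GWASNT_gov_2012_01
```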

Statistics

Average successful run time: 4 minutes

Total run time: 2 days

Total cpu time used: 17 minutes

Total disk space used: 57.2 KB

History

  • Auto ran revision 6fa7e410 and failed.
    Nothing changed in the database.
  • Auto ran revision 6fa7e410 and failed.
    Nothing changed in the database.
  • Auto ran revision 6fa7e410 and failed.
    Nothing changed in the database.
  • Auto ran revision 6fa7e410 and failed.
    Nothing changed in the database.
  • Auto ran revision 6fa7e410 and failed.
    Nothing changed in the database.
  • ...
  • Created on morph.io


Scraper code

Python

sp_NHTRX5NFT_GWASNT_gov / scraper.py