Contributors: blablupcom, woodbine

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 2 is python-2.7.14 (you are using python-2.7.6, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-2.7.14).
       Learn More:
-----> Installing python-2.7.6
-----> Installing pip
-----> Installing requirements with pip
       Obtaining scraperwiki from git+ (from -r /tmp/build/requirements.txt (line 1))
         Cloning (to morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 2))
         /app/.heroku/python/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see SNIMissingWarning
         /app/.heroku/python/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see InsecurePlatformWarning
         Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 3))
         Downloading cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 4))
         Downloading beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
       Collecting python-dateutil (from -r /tmp/build/requirements.txt (line 5))
         Downloading python_dateutil-2.6.1-py2.py3-none-any.whl (194kB)
       Collecting requests[security] (from -r /tmp/build/requirements.txt (line 6))
         Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
         Downloading dumptruck-0.1.6.tar.gz
       Collecting six>=1.5 (from python-dateutil->-r /tmp/build/requirements.txt (line 5))
         Downloading six-1.11.0-py2.py3-none-any.whl
       Collecting chardet<3.1.0,>=3.0.2 (from requests[security]->-r /tmp/build/requirements.txt (line 6))
         Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Collecting certifi>=2017.4.17 (from requests[security]->-r /tmp/build/requirements.txt (line 6))
         Downloading certifi-2018.1.18-py2.py3-none-any.whl (151kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests[security]->-r /tmp/build/requirements.txt (line 6))
         Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting idna<2.7,>=2.5 (from requests[security]->-r /tmp/build/requirements.txt (line 6))
         Downloading idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting cryptography>=1.3.4; extra == "security" (from requests[security]->-r /tmp/build/requirements.txt (line 6))
         Downloading cryptography-2.1.4-cp27-cp27m-manylinux1_x86_64.whl (2.2MB)
       Collecting pyOpenSSL>=0.14; extra == "security" (from requests[security]->-r /tmp/build/requirements.txt (line 6))
         Downloading pyOpenSSL-17.5.0-py2.py3-none-any.whl (53kB)
       Collecting cffi>=1.7; platform_python_implementation != "PyPy" (from cryptography>=1.3.4; extra == "security"->requests[security]->-r /tmp/build/requirements.txt (line 6))
         Downloading cffi-1.11.4-cp27-cp27m-manylinux1_x86_64.whl (407kB)
       Collecting enum34; python_version < "3" (from cryptography>=1.3.4; extra == "security"->requests[security]->-r /tmp/build/requirements.txt (line 6))
         Downloading enum34-1.1.6-py2-none-any.whl
       Collecting ipaddress; python_version < "3" (from cryptography>=1.3.4; extra == "security"->requests[security]->-r /tmp/build/requirements.txt (line 6))
         Downloading ipaddress-1.0.19.tar.gz
       Collecting asn1crypto>=0.21.0 (from cryptography>=1.3.4; extra == "security"->requests[security]->-r /tmp/build/requirements.txt (line 6))
         Downloading asn1crypto-0.24.0-py2.py3-none-any.whl (101kB)
       Collecting pycparser (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=1.3.4; extra == "security"->requests[security]->-r /tmp/build/requirements.txt (line 6))
         Downloading pycparser-2.18.tar.gz (245kB)
       Installing collected packages: dumptruck, chardet, certifi, urllib3, idna, pycparser, cffi, enum34, six, ipaddress, asn1crypto, cryptography, pyOpenSSL, requests, scraperwiki, lxml, cssselect, beautifulsoup4, python-dateutil
         Running setup.py install for dumptruck: started
         Running setup.py install for dumptruck: finished with status 'done'
         Running setup.py install for pycparser: started
         Running setup.py install for pycparser: finished with status 'done'
         Running setup.py install for ipaddress: started
         Running setup.py install for ipaddress: finished with status 'done'
         Running setup.py develop for scraperwiki
         Running setup.py install for lxml: started
         Running setup.py install for lxml: still running...
         Running setup.py install for lxml: finished with status 'done'
         Running setup.py install for cssselect: started
         Running setup.py install for cssselect: finished with status 'done'
       Successfully installed asn1crypto-0.24.0 beautifulsoup4-4.6.0 certifi-2018.1.18 cffi-1.11.4 chardet-3.0.4 cryptography-2.1.4 cssselect-0.9.1 dumptruck-0.1.6 enum34-1.1.6 idna-2.6 ipaddress-1.0.19 lxml-3.4.4 pyOpenSSL-17.5.0 pycparser-2.18 python-dateutil-2.6.1 requests-2.18.4 scraperwiki six-1.11.0 urllib3-1.22
 !     Hello! It looks like your application is using an outdated version of Python.
 !     This caused the security warning you saw above during the 'pip install' step.
 !     We recommend 'python-3.6.2', which you can specify in a 'runtime.txt' file.
 !     -- Much Love, Heroku.
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
Error validating URL. DOH033_NIFHA_gov_2017_11 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2017_10 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2017_09 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2017_08 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2017_07 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2017_06 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2017_05 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2017_04 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2017_03 *Error: Invalid URL*
DOH033_NIFHA_gov_2017_02
DOH033_NIFHA_gov_2017_01
DOH033_NIFHA_gov_2016_12
DOH033_NIFHA_gov_2016_11
DOH033_NIFHA_gov_2016_10
DOH033_NIFHA_gov_2016_09
DOH033_NIFHA_gov_2016_08
DOH033_NIFHA_gov_2016_07
DOH033_NIFHA_gov_2016_06
DOH033_NIFHA_gov_2016_05
DOH033_NIFHA_gov_2016_04
Error validating URL. DOH033_NIFHA_gov_2016_03 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2016_02 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2016_01 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2015_12 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2015_11 *Error: Invalid URL*
Error validating URL. DOH033_NIFHA_gov_2015_10 *Error: Invalid URL*
DOH033_NIFHA_gov_2015_09
DOH033_NIFHA_gov_2015_08
DOH033_NIFHA_gov_2015_07
DOH033_NIFHA_gov_2015_06
DOH033_NIFHA_gov_2015_05
DOH033_NIFHA_gov_2015_04
DOH033_NIFHA_gov_2015_03
DOH033_NIFHA_gov_2015_02
DOH033_NIFHA_gov_2015_01
DOH033_NIFHA_gov_2014_12
DOH033_NIFHA_gov_2014_11
DOH033_NIFHA_gov_2014_10
DOH033_NIFHA_gov_2014_09
DOH033_NIFHA_gov_2014_08
DOH033_NIFHA_gov_2014_07
DOH033_NIFHA_gov_2014_06
DOH033_NIFHA_gov_2014_05
DOH033_NIFHA_gov_2014_04
DOH033_NIFHA_gov_2014_03
DOH033_NIFHA_gov_2014_02
DOH033_NIFHA_gov_2014_01
DOH033_NIFHA_gov_2013_12
DOH033_NIFHA_gov_2013_11
DOH033_NIFHA_gov_2013_10
DOH033_NIFHA_gov_2013_09
DOH033_NIFHA_gov_2013_08
DOH033_NIFHA_gov_2013_07
DOH033_NIFHA_gov_2013_06
DOH033_NIFHA_gov_2013_05
DOH033_NIFHA_gov_2013_04
DOH033_NIFHA_gov_2013_03
DOH033_NIFHA_gov_2013_02
DOH033_NIFHA_gov_2013_01
DOH033_NIFHA_gov_2012_12
DOH033_NIFHA_gov_2012_11
DOH033_NIFHA_gov_2012_10
DOH033_NIFHA_gov_2012_09
DOH033_NIFHA_gov_2012_08
DOH033_NIFHA_gov_2012_07
DOH033_NIFHA_gov_2012_06
DOH033_NIFHA_gov_2012_05
DOH033_NIFHA_gov_2012_04
Traceback (most recent call last):
  File "", line 130, in <module>
    raise Exception("%d errors occurred during scrape." % errors)
Exception: 15 errors occurred during scrape.
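The loop implied by the log above can be sketched as follows. This is a hypothetical reconstruction: the helper names (`is_valid_url`, `run_scrape`) and the `(block_id, url)` input shape are assumptions, and only the printed messages and the final exception mirror the scraper's actual output.

```python
# Hypothetical reconstruction of the validate-then-scrape loop seen in the log.
from urllib.parse import urlparse

def is_valid_url(url):
    """Accept only absolute http(s) URLs that name a host."""
    parts = urlparse(url or "")
    return parts.scheme in ("http", "https") and bool(parts.netloc)

def run_scrape(periods):
    """periods: iterable of (block_id, url) pairs, one per monthly spend file."""
    errors = 0
    for block_id, url in periods:
        print(block_id)
        if not is_valid_url(url):
            print("Error validating URL.")
            print("*Error: Invalid URL*")
            errors += 1
            continue
        # ...download the spend file and save its rows here (assumed step)...
    # Mirrors the traceback: fail the run only after every period was attempted.
    if errors > 0:
        raise Exception("%d errors occurred during scrape." % errors)
```

Raising only after the full pass matches the log's behaviour, where valid periods are still scraped before the run is marked failed with the total error count.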


Downloaded 512 times by SimKennedy, MikeRalphson, woodbine


Download table (as CSV) · Download SQLite database (31 KB) · Use the API
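Besides the direct downloads, the data can be queried over morph.io's API. The endpoint pattern (`api.morph.io/<owner>/<scraper>/data.json`) follows morph.io's documented API; the owner name and API key below are placeholders, and the column names come from the table preview on this page.

```python
# Sketch of building a morph.io API query URL for this scraper's data.
from urllib.parse import urlencode

def morph_api_url(owner, scraper, query, api_key):
    """Build a data.json query URL for a morph.io scraper."""
    base = "https://api.morph.io/%s/%s/data.json" % (owner, scraper)
    return base + "?" + urlencode({"key": api_key, "query": query})

url = morph_api_url("OWNER", "sp_DOH033_NIFHA_gov",
                    "select d, l, f from data limit 10", "YOUR_API_KEY")
print(url)
```

The returned URL can then be fetched with any HTTP client; the response is plain JSON rows.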

rows 10 / 55

d                           l    f
2015-09-20 03:00:03.559731       Year 2015-2016/Transparency-of-Spend-over-25k-April-2015.csv
2015-11-08 23:32:18.885911       Year 2015-2016/transparency-of-spend-over-25k-october-2015.xls
2018-01-22 05:49:39.695518
2018-01-22 05:49:41.685872
2018-01-22 05:49:42.910205
2018-01-22 05:49:44.283562
2018-01-22 05:49:48.065179
2018-01-22 05:49:49.702324
2018-01-22 05:49:51.081795
2018-01-22 05:49:52.458977
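The downloaded SQLite database can be read with the standard library. The table name "data" is morph.io's conventional default and an assumption here, as is the local filename; d, l, and f are the column names shown in the preview above.

```python
# Minimal sketch of querying the scraper's SQLite output.
import sqlite3

def read_spend_rows(db_path, limit=10):
    """Return up to `limit` (d, l, f) rows ordered by timestamp."""
    conn = sqlite3.connect(db_path)
    try:
        return conn.execute(
            "select d, l, f from data order by d limit ?", (limit,)
        ).fetchall()
    finally:
        conn.close()
```

For example, `read_spend_rows("data.sqlite")` would return the first ten rows shown in the preview, assuming the database was saved under that name.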


Average successful run time: 3 minutes

Total run time: about 1 month

Total cpu time used: 44 minutes

Total disk space used: 58.2 KB


  • Auto ran revision c50f7c47 and failed.
    53 records added, 53 records removed in the database
  • Auto ran revision c50f7c47 and failed.
    53 records added, 53 records removed in the database
    54 pages scraped
  • Auto ran revision c50f7c47 and failed.
    53 records added, 53 records removed in the database
  • Auto ran revision c50f7c47 and failed.
    53 records added, 53 records removed in the database
  • Auto ran revision c50f7c47 and failed.
    nothing changed in the database
  • ...
  • Created on


Scraper code


sp_DOH033_NIFHA_gov