Contributors: blablupcom, woodbine

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Installing requirements with pip
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 1))
  Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
  Running command git clone -q http://github.com/openaustralia/scraperwiki-python.git /app/.heroku/src/scraperwiki
  Running command git checkout -b morph_defaults --track origin/morph_defaults
  Switched to a new branch 'morph_defaults'
  Branch morph_defaults set up to track remote branch morph_defaults from origin.
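The build log shows pip reading /tmp/build/requirements.txt, installing scraperwiki as an editable checkout of the morph_defaults branch ("Running setup.py develop" later in the log) plus pinned lxml and cssselect. The actual file is not shown on this page, so the following is a reconstruction consistent with the log, not the scraper's verbatim requirements.txt:

```text
-e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
lxml==3.4.4
cssselect==0.9.1
beautifulsoup4
```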
Collecting lxml==3.4.4
  Downloading lxml-3.4.4.tar.gz (3.5 MB)
Collecting cssselect==0.9.1
  Downloading cssselect-0.9.1.tar.gz (32 kB)
Collecting beautifulsoup4
  Downloading beautifulsoup4-4.9.1-py2-none-any.whl (111 kB)
Collecting dumptruck>=0.1.2
  Downloading dumptruck-0.1.6.tar.gz (15 kB)
Collecting requests
  Downloading requests-2.23.0-py2.py3-none-any.whl (58 kB)
Collecting soupsieve<2.0
  Downloading soupsieve-1.9.6-py2.py3-none-any.whl (33 kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Downloading urllib3-1.25.9-py2.py3-none-any.whl (126 kB)
Collecting certifi>=2017.4.17
  Downloading certifi-2020.4.5.1-py2.py3-none-any.whl (157 kB)
Collecting chardet<4,>=3.0.2
  Downloading chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting idna<3,>=2.5
  Downloading idna-2.9-py2.py3-none-any.whl (58 kB)
Collecting backports.functools-lru-cache; python_version < "3"
  Downloading backports.functools_lru_cache-1.6.1-py2.py3-none-any.whl (5.7 kB)
Building wheels for collected packages: lxml, cssselect, dumptruck
  Building wheel for lxml (setup.py): started
  Building wheel for lxml (setup.py): still running...
  Building wheel for lxml (setup.py): finished with status 'done'
  Created wheel for lxml: filename=lxml-3.4.4-cp27-cp27mu-linux_x86_64.whl size=2987152 sha256=f6589143535c17dc764838609080673b3b4c7124384e37889361361de6c21509
  Stored in directory: /tmp/pip-ephem-wheel-cache-8dBhtH/wheels/d6/de/81/11ae6edd05c75aac677e67dd154c85da758ba6f3e8e80e962e
  Building wheel for cssselect (setup.py): started
  Building wheel for cssselect (setup.py): finished with status 'done'
  Created wheel for cssselect: filename=cssselect-0.9.1-py2-none-any.whl size=26993 sha256=b63e275d1b37be16b6f546b490c256bb1bef679f604672ebb97a87116c5d36b6
  Stored in directory: /tmp/pip-ephem-wheel-cache-8dBhtH/wheels/85/fe/00/b94036d8583cec9791d8cda24c184f2d2ac1397822f7f0e8d4
  Building wheel for dumptruck (setup.py): started
  Building wheel for dumptruck (setup.py): finished with status 'done'
  Created wheel for dumptruck: filename=dumptruck-0.1.6-py2-none-any.whl size=11842 sha256=a84210b58909c7f456612b0cdabee8ea18b9267eb047d54caa4f26aff7106323
  Stored in directory: /tmp/pip-ephem-wheel-cache-8dBhtH/wheels/dc/75/e9/1e61c4080c73e7bda99614549591f83b53bcc2d682f26fce62
Successfully built lxml cssselect dumptruck
Installing collected packages: dumptruck, urllib3, certifi, chardet, idna, requests, scraperwiki, lxml, cssselect, backports.functools-lru-cache, soupsieve, beautifulsoup4
  Running setup.py develop for scraperwiki
Successfully installed backports.functools-lru-cache-1.6.1 beautifulsoup4-4.9.1 certifi-2020.4.5.1 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.9 lxml-3.4.4 requests-2.23.0 scraperwiki soupsieve-1.9.6 urllib3-1.25.9
-----> Discovering process types
  Procfile declares types -> scraper
Injecting scraper and running...
NFTRXW_STHNFT_gov_2020_01 NFTRXW_STHNFT_gov_2020_02 NFTRXW_STHNFT_gov_2020_03 NFTRXW_STHNFT_gov_2020_04
NFTRXW_STHNFT_gov_2019_01 NFTRXW_STHNFT_gov_2019_02 NFTRXW_STHNFT_gov_2019_03 NFTRXW_STHNFT_gov_2019_04 NFTRXW_STHNFT_gov_2019_05 NFTRXW_STHNFT_gov_2019_06 NFTRXW_STHNFT_gov_2019_07 NFTRXW_STHNFT_gov_2019_08 NFTRXW_STHNFT_gov_2019_09 NFTRXW_STHNFT_gov_2019_10 NFTRXW_STHNFT_gov_2019_11 NFTRXW_STHNFT_gov_2019_12
NFTRXW_STHNFT_gov_2018_01 NFTRXW_STHNFT_gov_2018_02 NFTRXW_STHNFT_gov_2018_03
*Error: Invalid filetype* https://www.sath.nhs.uk/wp-content/uploads/2018/04/SHFC-AP19c-Publication-Of-Spend-SATH-March-2018.xlsm
NFTRXW_STHNFT_gov_2018_04 NFTRXW_STHNFT_gov_2018_05 NFTRXW_STHNFT_gov_2018_06 NFTRXW_STHNFT_gov_2018_07
*Error: Invalid filetype* https://www.sath.nhs.uk/wp-content/uploads/2018/08/Publication-Of-Spend-SATH-July-18.xlsm
NFTRXW_STHNFT_gov_2018_08 NFTRXW_STHNFT_gov_2018_09 NFTRXW_STHNFT_gov_2018_10 NFTRXW_STHNFT_gov_2018_11 NFTRXW_STHNFT_gov_2018_12
NFTRXW_STHNFT_gov_2017_01 NFTRXW_STHNFT_gov_2017_02 NFTRXW_STHNFT_gov_2017_03 NFTRXW_STHNFT_gov_2017_04 NFTRXW_STHNFT_gov_2017_05 NFTRXW_STHNFT_gov_2017_06 NFTRXW_STHNFT_gov_2017_07 NFTRXW_STHNFT_gov_2017_08 NFTRXW_STHNFT_gov_2017_09 NFTRXW_STHNFT_gov_2017_10 NFTRXW_STHNFT_gov_2017_11 NFTRXW_STHNFT_gov_2017_12
NFTRXW_STHNFT_gov_2016_02 NFTRXW_STHNFT_gov_2016_03 NFTRXW_STHNFT_gov_2016_04 NFTRXW_STHNFT_gov_2016_05 NFTRXW_STHNFT_gov_2016_06 NFTRXW_STHNFT_gov_2016_07 NFTRXW_STHNFT_gov_2016_08 NFTRXW_STHNFT_gov_2016_09 NFTRXW_STHNFT_gov_2016_10 NFTRXW_STHNFT_gov_2016_11 NFTRXW_STHNFT_gov_2016_12
Traceback (most recent call last):
  File "scraper.py", line 144, in <module>
    raise Exception("%d errors occurred during scrape." % errors)
Exception: 2 errors occurred during scrape.

Data

Downloaded 782 times by SimKennedy, MikeRalphson and woodbine


SQLite database size: 50 KB

Showing 10 of 153 rows

l  d                           f
   2016-11-01 05:17:33.938879  NFTRXW_STHNFT_gov_2015_12
   2016-11-01 05:17:38.329684  NFTRXW_STHNFT_gov_2015_11
   2016-11-01 05:17:40.097421  NFTRXW_STHNFT_gov_2015_10
   2016-11-01 05:17:44.180623  NFTRXW_STHNFT_gov_2015_09
   2016-11-01 05:17:48.427168  NFTRXW_STHNFT_gov_2015_08
   2016-11-01 05:17:52.678501  NFTRXW_STHNFT_gov_2015_07
   2016-11-01 05:17:57.403106  NFTRXW_STHNFT_gov_2015_06
   2016-11-01 05:18:01.483707  NFTRXW_STHNFT_gov_2015_05
   2016-11-01 05:18:03.442706  NFTRXW_STHNFT_gov_2015_05
   2016-11-01 05:18:05.514470  NFTRXW_STHNFT_gov_2015_04
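The rows above come from the scraper's SQLite database. A minimal sketch of inspecting such a download with Python's standard sqlite3 module follows; the table name "data" (the usual scraperwiki/morph.io default) is an assumption, and an in-memory database seeded with one row from the listing stands in for the real file:

```python
import sqlite3

# Use ":memory:" here for a self-contained demo; for the real download,
# connect to the data.sqlite file instead.
conn = sqlite3.connect(":memory:")

# Columns l, d, f as shown in the table above ("data" table name assumed).
conn.execute("CREATE TABLE data (l TEXT, d TEXT, f TEXT)")
conn.execute(
    "INSERT INTO data (d, f) VALUES (?, ?)",
    ("2016-11-01 05:17:33.938879", "NFTRXW_STHNFT_gov_2015_12"),
)

# List the scraped files, most recent first.
for d, f in conn.execute("SELECT d, f FROM data ORDER BY d DESC"):
    print(d, f)
```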

Statistics

Average successful run time: 10 minutes

Total run time: 7 days

Total CPU time used: about 1 hour

Total disk space used: 110 KB

History

  • Auto ran revision ac2fdfa2 and failed.
    49 records added, 49 records removed in the database
  • Auto ran revision ac2fdfa2 and failed.
    49 records added, 49 records removed in the database
  • Auto ran revision ac2fdfa2 and failed.
    49 records added, 49 records removed in the database
  • Auto ran revision ac2fdfa2 and failed.
    49 records added, 49 records removed in the database
  • Auto ran revision ac2fdfa2 and failed.
    49 records added, 49 records removed in the database
  • ...
  • Created on morph.io


Scraper code

Python

sp_NFTRXW_STHNFT_gov / scraper.py