woodbine / sp_NFTRTP_SASHNFT_gov

Scrapes www.surreyandsussex.nhs.uk

Surrey and Sussex Healthcare NHS Trust provides services for East Surrey Hospital and other sites throughout Surrey and Sussex.


Contributors: blablupcom, woodbine

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 1))
       Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 2))
       Downloading https://files.pythonhosted.org/packages/63/c7/4f2a2a4ad6c6fa99b14be6b3c1cece9142e2d915aa7c43c908677afc8fa4/lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 3))
       Downloading https://files.pythonhosted.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 4))
       Downloading https://files.pythonhosted.org/packages/a6/29/bcbd41a916ad3faf517780a0af7d0254e8d6722ff6414723eedba4334531/beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/15/27/3330a343de80d6849545b6c7723f8c9a08b4b104de964ac366e7e6b318df/dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/49/df/50aa1999ab9bde74656c2919d9c0c085fd2b3775fd3eca826012bef76d8c/requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/27/cc/6dd9a3869f15c2edfab863b992838277279ce92663d334df9ecf5106f5c6/idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/63/cb/6965947c13a94236f6d4b8223e21beb4d576dc72e8130bd7880f600839b8/urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/7c/e6/92ad559b7192d846975fc916b65f667c7b8c3a32bea7372340bfe9a15fa5/certifi-2018.4.16-py2.py3-none-any.whl (150kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, lxml, cssselect, beautifulsoup4
       Running setup.py install for dumptruck: started
       Running setup.py install for dumptruck: finished with status 'done'
       Running setup.py develop for scraperwiki
       Running setup.py install for lxml: started
       Running setup.py install for lxml: still running...
       Running setup.py install for lxml: finished with status 'done'
       Running setup.py install for cssselect: started
       Running setup.py install for cssselect: finished with status 'done'
       Successfully installed beautifulsoup4-4.6.0 certifi-2018.4.16 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.6 lxml-3.4.4 requests-2.18.4 scraperwiki urllib3-1.22
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
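From the pip output above, the requirements file being installed appears to contain four entries roughly as follows. This is a reconstruction from the build log, not the file itself; in particular, the editable install of the scraperwiki fork is inferred from the "Obtaining ... #egg=scraperwiki" and "Running setup.py develop for scraperwiki" messages:

    -e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
    lxml==3.4.4
    cssselect==0.9.1
    beautifulsoup4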
NFTRTP_SASHNFT_gov_2018_01 NFTRTP_SASHNFT_gov_2017_12 NFTRTP_SASHNFT_gov_2017_11 NFTRTP_SASHNFT_gov_2017_10 NFTRTP_SASHNFT_gov_2017_09 NFTRTP_SASHNFT_gov_2017_08
NFTRTP_SASHNFT_gov_2017_07 NFTRTP_SASHNFT_gov_2017_06 NFTRTP_SASHNFT_gov_2017_05 NFTRTP_SASHNFT_gov_2017_04 NFTRTP_SASHNFT_gov_2017_03 NFTRTP_SASHNFT_gov_2017_02
NFTRTP_SASHNFT_gov_2017_01 NFTRTP_SASHNFT_gov_2016_12 NFTRTP_SASHNFT_gov_2016_11 NFTRTP_SASHNFT_gov_2016_10 NFTRTP_SASHNFT_gov_2016_09 NFTRTP_SASHNFT_gov_2016_08
NFTRTP_SASHNFT_gov_2016_07 NFTRTP_SASHNFT_gov_2016_06 NFTRTP_SASHNFT_gov_2016_05 NFTRTP_SASHNFT_gov_2016_04 NFTRTP_SASHNFT_gov_2016_03 NFTRTP_SASHNFT_gov_2016_02
NFTRTP_SASHNFT_gov_2016_01 NFTRTP_SASHNFT_gov_2015_12 NFTRTP_SASHNFT_gov_2015_11 NFTRTP_SASHNFT_gov_2015_10 NFTRTP_SASHNFT_gov_2015_09 NFTRTP_SASHNFT_gov_2015_08
NFTRTP_SASHNFT_gov_2015_07 NFTRTP_SASHNFT_gov_2015_06 NFTRTP_SASHNFT_gov_2015_05 NFTRTP_SASHNFT_gov_2015_04 NFTRTP_SASHNFT_gov_2015_03 NFTRTP_SASHNFT_gov_2015_02
NFTRTP_SASHNFT_gov_2015_01 NFTRTP_SASHNFT_gov_2014_12 NFTRTP_SASHNFT_gov_2014_11 NFTRTP_SASHNFT_gov_2014_10 NFTRTP_SASHNFT_gov_2014_09 NFTRTP_SASHNFT_gov_2014_08
NFTRTP_SASHNFT_gov_2014_07 NFTRTP_SASHNFT_gov_2014_06 NFTRTP_SASHNFT_gov_2014_05 NFTRTP_SASHNFT_gov_2014_04 NFTRTP_SASHNFT_gov_2014_03 NFTRTP_SASHNFT_gov_2014_02
NFTRTP_SASHNFT_gov_2014_01 NFTRTP_SASHNFT_gov_2013_12 NFTRTP_SASHNFT_gov_2013_11 NFTRTP_SASHNFT_gov_2013_10 NFTRTP_SASHNFT_gov_2013_09 NFTRTP_SASHNFT_gov_2013_08
NFTRTP_SASHNFT_gov_2013_07 NFTRTP_SASHNFT_gov_2013_06 NFTRTP_SASHNFT_gov_2013_05 NFTRTP_SASHNFT_gov_2013_04 NFTRTP_SASHNFT_gov_2013_03 NFTRTP_SASHNFT_gov_2013_02
NFTRTP_SASHNFT_gov_2013_01 NFTRTP_SASHNFT_gov_2012_12 NFTRTP_SASHNFT_gov_2012_11 NFTRTP_SASHNFT_gov_2012_09 NFTRTP_SASHNFT_gov_2012_08 NFTRTP_SASHNFT_gov_2012_07
NFTRTP_SASHNFT_gov_2012_06 NFTRTP_SASHNFT_gov_2012_05 NFTRTP_SASHNFT_gov_2012_04 NFTRTP_SASHNFT_gov_2012_03 NFTRTP_SASHNFT_gov_2012_02 NFTRTP_SASHNFT_gov_2012_01
NFTRTP_SASHNFT_gov_2011_12 NFTRTP_SASHNFT_gov_2011_11 NFTRTP_SASHNFT_gov_2011_10 NFTRTP_SASHNFT_gov_2011_09 NFTRTP_SASHNFT_gov_2011_08 NFTRTP_SASHNFT_gov_2011_07
NFTRTP_SASHNFT_gov_2011_06 NFTRTP_SASHNFT_gov_2011_05 NFTRTP_SASHNFT_gov_2011_04 NFTRTP_SASHNFT_gov_2011_03 NFTRTP_SASHNFT_gov_2011_02 NFTRTP_SASHNFT_gov_2011_01
Error validating URL.
*Error: Invalid URL* https://www.surreyandsussex.nhs.uk/wp-content/uploads/2013/05/AP-and-GL-Expenditure-January-2011.xls

Traceback (most recent call last):
  File "scraper.py", line 140, in <module>
    raise Exception("%d errors occurred during scrape." % errors)
Exception: 1 errors occurred during scrape.
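The failure above comes from the scraper's own URL check: after printing each period label it validates the corresponding spreadsheet link for the January 2011 expenditure file, counts any failures, and raises at the end of the run (scraper.py line 140). Below is a minimal sketch of that validate-and-count pattern, assuming a validate_url helper built on requests (which the build installs); the actual check in scraper.py may differ.

    # Sketch of the validate-and-count pattern visible in the traceback above.
    # validate_url is an assumed helper name; the real scraper may inspect
    # status codes, content types, or the parsed page differently.
    import requests

    def validate_url(url):
        try:
            # HEAD keeps the check cheap; the spend files can be large .xls downloads.
            resp = requests.head(url, allow_redirects=True, timeout=30)
            # An HTML response usually means an error page rather than a spreadsheet.
            content_type = resp.headers.get("Content-Type", "").lower()
            return resp.status_code == 200 and "html" not in content_type
        except requests.RequestException:
            return False

    errors = 0
    url = "https://www.surreyandsussex.nhs.uk/wp-content/uploads/2013/05/AP-and-GL-Expenditure-January-2011.xls"
    if not validate_url(url):
        print("*Error: Invalid URL* %s" % url)
        errors += 1
    if errors > 0:
        raise Exception("%d errors occurred during scrape." % errors)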

Data

Downloaded 500 times by SimKennedy, MikeRalphson, and woodbine.


The data can be downloaded as a CSV table or as an SQLite database (51 KB), or queried through the morph.io API (a query sketch follows the table below).

Showing 10 of 155 rows:

l (label)                     d (date)                     f
NFTRTP_SASHNFT_gov_2017_02    2017-03-27 16:49:11.886963
NFTRTP_SASHNFT_gov_2017_01    2017-03-27 16:49:12.477117
NFTRTP_SASHNFT_gov_2016_12    2017-03-27 16:49:12.919548
NFTRTP_SASHNFT_gov_2016_09    2017-03-27 16:49:14.261381
NFTRTP_SASHNFT_gov_2016_08    2017-03-27 16:49:14.852903
NFTRTP_SASHNFT_gov_2016_07    2017-03-27 16:49:15.445766
NFTRTP_SASHNFT_gov_2016_06    2017-03-27 16:49:15.893700
NFTRTP_SASHNFT_gov_2016_05    2017-03-27 16:49:16.336999
NFTRTP_SASHNFT_gov_2016_04    2017-03-27 16:49:16.928904
NFTRTP_SASHNFT_gov_2016_03    2017-03-27 16:49:17.524670
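As referenced above, morph.io exposes scraped data over an HTTP API that accepts SQL queries against the scraper's SQLite database. The sketch below shows one way to pull the rows above with requests; YOUR_MORPH_API_KEY is a placeholder for a per-user key issued by morph.io, and the table name data is the morph.io default rather than something confirmed on this page.

    # Sketch: query this scraper's data via the morph.io API.
    import requests

    API_URL = "https://api.morph.io/woodbine/sp_NFTRTP_SASHNFT_gov/data.json"
    params = {
        "key": "YOUR_MORPH_API_KEY",  # placeholder, not a real key
        "query": "select l, d, f from data order by d desc limit 10",
    }

    for row in requests.get(API_URL, params=params).json():
        print("%s  %s" % (row["d"], row["l"]))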

Statistics

Average successful run time: 19 minutes

Total run time: 17 days

Total CPU time used: 18 minutes

Total disk space used: 78.6 KB

History

  • Auto ran revision 024a9ee9 and failed.
    83 records added, 83 records removed in the database
    86 pages scraped
  • Auto ran revision 024a9ee9 and completed successfully.
    84 records added, 84 records removed in the database
    86 pages scraped
  • Auto ran revision 024a9ee9 and completed successfully.
    84 records added, 84 records removed in the database
  • Auto ran revision 024a9ee9 and completed successfully.
    84 records added, 84 records removed in the database
    86 pages scraped
  • Auto ran revision 024a9ee9 and completed successfully.
    84 records added, 84 records removed in the database
    86 pages scraped
  • ...
  • Created on morph.io


Scraper code

Python

sp_NFTRTP_SASHNFT_gov / scraper.py
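The scraper code itself is not reproduced on this page. From the console output (one NFTRTP_SASHNFT_gov_YYYY_MM label per monthly spend file, a URL-validation step, and a final error count), its overall shape is roughly the sketch below. The listing-page URL, the link-matching regex, and the helper names are assumptions for illustration, not the repository code.

    # Rough sketch of the scraper's apparent structure, inferred from the
    # console output above. The real scraper.py will differ in detail.
    import datetime
    import re

    import requests
    import scraperwiki
    from bs4 import BeautifulSoup

    ENTITY_ID = "NFTRTP_SASHNFT_gov"
    # Assumed location of the Trust's expenditure-publications page.
    INDEX_URL = "https://www.surreyandsussex.nhs.uk/about-us/how-we-spend-your-money/"

    MONTHS = {name: num for num, name in enumerate(
        ["January", "February", "March", "April", "May", "June", "July",
         "August", "September", "October", "November", "December"], start=1)}


    def validate_url(url):
        """Cheap reachability check, along the lines of the earlier sketch."""
        try:
            return requests.head(url, allow_redirects=True, timeout=30).status_code == 200
        except requests.RequestException:
            return False


    errors = 0
    soup = BeautifulSoup(requests.get(INDEX_URL).text, "html.parser")

    # Spreadsheet links look like .../AP-and-GL-Expenditure-January-2011.xls
    for link in soup.find_all("a", href=re.compile(r"Expenditure-[A-Za-z]+-\d{4}")):
        url = link["href"]
        month_name, year = re.search(r"Expenditure-([A-Za-z]+)-(\d{4})", url).groups()
        label = "%s_%s_%02d" % (ENTITY_ID, year, MONTHS[month_name])
        print(label)
        if not validate_url(url):
            print("*Error: Invalid URL* %s" % url)
            errors += 1
            continue
        # Save label, file URL and scrape date; the l/d/f columns match the table above.
        scraperwiki.sqlite.save(
            unique_keys=["l"],
            data={"l": label, "f": url, "d": str(datetime.datetime.now())})

    if errors > 0:
        raise Exception("%d errors occurred during scrape." % errors)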