woodbine / sp_E0101_BNESC_gov

spending

Scrapes www.bathnes.gov.uk ("Bathnes | Making Bath and North East Somerset an even better place to live, work and visit")


This is a scraper that runs on morph.io. To get started, see the documentation.

Contributors: blablupcom, woodbine

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 2 is python-2.7.14 (you are using python-2.7.9, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-2.7.14).
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-2.7.9
-----> Installing pip
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 8))
         Downloading https://files.pythonhosted.org/packages/63/c7/4f2a2a4ad6c6fa99b14be6b3c1cece9142e2d915aa7c43c908677afc8fa4/lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 9))
         Downloading https://files.pythonhosted.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 10))
         Downloading https://files.pythonhosted.org/packages/a6/29/bcbd41a916ad3faf517780a0af7d0254e8d6722ff6414723eedba4334531/beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
       Collecting python-dateutil (from -r /tmp/build/requirements.txt (line 11))
         Downloading https://files.pythonhosted.org/packages/cf/f5/af2b09c957ace60dcfac112b669c45c8c97e32f94aa8b56da4c6d1682825/python_dateutil-2.7.3-py2.py3-none-any.whl (211kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading https://files.pythonhosted.org/packages/15/27/3330a343de80d6849545b6c7723f8c9a08b4b104de964ac366e7e6b318df/dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading https://files.pythonhosted.org/packages/49/df/50aa1999ab9bde74656c2919d9c0c085fd2b3775fd3eca826012bef76d8c/requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting six>=1.5 (from python-dateutil->-r /tmp/build/requirements.txt (line 11))
         Downloading https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl
       Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading https://files.pythonhosted.org/packages/27/cc/6dd9a3869f15c2edfab863b992838277279ce92663d334df9ecf5106f5c6/idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading https://files.pythonhosted.org/packages/63/cb/6965947c13a94236f6d4b8223e21beb4d576dc72e8130bd7880f600839b8/urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading https://files.pythonhosted.org/packages/7c/e6/92ad559b7192d846975fc916b65f667c7b8c3a32bea7372340bfe9a15fa5/certifi-2018.4.16-py2.py3-none-any.whl (150kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, lxml, cssselect, beautifulsoup4, six, python-dateutil
         Running setup.py install for dumptruck: started
         Running setup.py install for dumptruck: finished with status 'done'
         Running setup.py develop for scraperwiki
         Running setup.py install for lxml: started
         Running setup.py install for lxml: still running...
         Running setup.py install for lxml: finished with status 'done'
         Running setup.py install for cssselect: started
         Running setup.py install for cssselect: finished with status 'done'
       Successfully installed beautifulsoup4-4.6.0 certifi-2018.4.16 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.6 lxml-3.4.4 python-dateutil-2.7.3 requests-2.18.4 scraperwiki six-1.11.0 urllib3-1.22
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
E0101_BNESC_gov_2018_Q1
E0101_BNESC_gov_2017_Q4
E0101_BNESC_gov_2017_Q3
E0101_BNESC_gov_2017_Q2
E0101_BNESC_gov_2017_Q1
E0101_BNESC_gov_2016_Q4
E0101_BNESC_gov_2016_Q3
E0101_BNESC_gov_2016_Q2
E0101_BNESC_gov_2016_Q1
E0101_BNESC_gov_2015_Q4
E0101_BNESC_gov_2015_Q3
E0101_BNESC_gov_2015_Q2
E0101_BNESC_gov_2015_Q1
E0101_BNESC_gov_2014_Q4
E0101_BNESC_gov_2014_Q3
E0101_BNESC_gov_2014_Q2
E0101_BNESC_gov_2014_Q1
E0101_BNESC_gov_2013_Q4
E0101_BNESC_gov_2013_Q3
E0101_BNESC_gov_2013_Q2
E0101_BNESC_gov_2013_Q1
E0101_BNESC_gov_2012_Q4
E0101_BNESC_gov_2012_Q3
E0101_BNESC_gov_2012_Q2
E0101_BNESC_gov_2012_Q1
E0101_BNESC_gov_2011_Q4
E0101_BNESC_gov_2011_Q3
E0101_BNESC_gov_2011_Q2
E0101_BNESC_gov_2011_Q1
E0101_BNESC_gov_2010_12
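For reference, the dependencies installed in this build log correspond to a requirements.txt roughly like the sketch below. Only the lines the log actually references (line 6 and lines 8-11) are reconstructed; the editable git install of scraperwiki is inferred from the "Obtaining ... / Running setup.py develop" messages, and any other lines of the real file are not shown.

    # requirements.txt (reconstructed sketch; other lines of the real file are unknown)
    -e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
    lxml==3.4.4
    cssselect==0.9.1
    beautifulsoup4
    python-dateutil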

Data

Downloaded 512 times by SimKennedy, MikeRalphson, and woodbine.


The data can be downloaded as a CSV table or as an SQLite database (43 KB), or queried via the morph.io API.
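A minimal sketch of pulling this scraper's data over the morph.io API. The API key placeholder must be replaced with your own morph.io key, and the table name "data" (the scraperwiki-python default) and the f/d/l columns are assumptions based on the preview below.

    import requests

    # Sketch: query the sp_E0101_BNESC_gov data via the morph.io API.
    API_KEY = "YOUR_MORPH_API_KEY"  # placeholder: use your own morph.io API key
    url = "https://api.morph.io/woodbine/sp_E0101_BNESC_gov/data.json"
    params = {
        "key": API_KEY,
        # assumes the scraperwiki default table name "data" and columns f, d, l
        "query": "select f, d, l from data order by d desc limit 10",
    }
    response = requests.get(url, params=params)
    response.raise_for_status()
    for row in response.json():
        print("%s  %s" % (row["f"], row["d"]))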

Showing 10 of 103 rows (columns f, d, l; values in the l column were not captured in this extract):

f                          d
E0101_BNESC_gov_2011_12    2016-01-18 23:30:13.178173
E0101_BNESC_gov_2011_11    2016-01-18 23:30:16.923879
E0101_BNESC_gov_2011_10    2016-01-18 23:30:24.641590
E0101_BNESC_gov_2011_09    2016-01-18 23:30:29.715103
E0101_BNESC_gov_2011_08    2016-01-18 23:30:35.656926
E0101_BNESC_gov_2011_07    2016-01-18 23:30:42.237832
E0101_BNESC_gov_2011_06    2016-01-18 23:30:45.946666
E0101_BNESC_gov_2011_05    2016-01-18 23:30:52.320534
E0101_BNESC_gov_2011_04    2016-01-18 23:30:56.387208
E0101_BNESC_gov_2011_03    2016-01-18 23:30:58.689778
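To inspect the downloaded SQLite database locally, something along these lines works; the file name data.sqlite and the table name "data" are the usual morph.io/scraperwiki defaults, but both are assumptions here rather than facts from this page.

    import sqlite3

    # Sketch: read the downloaded database (assumed to be saved as data.sqlite,
    # with the scraperwiki default table name "data" and the columns f, d, l).
    conn = sqlite3.connect("data.sqlite")
    for f, d, l in conn.execute("select f, d, l from data order by d desc"):
        print("%s  %s  %s" % (f, d, l))
    conn.close()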

Statistics

Average successful run time: 4 minutes

Total run time: about 1 month

Total cpu time used: 9 minutes

Total disk space used: 70 KB

History

  • Auto ran revision 77600a7b and completed successfully.
    30 records added, 30 records removed in the database
    31 pages scraped
  • Auto ran revision 77600a7b and completed successfully.
    30 records added, 30 records removed in the database
    31 pages scraped
  • Auto ran revision 77600a7b and completed successfully.
    30 records added, 30 records removed in the database
    32 pages scraped
  • Auto ran revision 77600a7b and completed successfully.
    30 records added, 30 records removed in the database
    31 pages scraped
  • Auto ran revision 77600a7b and completed successfully.
    30 records added, 30 records removed in the database
    31 pages scraped
  • ...
  • Created on morph.io


Scraper code

Python

sp_E0101_BNESC_gov / scraper.py
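The scraper source itself is not reproduced on this page. As a rough illustration of the general shape such a morph.io spending scraper takes, here is a minimal sketch built only from what the page shows (the scraperwiki/BeautifulSoup dependencies, the f/d/l columns, and the E0101_BNESC_gov_<period> labels printed during the run). The listing URL, the CSV link filter, and the label-derivation logic are assumptions, not the actual scraper.py.

    import datetime

    import requests
    import scraperwiki
    from bs4 import BeautifulSoup

    # Hypothetical sketch only -- not the actual scraper.py for this repository.
    ENTITY = "E0101_BNESC_gov"
    # Placeholder: the real listing page for the spending files is not shown here.
    LISTING_URL = "http://www.bathnes.gov.uk/"

    html = requests.get(LISTING_URL).text
    soup = BeautifulSoup(html, "lxml")

    for link in soup.find_all("a", href=True):
        url = link["href"]
        if not url.lower().endswith(".csv"):
            continue
        # Derive a period label such as E0101_BNESC_gov_2017_Q4 from the link text;
        # the real scraper's labelling rules are not visible on this page.
        label = "%s_%s" % (ENTITY, link.get_text(strip=True).replace(" ", "_"))
        print(label)
        scraperwiki.sqlite.save(
            unique_keys=["f"],
            data={
                "f": label,                         # file label
                "d": str(datetime.datetime.now()),  # date scraped
                "l": url,                           # link to the spending file
            },
        )

On morph.io the saved rows end up in an SQLite table (named "data" by default with scraperwiki-python), which is what the CSV/SQLite downloads and the API query above read from.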