blablupcom / E4505_SCMBC_gov


Sunderland City Council welcomes you

Contributors blablupcom

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Stack changed, re-installing runtime
-----> Installing runtime (python-2.7.6)
-----> Installing dependencies with pip
  Obtaining scraperwiki from git+ (from -r requirements.txt (line 1))
    Cloning (to morph_defaults) to ./.heroku/src/scraperwiki
  Collecting lxml==3.4.4 (from -r requirements.txt (line 2))
    Downloading lxml-3.4.4.tar.gz (3.5MB)
    Building lxml version 3.4.4.
    Building without Cython.
    Using build configuration of libxslt 1.1.28
    /app/.heroku/python/lib/python2.7/distutils/ UserWarning: Unknown distribution option: 'bugtrack_url'
      warnings.warn(msg)
  Collecting cssselect==0.9.1 (from -r requirements.txt (line 3))
    Downloading cssselect-0.9.1.tar.gz
  Collecting beautifulsoup4 (from -r requirements.txt (line 4))
    Downloading beautifulsoup4-4.4.0-py2-none-any.whl (81kB)
  Collecting python-dateutil (from -r requirements.txt (line 5))
    Downloading python_dateutil-2.4.2-py2.py3-none-any.whl (188kB)
  Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 1))
    Downloading dumptruck-0.1.6.tar.gz
  Collecting requests (from scraperwiki->-r requirements.txt (line 1))
    Downloading requests-2.7.0-py2.py3-none-any.whl (470kB)
  Collecting six>=1.5 (from python-dateutil->-r requirements.txt (line 5))
    Downloading six-1.9.0-py2.py3-none-any.whl
  Installing collected packages: six, requests, dumptruck, python-dateutil, beautifulsoup4, cssselect, lxml, scraperwiki
    Running install for dumptruck
    Running install for cssselect
    Running install for lxml
      Building lxml version 3.4.4.
      Building without Cython.
      Using build configuration of libxslt 1.1.28
      /app/.heroku/python/lib/python2.7/distutils/ UserWarning: Unknown distribution option: 'bugtrack_url'
        warnings.warn(msg)
      building 'lxml.etree' extension
      gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-XSofYx/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
      gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -L/app/.heroku/python/lib -lxslt -lexslt -lxml2 -lz -lm -lpython2.7 -o build/lib.linux-x86_64-2.7/lxml/
      building 'lxml.objectify' extension
      gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-XSofYx/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.objectify.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -w
      gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -L/app/.heroku/python/lib -lxslt -lexslt -lxml2 -lz -lm -lpython2.7 -o build/lib.linux-x86_64-2.7/lxml/
  Running develop for scraperwiki
    Creating /app/.heroku/python/lib/python2.7/site-packages/scraperwiki.egg-link (link to .)
    Adding scraperwiki 0.3.7 to easy-install.pth file
    Installed /app/.heroku/src/scraperwiki
  Successfully installed beautifulsoup4-4.4.0 cssselect-0.9.1 dumptruck-0.1.6 lxml-3.4.4 python-dateutil-2.4.2 requests-2.7.0 scraperwiki six-1.9.0
-----> Discovering process types
       Procfile declares types -> scraper

Injecting scraper and running...
/app/.heroku/python/lib/python2.7/site-packages/bs4/ UserWarning: No parser was explicitly specified, so I'm using the best available HTML parser for this system ("lxml"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently.
To get rid of this warning, change this: BeautifulSoup([your markup]) to this: BeautifulSoup([your markup], "lxml")
  markup_type=markup_type))
E4505_SCMBC_gov_2015_03
E4505_SCMBC_gov_2014_12
E4505_SCMBC_gov_2014_09
E4505_SCMBC_gov_2014_06
E4505_SCMBC_gov_2014_03
E4505_SCMBC_gov_2013_12
E4505_SCMBC_gov_2013_09
E4505_SCMBC_gov_2013_06
E4505_SCMBC_gov_2013_03
E4505_SCMBC_gov_2012_12
E4505_SCMBC_gov_2012_09
E4505_SCMBC_gov_2012_06
E4505_SCMBC_gov_2012_03
E4505_SCMBC_gov_2011_12
E4505_SCMBC_gov_2011_09
E4505_SCMBC_gov_2011_06
E4505_SCMBC_gov_2011_03
E4505_SCMBC_gov_2010_12
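The BeautifulSoup warning in the run output can be silenced exactly as it suggests: name the parser explicitly. A minimal sketch (the markup string is invented for illustration; this runner has lxml installed, so `"lxml"` matches the log, while `"html.parser"` ships with Python and works anywhere):

```python
from bs4 import BeautifulSoup

# Hypothetical markup standing in for a council spending page.
html = "<ul><li><a href='/q4-2015.csv'>over £500 Q4 (CSV)</a></li></ul>"

# Passing the parser name explicitly removes the UserWarning and pins
# parsing behaviour across systems and virtualenvs. On this morph.io
# runner "lxml" would be chosen anyway; "html.parser" is the stdlib-backed
# choice used here so the sketch runs without lxml installed.
soup = BeautifulSoup(html, "html.parser")
hrefs = [a["href"] for a in soup.find_all("a")]
```

Without the second argument, the same code can pick different parsers on different machines and so return differently structured trees, which is the portability problem the warning describes.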


Downloaded 0 times


SQLite database size: 10 KB

rows 10 / 18

f                                          d
over £500 Q4 Jan to Mar 2015 (CSV).CSV     2015-07-16 00:07:54.388791
over £500 Q3 Oct to Dec 2014 (CSV).CSV     2015-07-16 00:07:58.239053
over £500 Q2 Jul to Sep 2014 (CSV).CSV     2015-07-16 00:08:01.944820
over £500 Q1 Apr to Jun 2014 (CSV).CSV     2015-07-16 00:08:05.557675
over £500 Q4 Jan to Mar 2014 (CSV).CSV     2015-07-16 00:08:09.032860
over £500 Q3 Oct to Dec 2013 (CSV).CSV     2015-07-16 00:08:12.684052
over £500 Q2 Jul to Sep 2013 (CSV).CSV     2015-07-16 00:08:16.184875
over £500 Q1 Apr to Jun 2013 (CSV).CSV     2015-07-16 00:08:19.813138
over £500 Q4 Jan to Mar 2013 (CSV).CSV     2015-07-16 00:08:23.658228
over £500 Q3 Oct to Dec 2012 (CSV).CSV     2015-07-16 00:08:26.985118
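The rows above come from the scraper's SQLite database. As a hedged sketch only, assuming a `data` table whose `f` column holds the source filename and `d` the scrape timestamp (names taken from the table header; the schema itself is not shown on this page), the stdlib equivalent of storing such rows is:

```python
import sqlite3

# In-memory database for illustration; morph.io persists to a file
# (the downloadable SQLite database above).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS data (f TEXT PRIMARY KEY, d TEXT)")

# Two rows copied from the data table on this page.
rows = [
    ("over £500 Q4 Jan to Mar 2015 (CSV).CSV", "2015-07-16 00:07:54.388791"),
    ("over £500 Q3 Oct to Dec 2014 (CSV).CSV", "2015-07-16 00:07:58.239053"),
]
# INSERT OR REPLACE keyed on f makes repeated runs idempotent,
# matching how a scraper re-run updates rather than duplicates records.
conn.executemany("INSERT OR REPLACE INTO data (f, d) VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM data").fetchone()[0]
```

In the actual scraper the `scraperwiki` library installed above would typically handle this persistence; the sqlite3 version just makes the stored shape explicit.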


Average successful run time: 4 minutes

Total run time: 4 minutes

Total CPU time used: less than 5 seconds

Total disk space used: 37.9 KB


  • Manually ran revision bd4de036 and completed successfully.
    18 records added to the database
    19 pages scraped
  • Created on

Scraper code