blablupcom / E0102_BCC_gov


Bristol City Council Homepage | Bristol City Council

Contributors: blablupcom, woodbine

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Stack changed, re-installing runtime
-----> Installing runtime (python-2.7.6)
-----> Installing dependencies with pip
       Obtaining scraperwiki from git+ (from -r requirements.txt (line 1))
       Cloning (to morph_defaults) to ./.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r requirements.txt (line 2))
       Downloading lxml-3.4.4.tar.gz (3.5MB)
       Building lxml version 3.4.4.
       Building without Cython.
       Using build configuration of libxslt 1.1.28
       /app/.heroku/python/lib/python2.7/distutils/ UserWarning: Unknown distribution option: 'bugtrack_url'
       warnings.warn(msg)
       Collecting cssselect==0.9.1 (from -r requirements.txt (line 3))
       Downloading cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4 (from -r requirements.txt (line 4))
       Downloading beautifulsoup4-4.4.0-py2-none-any.whl (81kB)
       Collecting python-dateutil (from -r requirements.txt (line 5))
       Downloading python_dateutil-2.4.2-py2.py3-none-any.whl (188kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 1))
       Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r requirements.txt (line 1))
       Downloading requests-2.7.0-py2.py3-none-any.whl (470kB)
       Collecting six>=1.5 (from python-dateutil->-r requirements.txt (line 5))
       Downloading six-1.9.0-py2.py3-none-any.whl
       Installing collected packages: six, requests, dumptruck, python-dateutil, beautifulsoup4, cssselect, lxml, scraperwiki
       Running install for dumptruck
       Running install for cssselect
       Running install for lxml
       Building lxml version 3.4.4.
       Building without Cython.
       Using build configuration of libxslt 1.1.28
       /app/.heroku/python/lib/python2.7/distutils/ UserWarning: Unknown distribution option: 'bugtrack_url'
       warnings.warn(msg)
       building 'lxml.etree' extension
       gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-S3gMnA/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
       gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -L/app/.heroku/python/lib -lxslt -lexslt -lxml2 -lz -lm -lpython2.7 -o build/lib.linux-x86_64-2.7/lxml/
       building 'lxml.objectify' extension
       gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-S3gMnA/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.objectify.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -w
       gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -L/app/.heroku/python/lib -lxslt -lexslt -lxml2 -lz -lm -lpython2.7 -o build/lib.linux-x86_64-2.7/lxml/
       Running develop for scraperwiki
       Creating /app/.heroku/python/lib/python2.7/site-packages/scraperwiki.egg-link (link to .)
       Adding scraperwiki 0.3.7 to easy-install.pth file
       Installed /app/.heroku/src/scraperwiki
       Successfully installed beautifulsoup4-4.4.0 cssselect-0.9.1 dumptruck-0.1.6 lxml-3.4.4 python-dateutil-2.4.2 requests-2.7.0 scraperwiki six-1.9.0
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
/app/.heroku/python/lib/python2.7/site-packages/bs4/ UserWarning: No parser was explicitly specified, so I'm using the best available HTML parser for this system ("lxml"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently. To get rid of this warning, change this: BeautifulSoup([your markup]) to this: BeautifulSoup([your markup], "lxml")
  markup_type=markup_type))
E0102_BCC_gov_2015_06 E0102_BCC_gov_2015_05 E0102_BCC_gov_2015_04 E0102_BCC_gov_2015_03 E0102_BCC_gov_2015_02 E0102_BCC_gov_2015_01
E0102_BCC_gov_2014_12 E0102_BCC_gov_2014_11 E0102_BCC_gov_2014_10 E0102_BCC_gov_2014_09 E0102_BCC_gov_2014_08 E0102_BCC_gov_2014_07 E0102_BCC_gov_2014_06 E0102_BCC_gov_2014_05 E0102_BCC_gov_2014_04 E0102_BCC_gov_2014_03 E0102_BCC_gov_2014_02 E0102_BCC_gov_2014_01
E0102_BCC_gov_2013_12 E0102_BCC_gov_2013_11 E0102_BCC_gov_2013_10 E0102_BCC_gov_2013_09 E0102_BCC_gov_2013_08 E0102_BCC_gov_2013_07 E0102_BCC_gov_2013_06 E0102_BCC_gov_2013_05 E0102_BCC_gov_2013_04 E0102_BCC_gov_2013_03 E0102_BCC_gov_2013_02 E0102_BCC_gov_2013_01
E0102_BCC_gov_2012_12 E0102_BCC_gov_2012_11 E0102_BCC_gov_2012_10 E0102_BCC_gov_2012_09 E0102_BCC_gov_2012_08 E0102_BCC_gov_2012_07 E0102_BCC_gov_2012_06 E0102_BCC_gov_2012_05 E0102_BCC_gov_2012_04 E0102_BCC_gov_2012_03 E0102_BCC_gov_2012_02 E0102_BCC_gov_2012_01
E0102_BCC_gov_2011_12 E0102_BCC_gov_2011_11 E0102_BCC_gov_2011_10 E0102_BCC_gov_2011_09 E0102_BCC_gov_2011_08 E0102_BCC_gov_2011_07 E0102_BCC_gov_2011_06 E0102_BCC_gov_2011_05 E0102_BCC_gov_2011_04 E0102_BCC_gov_2011_03 E0102_BCC_gov_2011_02 E0102_BCC_gov_2011_01
E0102_BCC_gov_2010_12 E0102_BCC_gov_2010_11 E0102_BCC_gov_2010_10 E0102_BCC_gov_2010_09 E0102_BCC_gov_2010_08
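The bs4 warning in the log above is easy to fix by naming a parser explicitly when constructing the soup. A minimal sketch (the markup below is an invented placeholder; the warning suggests "lxml", but any explicitly named parser, such as the always-available "html.parser", silences it):

```python
from bs4 import BeautifulSoup

# Placeholder markup standing in for a downloaded council page.
html = '<a href="E0102_BCC_gov_2015_06.csv">June 2015</a>'

# Naming the parser makes behaviour identical across systems,
# whether or not lxml happens to be installed there.
soup = BeautifulSoup(html, "html.parser")
link = soup.find("a")
print(link["href"])  # -> E0102_BCC_gov_2015_06.csv
```

Leaving the parser argument out means the scraper may parse slightly differently on morph.io than on a local machine, which is exactly what the warning cautions against.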


Downloaded 3 times by MikeRalphson



Showing 10 of 59 rows

l f d
2015-07-14 23:00:42.661563
2015-07-14 23:00:45.684774
2015-07-14 23:00:48.191428
2015-07-14 23:00:50.713566
2015-07-14 23:00:54.079281
2015-07-14 23:00:56.847626
2015-07-14 23:00:59.666126
2015-07-14 23:01:02.995373
2015-07-14 23:01:05.867994
2015-07-14 23:01:11.289477
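Only the `d` column of the preview survived extraction. Assuming the single-letter columns stand for label (`l`), file URL (`f`), and scrape date (`d`) — an assumption, as the page does not define them — the table behaves like an upsert keyed on the label, so re-runs update rows rather than duplicate them. A stdlib sketch of that data model (table name, URL, and column meanings are all hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Assumed schema: l = dataset label (unique key), f = source file URL, d = scrape timestamp.
conn.execute("CREATE TABLE data (l TEXT PRIMARY KEY, f TEXT, d TEXT)")

def save(l, f, d):
    # Upsert keyed on the label: a re-run refreshes f and d instead of adding a row.
    conn.execute(
        "INSERT INTO data (l, f, d) VALUES (?, ?, ?) "
        "ON CONFLICT(l) DO UPDATE SET f = excluded.f, d = excluded.d",
        (l, f, d),
    )

save("E0102_BCC_gov_2015_06", "http://example.org/june.csv", "2015-07-14 23:00:42.661563")
save("E0102_BCC_gov_2015_06", "http://example.org/june-v2.csv", "2015-07-15 09:00:00")  # re-run
print(conn.execute("SELECT COUNT(*) FROM data").fetchone()[0])  # -> 1 (one row per label)
```

This mirrors why the database holds exactly 59 rows after a run that scraped 60 pages: one row per monthly dataset, regardless of how often the scraper runs.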


Average successful run time: 7 minutes

Total run time: 8 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 41.2 KB


  • Manually ran revision 6312cbe6 and completed successfully.
    59 records added in the database
    60 pages scraped
  • Manually ran revision 29268a2f and failed.
    nothing changed in the database
  • Created on
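The successful run above added 59 records, which matches the console output: one label per month from E0102_BCC_gov_2010_08 up to E0102_BCC_gov_2015_06, newest first. A small stdlib sketch reproduces that sequence (the function name and date range are taken from the output, not from the scraper's actual code):

```python
from datetime import date

def month_labels(start, end, prefix="E0102_BCC_gov"):
    """Yield prefix_YYYY_MM labels from `end` back to `start`, newest first."""
    y, m = end.year, end.month
    while (y, m) >= (start.year, start.month):
        yield "%s_%04d_%02d" % (prefix, y, m)
        y, m = (y, m - 1) if m > 1 else (y - 1, 12)

labels = list(month_labels(date(2010, 8, 1), date(2015, 6, 1)))
print(len(labels))              # -> 59, matching the records added
print(labels[0], labels[-1])   # -> E0102_BCC_gov_2015_06 E0102_BCC_gov_2010_08
```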

Scraper code