Contributors: blablupcom

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Stack changed, re-installing runtime
-----> Installing runtime (python-2.7.6)
-----> Installing dependencies with pip
       Obtaining scraperwiki from git+ (from -r requirements.txt (line 1))
       Cloning (to morph_defaults) to ./.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r requirements.txt (line 2))
       Downloading lxml-3.4.4.tar.gz (3.5MB)
       Building lxml version 3.4.4.
       Building without Cython.
       Using build configuration of libxslt 1.1.28
       /app/.heroku/python/lib/python2.7/distutils/ UserWarning: Unknown distribution option: 'bugtrack_url'
       warnings.warn(msg)
       Collecting cssselect==0.9.1 (from -r requirements.txt (line 3))
       Downloading cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4 (from -r requirements.txt (line 4))
       Downloading beautifulsoup4-4.4.0-py2-none-any.whl (81kB)
       Collecting python-dateutil (from -r requirements.txt (line 5))
       Downloading python_dateutil-2.4.2-py2.py3-none-any.whl (188kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 1))
       Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r requirements.txt (line 1))
       Downloading requests-2.7.0-py2.py3-none-any.whl (470kB)
       Collecting six>=1.5 (from python-dateutil->-r requirements.txt (line 5))
       Downloading six-1.9.0-py2.py3-none-any.whl
       Installing collected packages: six, requests, dumptruck, python-dateutil, beautifulsoup4, cssselect, lxml, scraperwiki
         Running install for dumptruck
         Running install for cssselect
         Running install for lxml
           Building lxml version 3.4.4.
           Building without Cython.
           Using build configuration of libxslt 1.1.28
           /app/.heroku/python/lib/python2.7/distutils/ UserWarning: Unknown distribution option: 'bugtrack_url'
           warnings.warn(msg)
           building 'lxml.etree' extension
           gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-NbAMd3/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
           gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -L/app/.heroku/python/lib -lxslt -lexslt -lxml2 -lz -lm -lpython2.7 -o build/lib.linux-x86_64-2.7/lxml/
           building 'lxml.objectify' extension
           gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-NbAMd3/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.objectify.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -w
           gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -L/app/.heroku/python/lib -lxslt -lexslt -lxml2 -lz -lm -lpython2.7 -o build/lib.linux-x86_64-2.7/lxml/
         Running develop for scraperwiki
           Creating /app/.heroku/python/lib/python2.7/site-packages/scraperwiki.egg-link (link to .)
           Adding scraperwiki 0.3.7 to easy-install.pth file
           Installed /app/.heroku/src/scraperwiki
       Successfully installed beautifulsoup4-4.4.0 cssselect-0.9.1 dumptruck-0.1.6 lxml-3.4.4 python-dateutil-2.4.2 requests-2.7.0 scraperwiki six-1.9.0
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
/app/.heroku/python/lib/python2.7/site-packages/bs4/ UserWarning: No parser was explicitly specified, so I'm using the best available HTML parser for this system ("lxml"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently. To get rid of this warning, change this: BeautifulSoup([your markup]) to this: BeautifulSoup([your markup], "lxml")
  markup_type=markup_type))
E4304_SMBC_gov_2015_02
E4304_SMBC_gov_2015_01
E4304_SMBC_gov_2014_12
E4304_SMBC_gov_2014_11
E4304_SMBC_gov_2014_10
E4304_SMBC_gov_2014_09
E4304_SMBC_gov_2014_07
E4304_SMBC_gov_2014_06
E4304_SMBC_gov_2014_05
E4304_SMBC_gov_2014_04
E4304_SMBC_gov_2014_03
E4304_SMBC_gov_2014_02
E4304_SMBC_gov_2014_01
E4304_SMBC_gov_2013_12
E4304_SMBC_gov_2013_11
E4304_SMBC_gov_2013_10
E4304_SMBC_gov_2013_09
E4304_SMBC_gov_2013_08
E4304_SMBC_gov_2013_07
E4304_SMBC_gov_2013_06
E4304_SMBC_gov_2013_05
E4304_SMBC_gov_2013_04
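
The BeautifulSoup warning in the log also states the fix: name the parser explicitly when constructing the soup. A minimal sketch of that change, assuming the scraper fetches a listing page with requests (the URL and variable names below are placeholders, not the scraper's actual source):

    # Sketch only: silence the bs4 "no parser was explicitly specified" warning.
    # The URL is a placeholder; the real scraper's target page is not shown here.
    import requests
    from bs4 import BeautifulSoup

    html = requests.get("http://example.gov.uk/spending").text

    # Before (triggers the UserWarning seen in the run log):
    # soup = BeautifulSoup(html)

    # After: explicitly ask for the lxml parser installed from requirements.txt
    soup = BeautifulSoup(html, "lxml")
    print(soup.title)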


Downloaded 1 time by MikeRalphson



rows 10 / 22

d  l  f   (table columns; the rows below show the d values)
2015-07-16 00:09:10.999385
2015-07-16 00:09:14.478940
2015-07-16 00:09:16.745625
2015-07-16 00:09:19.302703
2015-07-16 00:09:21.561923
2015-07-16 00:09:23.834434
2015-07-16 00:09:26.030990
2015-07-16 00:09:28.173671
2015-07-16 00:09:30.574972
2015-07-16 00:09:32.786070
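
Rows like the ones previewed above are typically written on morph.io with the scraperwiki library listed in requirements.txt. A hedged sketch of saving one such record, assuming the columns are d (date scraped), l (link) and f (filename); those meanings and the sample values are assumptions, not taken from the scraper's source:

    import datetime
    import scraperwiki

    # Hypothetical record; the keys follow the d/l/f headers in the preview
    # above, but their meanings (date, link, filename) are assumed.
    record = {
        'f': 'E4304_SMBC_gov_2015_02',                      # identifier / filename
        'l': 'http://example.gov.uk/spending/2015-02.csv',  # placeholder link
        'd': datetime.datetime.now().isoformat(),           # date scraped
    }

    # scraperwiki.sqlite.save writes the row into data.sqlite, which is the
    # SQLite database morph.io exposes for download and via its API.
    scraperwiki.sqlite.save(unique_keys=['f'], data=record)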


Average successful run time: 4 minutes

Total run time: 4 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 32.1 KB


  • Manually ran revision 036f1a56 and completed successfully.
    22 records added to the database
    45 pages scraped
  • Created on

Scraper code


E4304_SMBC_gov /