Contributors: blablupcom

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.6
       $ pip install -r requirements.txt
       Obtaining scraperwiki from git+ (from -r /tmp/build/requirements.txt (line 1))
       Cloning (to morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 2))
       /app/.heroku/python/lib/python2.7/site-packages/pip-9.0.1-py2.7.egg/pip/_vendor/requests/packages/urllib3/util/ SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see
       SNIMissingWarning
       /app/.heroku/python/lib/python2.7/site-packages/pip-9.0.1-py2.7.egg/pip/_vendor/requests/packages/urllib3/util/ InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see
       InsecurePlatformWarning
       Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 3))
       Downloading cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 4))
       Downloading beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
       Collecting python-dateutil (from -r /tmp/build/requirements.txt (line 5))
       Downloading python_dateutil-2.6.1-py2.py3-none-any.whl (194kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting six>=1.5 (from python-dateutil->-r /tmp/build/requirements.txt (line 5))
       Downloading six-1.11.0-py2.py3-none-any.whl
       Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading certifi-2017.11.5-py2.py3-none-any.whl (330kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading idna-2.6-py2.py3-none-any.whl (56kB)
       Installing collected packages: dumptruck, chardet, certifi, urllib3, idna, requests, scraperwiki, lxml, cssselect, beautifulsoup4, six, python-dateutil
       Running install for dumptruck: started
       Running install for dumptruck: finished with status 'done'
       Running develop for scraperwiki
       Running install for lxml: started
       Running install for lxml: still running...
       Running install for lxml: finished with status 'done'
       Running install for cssselect: started
       Running install for cssselect: finished with status 'done'
       Successfully installed beautifulsoup4-4.6.0 certifi-2017.11.5 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.6 lxml-3.4.4 python-dateutil-2.6.1 requests-2.18.4 scraperwiki six-1.11.0 urllib3-1.22

 !     Hello! It looks like your application is using an outdated version of Python.
 !     This caused the security warning you saw above during the 'pip install' step.
 !     We recommend 'python-2.7.13', which you can specify in a 'runtime.txt' file.
 !     -- Much Love, Heroku.

-----> Discovering process types
       Procfile declares types -> scraper

Injecting scraper and running...
E1701_PCC_gov_2017_09 E1701_PCC_gov_2017_08 E1701_PCC_gov_2017_07 E1701_PCC_gov_2017_06 E1701_PCC_gov_2017_05 E1701_PCC_gov_2017_04 E1701_PCC_gov_2017_03 E1701_PCC_gov_2017_02 E1701_PCC_gov_2017_01
E1701_PCC_gov_2016_12 E1701_PCC_gov_2016_11 E1701_PCC_gov_2016_10 E1701_PCC_gov_2016_09 E1701_PCC_gov_2016_08 E1701_PCC_gov_2016_07 E1701_PCC_gov_2016_06 E1701_PCC_gov_2016_05 E1701_PCC_gov_2016_04 E1701_PCC_gov_2016_03 E1701_PCC_gov_2016_02 E1701_PCC_gov_2016_01
E1701_PCC_gov_2015_12 E1701_PCC_gov_2015_11 E1701_PCC_gov_2015_10 E1701_PCC_gov_2015_09 E1701_PCC_gov_2015_08 E1701_PCC_gov_2015_07 E1701_PCC_gov_2015_06 E1701_PCC_gov_2015_05 E1701_PCC_gov_2015_04 E1701_PCC_gov_2015_03 E1701_PCC_gov_2015_02 E1701_PCC_gov_2015_01
E1701_PCC_gov_2014_12 E1701_PCC_gov_2014_11 E1701_PCC_gov_2014_10 E1701_PCC_gov_2014_09 E1701_PCC_gov_2014_08 E1701_PCC_gov_2014_07 E1701_PCC_gov_2014_06 E1701_PCC_gov_2014_05 E1701_PCC_gov_2014_04 E1701_PCC_gov_2014_03 E1701_PCC_gov_2014_02 E1701_PCC_gov_2014_01
E1701_PCC_gov_2013_12 E1701_PCC_gov_2013_11 E1701_PCC_gov_2013_10 E1701_PCC_gov_2013_09 E1701_PCC_gov_2013_08 E1701_PCC_gov_2013_07 E1701_PCC_gov_2013_06 E1701_PCC_gov_2013_05 E1701_PCC_gov_2013_04 E1701_PCC_gov_2013_03 E1701_PCC_gov_2013_02 E1701_PCC_gov_2013_01
E1701_PCC_gov_2012_12 E1701_PCC_gov_2012_11 E1701_PCC_gov_2012_10 E1701_PCC_gov_2012_09 E1701_PCC_gov_2012_08 E1701_PCC_gov_2012_07 E1701_PCC_gov_2012_06 E1701_PCC_gov_2012_05 E1701_PCC_gov_2012_04 E1701_PCC_gov_2012_03 E1701_PCC_gov_2012_02 E1701_PCC_gov_2012_01
E1701_PCC_gov_2011_12 E1701_PCC_gov_2011_11 E1701_PCC_gov_2011_10 E1701_PCC_gov_2011_09 E1701_PCC_gov_2011_08 E1701_PCC_gov_2011_07 E1701_PCC_gov_2011_06 E1701_PCC_gov_2011_05 E1701_PCC_gov_2011_04 E1701_PCC_gov_2011_03 E1701_PCC_gov_2011_02 E1701_PCC_gov_2011_01
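The buildpack note in the log above suggests pinning a newer interpreter to silence the SNI and SSL warnings. If you want to follow it, a one-line runtime.txt at the repository root is enough (python-2.7.13 is the version the log recommends; check the platform's currently supported runtimes before pinning):

```
python-2.7.13
```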


Downloaded 4 times by MikeRalphson





Average successful run time: 4 minutes

Total run time: 8 minutes

Total CPU time used: less than 10 seconds

Total disk space used: 44.4 KB


  • Manually ran revision b1ffa0f6 and completed successfully.
    Nothing changed in the database.
  • Manually ran revision 228e6585 and completed successfully.
    56 records added to the database; 54 pages scraped.
  • Created on

Scraper code
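The run output above walks through monthly period labels from E1701_PCC_gov_2017_09 back to E1701_PCC_gov_2011_01, newest first. A minimal sketch of that enumeration (the function name `month_labels` and its parameters are illustrative assumptions, not the actual scraper code):

```python
def month_labels(start, end, prefix="E1701_PCC_gov"):
    """Enumerate period labels from `start` down to `end` (inclusive),
    newest first. `start` and `end` are (year, month) tuples."""
    labels = []
    year, month = start
    while (year, month) >= end:
        labels.append("%s_%04d_%02d" % (prefix, year, month))
        month -= 1
        if month == 0:
            year, month = year - 1, 12
    return labels

labels = month_labels((2017, 9), (2011, 1))
# labels[0] is "E1701_PCC_gov_2017_09", labels[-1] is "E1701_PCC_gov_2011_01"
```

Counting the labels gives 81 periods (9 months of 2017 plus six full years), which matches the span of identifiers printed in the console output.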