Contributors: blablupcom

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.6
       $ pip install -r requirements.txt
       Obtaining scraperwiki from git+ (from -r requirements.txt (line 1))
       Cloning (to morph_defaults) to ./.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r requirements.txt (line 2))
       /app/.heroku/python/lib/python2.7/site-packages/pip-8.1.2-py2.7.egg/pip/_vendor/requests/packages/urllib3/util/ SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see SNIMissingWarning
       /app/.heroku/python/lib/python2.7/site-packages/pip-8.1.2-py2.7.egg/pip/_vendor/requests/packages/urllib3/util/ InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see InsecurePlatformWarning
       Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r requirements.txt (line 3))
       Downloading cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4 (from -r requirements.txt (line 4))
       Downloading beautifulsoup4-4.5.1-py2-none-any.whl (83kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 1))
       Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r requirements.txt (line 1))
       Downloading requests-2.12.1-py2.py3-none-any.whl (574kB)
       Installing collected packages: dumptruck, requests, scraperwiki, lxml, cssselect, beautifulsoup4
       Running install for dumptruck: started
       Running install for dumptruck: finished with status 'done'
       Running develop for scraperwiki
       Running install for lxml: started
       Running install for lxml: still running...
       Running install for lxml: finished with status 'done'
       Running install for cssselect: started
       Running install for cssselect: finished with status 'done'
       Successfully installed beautifulsoup4-4.5.1 cssselect-0.9.1 dumptruck-0.1.6 lxml-3.4.4 requests-2.12.1 scraperwiki
 !     Hello! It looks like your application is using an outdated version of Python.
 !     This caused the security warning you saw above during the 'pip install' step.
 !     We recommend 'python-2.7.12', which you can specify in a 'runtime.txt' file.
 !     -- Much Love, Heroku.
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
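The build warning above points at the fix itself: pin a newer interpreter in a `runtime.txt` file at the repository root. A minimal example of that file's entire contents, using the version Heroku recommends in the log:

```
python-2.7.12
```

Heroku's buildpack reads this file during the "Installing python-..." step, so the next run would compile against 2.7.12 and the SNI/SSL warnings from urllib3 should disappear.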
HCA085_HACA_gov_2015_03 HCA085_HACA_gov_2015_02 HCA085_HACA_gov_2015_01 HCA085_HACA_gov_2014_12 HCA085_HACA_gov_2014_11 HCA085_HACA_gov_2014_10 HCA085_HACA_gov_2014_09 HCA085_HACA_gov_2014_08 HCA085_HACA_gov_2014_07 HCA085_HACA_gov_2014_06 HCA085_HACA_gov_2014_05 HCA085_HACA_gov_2014_04 HCA085_HACA_gov_2016_03 HCA085_HACA_gov_2016_02 HCA085_HACA_gov_2016_01 HCA085_HACA_gov_2015_12 HCA085_HACA_gov_2015_11 HCA085_HACA_gov_2015_10 HCA085_HACA_gov_2015_09 HCA085_HACA_gov_2015_08 HCA085_HACA_gov_2015_07 HCA085_HACA_gov_2015_06 HCA085_HACA_gov_2015_05 HCA085_HACA_gov_2015_04 HCA085_HACA_gov_2016_04
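The identifiers printed above all follow one pattern: a fixed code prefix (`HCA085_HACA_gov`) followed by a four-digit year and a two-digit month. A small sketch of splitting one apart; the `parse_identifier` helper is illustrative only and is not part of the scraper's actual code:

```python
def parse_identifier(ident):
    """Split an identifier like 'HCA085_HACA_gov_2015_03' into its
    code prefix, year, and month (hypothetical helper, not from the scraper)."""
    # rsplit from the right so underscores inside the prefix are preserved
    prefix, year, month = ident.rsplit("_", 2)
    return prefix, int(year), int(month)

print(parse_identifier("HCA085_HACA_gov_2015_03"))  # ('HCA085_HACA_gov', 2015, 3)
```

Sorting on the parsed `(year, month)` pair would also explain the out-of-order runs of months in the output: the scraper appears to emit them in page order, not chronological order.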


Downloaded 0 times


Data preview (rows 10 / 25; columns f, l, d):

2016-11-29 11:15:34.277905
2016-11-29 11:15:37.481181
2016-11-29 11:15:40.360782
2016-11-29 11:15:43.400279
2016-11-29 11:15:46.600858
2016-11-29 11:15:52.084707
2016-11-29 11:15:55.372278
2016-11-29 11:15:58.425220
2016-11-29 11:16:03.910488
2016-11-29 11:16:12.823812


Average successful run time: 3 minutes

Total run time: 3 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 38.2 KB


  • Manually ran revision e9b946aa and completed successfully.
    25 records added to the database
    29 pages scraped
  • Created on

Scraper code
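The scraper's source is not reproduced here. On morph.io, a Python scraper conventionally saves each record into a local SQLite database (`data.sqlite`), typically via the scraperwiki library's `scraperwiki.sqlite.save`. A stdlib-only sketch of the equivalent behaviour, using `sqlite3` and an assumed three-column schema named after the preview table above (`f`, `l`, `d` are taken to be identifier, link, and date; the URL and column meanings are assumptions, not taken from the scraper):

```python
import sqlite3

def save_record(conn, f, l, d):
    # Upsert one scraped record, keyed on the identifier column `f`,
    # so re-running the scraper updates rows instead of duplicating them.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS data (f TEXT PRIMARY KEY, l TEXT, d TEXT)"
    )
    conn.execute(
        "INSERT OR REPLACE INTO data (f, l, d) VALUES (?, ?, ?)", (f, l, d)
    )
    conn.commit()

# morph.io persists data.sqlite between runs; an in-memory DB is used here
conn = sqlite3.connect(":memory:")
save_record(conn, "HCA085_HACA_gov_2015_03",
            "http://example.org/spend-2015-03.csv",  # hypothetical link
            "2016-11-29 11:15:34")
print(conn.execute("SELECT COUNT(*) FROM data").fetchone()[0])  # 1
```

The `INSERT OR REPLACE` keyed on `f` mirrors how `scraperwiki.sqlite.save(unique_keys=[...], data=...)` deduplicates, which is consistent with the run above adding exactly 25 records for 25 identifiers.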