Contributors blablupcom

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Installing requirements with pip
       Obtaining scraperwiki from git+ (from -r /tmp/build/requirements.txt (line 1))
       Cloning (to morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 2))
         Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 3))
         Downloading cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 4))
         Downloading beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
         Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
         Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
         Downloading idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
         Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
         Downloading certifi-2018.1.18-py2.py3-none-any.whl (151kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
         Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, lxml, cssselect, beautifulsoup4
         Running install for dumptruck: started
         Running install for dumptruck: finished with status 'done'
         Running develop for scraperwiki
         Running install for lxml: started
         Running install for lxml: still running...
         Running install for lxml: finished with status 'done'
         Running install for cssselect: started
         Running install for cssselect: finished with status 'done'
       Successfully installed beautifulsoup4-4.6.0 certifi-2018.1.18 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.6 lxml-3.4.4 requests-2.18.4 scraperwiki urllib3-1.22
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
FTTAHX_SHASCNFT_gov_2017_04
FTTAHX_SHASCNFT_gov_2017_05
FTTAHX_SHASCNFT_gov_2017_06
FTTAHX_SHASCNFT_gov_2017_07
FTTAHX_SHASCNFT_gov_2017_08
FTTAHX_SHASCNFT_gov_2017_09
FTTAHX_SHASCNFT_gov_2017_10
FTTAHX_SHASCNFT_gov_2017_11
FTTAHX_SHASCNFT_gov_2017_12
FTTAHX_SHASCNFT_gov_2018_01
FTTAHX_SHASCNFT_gov_2016_Y1
FTTAHX_SHASCNFT_gov_2015_Y1
FTTAHX_SHASCNFT_gov_2014_Y1
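The scraper's own source is not shown in this log, but the printed identifiers and the d/f/l table below suggest its shape: one row per publication period, keyed by an identifier like FTTAHX_SHASCNFT_gov_2017_04. A minimal standard-library sketch of that output (the real run uses scraperwiki, lxml and beautifulsoup4; the table name, the column meanings, and the period list are assumptions inferred from this log, not taken from the scraper's code):

```python
# Hypothetical sketch -- reproduces the shape of the run's output with
# only the standard library. The real scraper fetches pages with
# requests/lxml and saves via scraperwiki; none of that is shown here.
import sqlite3
from datetime import datetime

PREFIX = "FTTAHX_SHASCNFT_gov"  # identifier prefix seen in the log

# Periods printed in the run log: monthly 2017-04 .. 2018-01,
# then yearly summaries (Y1) for 2016, 2015 and 2014.
periods = ["2017_%02d" % m for m in range(4, 13)] + ["2018_01"]
periods += ["%d_Y1" % y for y in (2016, 2015, 2014)]

conn = sqlite3.connect(":memory:")
# d/f/l mirror the preview table's columns (assumed: date, file id, link)
conn.execute("CREATE TABLE data (d TEXT, f TEXT PRIMARY KEY, l TEXT)")

for period in periods:
    f = "%s_%s" % (PREFIX, period)     # e.g. FTTAHX_SHASCNFT_gov_2017_04
    l = None                           # source URL would go here
    d = datetime.now().isoformat(" ")  # scrape timestamp, as in column d
    conn.execute("INSERT OR REPLACE INTO data VALUES (?, ?, ?)", (d, f, l))
    print(f)

count = conn.execute("SELECT COUNT(*) FROM data").fetchone()[0]
print(count)  # -> 13, matching "13 records added"
```

Using the identifier as the primary key makes re-runs idempotent: a repeated period replaces its existing row instead of duplicating it, which is why the record count matches the number of distinct periods.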


Downloaded 0 times


Download table (as CSV) Download SQLite database (8 KB) Use the API

Showing 10 of 13 rows

d f l
2018-03-30 20:14:30.706993
2018-03-30 20:14:31.131079
2018-03-30 20:14:31.508795
2018-03-30 20:14:31.891527
2018-03-30 20:14:32.284575
2018-03-30 20:14:32.644600
2018-03-30 20:14:33.053661
2018-03-30 20:14:33.452750
2018-03-30 20:14:33.828820
2018-03-30 20:14:34.186573


Average successful run time: 2 minutes

Total run time: 2 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 64 KB


  • Manually ran revision 2c9a19ea and completed successfully.
    13 records added to the database
    14 pages scraped
  • Created on