Contributors: blablupcom

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Installing requirements with pip
  Obtaining scraperwiki from git+ (from -r /tmp/build/requirements.txt (line 1))
  Cloning (to revision morph_defaults) to /app/.heroku/src/scraperwiki
  Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 2))
  Downloading (3.5MB)
  Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 3))
  Downloading
  Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 4))
  Downloading (86kB)
  Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading
  Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading (88kB)
  Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading (56kB)
  Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading (132kB)
  Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading (150kB)
  Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading (133kB)
  Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, lxml, cssselect, beautifulsoup4
  Running install for dumptruck: started
  Running install for dumptruck: finished with status 'done'
  Running develop for scraperwiki
  Running install for lxml: started
  Running install for lxml: still running...
  Running install for lxml: finished with status 'done'
  Running install for cssselect: started
  Running install for cssselect: finished with status 'done'
  Successfully installed beautifulsoup4-4.6.0 certifi-2018.4.16 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.6 lxml-3.4.4 requests-2.18.4 scraperwiki urllib3-1.22
-----> Discovering process types
  Procfile declares types -> scraper
Injecting scraper and running...
NFTRYV_CCSNFT_gov_2018_Q0
NFTRYV_CCSNFT_gov_2018_Q0
NFTRYV_CCSNFT_gov_2017_11
NFTRYV_CCSNFT_gov_2017_10
NFTRYV_CCSNFT_gov_2017_09
NFTRYV_CCSNFT_gov_2017_08
NFTRYV_CCSNFT_gov_2017_07
NFTRYV_CCSNFT_gov_2018_03
NFTRYV_CCSNFT_gov_2017_Q2
NFTRYV_CCSNFT_gov_2017_03
NFTRYV_CCSNFT_gov_2017_02
NFTRYV_CCSNFT_gov_2017_01
NFTRYV_CCSNFT_gov_2016_12
NFTRYV_CCSNFT_gov_2016_11
NFTRYV_CCSNFT_gov_2016_10
NFTRYV_CCSNFT_gov_2016_09
NFTRYV_CCSNFT_gov_2016_08
NFTRYV_CCSNFT_gov_2016_07
NFTRYV_CCSNFT_gov_2016_06
NFTRYV_CCSNFT_gov_2016_05
NFTRYV_CCSNFT_gov_2016_03
NFTRYV_CCSNFT_gov_2016_01
NFTRYV_CCSNFT_gov_2016_02
NFTRYV_CCSNFT_gov_2016_04
NFTRYV_CCSNFT_gov_2015_08
NFTRYV_CCSNFT_gov_2015_12
NFTRYV_CCSNFT_gov_2015_09
NFTRYV_CCSNFT_gov_2015_10
NFTRYV_CCSNFT_gov_2015_11
NFTRYV_CCSNFT_gov_2015_05
NFTRYV_CCSNFT_gov_2015_Q0
NFTRYV_CCSNFT_gov_2015_Q1
NFTRYV_CCSNFT_gov_2015_04
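The identifiers printed by the run follow a fixed pattern: an entity/publisher code, the literal tag "gov", the year, and a month or quarter suffix. A minimal sketch of how such identifiers could be assembled; the helper name, argument names, and formatting are assumptions for illustration, not the scraper's actual code:

```python
def make_identifier(code, year, period):
    """Assemble a file identifier in the <code>_gov_<year>_<period>
    pattern seen in the run output. All names here are illustrative
    assumptions, not taken from the scraper source."""
    return "{}_gov_{}_{}".format(code, year, period)

# Monthly files carry a zero-padded month; quarterly files carry a
# Q-prefixed index (Q0, Q1, Q2 appear in the run output above).
print(make_identifier("NFTRYV_CCSNFT", 2017, "11"))  # NFTRYV_CCSNFT_gov_2017_11
print(make_identifier("NFTRYV_CCSNFT", 2018, "Q0"))  # NFTRYV_CCSNFT_gov_2018_Q0
```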


Downloaded 267 times by SimKennedy and woodbine


Downloads: table (as CSV), SQLite database (15 KB), or via the API

Data preview: 10 of 33 rows. The table has columns d, f, and l; only the d (timestamp) values survived extraction:

2018-05-02 14:03:06.147233
2018-05-02 14:03:06.963551
2018-05-02 14:03:08.146217
2018-05-02 14:03:08.917532
2018-05-02 14:03:09.761997
2018-05-02 14:03:10.621563
2018-05-02 14:03:11.443006
2018-05-02 14:03:12.458738
2018-05-02 14:03:13.423188
2018-05-02 14:03:14.428357
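The run history below reports records being both added and removed on a re-run. A plausible explanation, given the installed scraperwiki/dumptruck stack, is that rows are saved with an insert-or-replace keyed on a unique column. A minimal stdlib sketch of that behavior; the column meanings (d = date scraped, f = file identifier, l = link) and the example URL are assumptions:

```python
import sqlite3

# Emulate what a save(unique_keys=['f'], data=row) call in the
# scraperwiki/dumptruck stack plausibly does: INSERT OR REPLACE keyed
# on a unique column. Column names d/f/l mirror the data table above;
# their meanings and the URL below are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (d TEXT, f TEXT UNIQUE, l TEXT)")

def save(row):
    conn.execute(
        "INSERT OR REPLACE INTO data (d, f, l) VALUES (:d, :f, :l)", row
    )

save({"d": "2018-05-02 14:03:06", "f": "NFTRYV_CCSNFT_gov_2018_Q0",
      "l": "http://example.org/file.csv"})  # first run inserts the row
save({"d": "2018-05-02 15:00:00", "f": "NFTRYV_CCSNFT_gov_2018_Q0",
      "l": "http://example.org/file.csv"})  # re-run replaces it in place

count = conn.execute("SELECT COUNT(*) FROM data").fetchone()[0]
print(count)  # 1
```

Because each existing row is deleted and re-inserted, a re-run over the same 32 files can show up as "32 records added, 32 records removed".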


Average successful run time: 2 minutes

Total run time: 5 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 66.9 KB


  • Manually ran revision 8542bac9 and completed successfully.
    33 records added in the database
    34 pages scraped
  • Manually ran revision 223dac84 and failed.
    32 records added, 32 records removed in the database
    34 pages scraped
  • Manually ran revision 223dac84 and failed.
    32 records added in the database
    34 pages scraped
  • Created on

Scraper code