This is a scraper that runs on Morph. To get started, see the documentation.

Contributors: blablupcom

Last run completed successfully.

Console output of last run:

    Injecting configuration and compiling...
    -----> Python app detected
    -----> Installing python-2.7.9
    $ pip install -r requirements.txt
    Obtaining scraperwiki from git+ (from -r requirements.txt (line 1))
      Cloning (to morph_defaults) to ./.heroku/src/scraperwiki
    Collecting lxml==3.4.4 (from -r requirements.txt (line 2))
      Downloading lxml-3.4.4.tar.gz (3.5MB)
    Collecting cssselect==0.9.1 (from -r requirements.txt (line 3))
      Downloading cssselect-0.9.1.tar.gz
    Collecting beautifulsoup4 (from -r requirements.txt (line 4))
      Downloading beautifulsoup4-4.5.1-py2-none-any.whl (83kB)
    Collecting python-dateutil (from -r requirements.txt (line 5))
      Downloading python_dateutil-2.6.0-py2.py3-none-any.whl (194kB)
    Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 1))
      Downloading dumptruck-0.1.6.tar.gz
    Collecting requests (from scraperwiki->-r requirements.txt (line 1))
      Downloading requests-2.12.1-py2.py3-none-any.whl (574kB)
    Collecting six>=1.5 (from python-dateutil->-r requirements.txt (line 5))
      Downloading six-1.10.0-py2.py3-none-any.whl
    Installing collected packages: dumptruck, requests, scraperwiki, lxml, cssselect, beautifulsoup4, six, python-dateutil
      Running install for dumptruck: started
      Running install for dumptruck: finished with status 'done'
      Running develop for scraperwiki
      Running install for lxml: started
      Running install for lxml: still running...
      Running install for lxml: finished with status 'done'
      Running install for cssselect: started
      Running install for cssselect: finished with status 'done'
    Successfully installed beautifulsoup4-4.5.1 cssselect-0.9.1 dumptruck-0.1.6 lxml-3.4.4 python-dateutil-2.6.0 requests-2.12.1 scraperwiki six-1.10.0
    -----> Discovering process types
           Procfile declares types -> scraper
    Injecting scraper and running...
    E0104_NSC_gov_2016_Q0
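The build log installs the scraper's parsing stack (scraperwiki, lxml, cssselect, beautifulsoup4, python-dateutil). A typical scraper of this kind extracts document links from a listing page; the following is a minimal sketch of that parse step, not the actual E0104_NSC_gov code. The HTML snippet and link format are invented for illustration, and the stdlib `html.parser` stands in for the BeautifulSoup the scraper actually installs, so the sketch stays self-contained.

```python
# Sketch of the link-extraction step of a morph.io scraper.
# The real scraper uses BeautifulSoup (see requirements.txt); this
# self-contained version uses only the stdlib html.parser instead.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, text) pairs for every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None   # href of the <a> currently open, if any
        self._text = []     # text fragments inside that <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

# Hypothetical listing-page snippet; the scraper's real target URL
# does not appear in the log above.
SAMPLE = '<ul><li><a href="/spend-2016-q1.csv">2016 Q1 spend</a></li></ul>'
parser = LinkExtractor()
parser.feed(SAMPLE)
print(parser.links)  # [('/spend-2016-q1.csv', '2016 Q1 spend')]
```

Each extracted pair would then be written to the scraper's SQLite store, which is what the row counts further down this page report.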


Downloaded 3 times by MikeRalphson


The data is available as a CSV download, as an SQLite database (9 KB), or via the API.

Table preview: 10 of 17 rows (columns: d, l, f)
2016-04-12 07:49:48.133353
2016-04-12 07:49:49.708472
2016-04-12 07:49:51.262491
2016-04-12 07:49:52.813813
2016-04-12 07:49:54.415955
2016-04-12 07:49:55.988081
2016-04-12 07:49:57.516706
2016-04-12 07:49:59.074018
2016-04-12 07:50:00.638793
2016-04-12 07:50:02.173779
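The timestamps in the preview above are spaced roughly 1.5 seconds apart, consistent with a deliberate delay between page fetches. They parse with the standard library alone (the scraper also installs python-dateutil, whose `dateutil.parser.parse` would handle the same format):

```python
# Parse two consecutive timestamps from the table preview and measure
# the gap between them, using only the stdlib datetime module.
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S.%f"
rows = ["2016-04-12 07:49:48.133353", "2016-04-12 07:49:49.708472"]
t0, t1 = (datetime.strptime(r, FMT) for r in rows)
gap = (t1 - t0).total_seconds()
print(round(gap, 3))  # 1.575 — about a second and a half between rows
```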


Average successful run time: 2 minutes

Total run time: 13 minutes

Total CPU time used: less than 5 seconds

Total disk space used: 35.6 KB


  • Manually ran revision 34fa60d8 and completed successfully.
    1 record added in the database
    3 pages scraped
  • Manually ran revision fd37850e and completed successfully.
    16 records added in the database
  • Manually ran revision 956427b9 and failed.
    nothing changed in the database
  • Manually ran revision b692d8fc and failed.
    nothing changed in the database
  • Manually ran revision 047b8690 and failed.
    nothing changed in the database
  • ...
  • Created on
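The history above shows successful runs adding only new records and failed runs leaving the database untouched. Morph.io scrapers conventionally get this behaviour by saving rows against a unique key (e.g. `scraperwiki.sqlite.save(unique_keys=[...], data=row)`), so re-saving an existing row overwrites it rather than duplicating it. Whether this scraper does exactly that is an assumption; the sketch below reproduces the idea with stdlib sqlite3 and an in-memory database. The column names follow the d/l/f table above, and the row values are invented.

```python
# Upsert-style saving: an unchanged row written twice still yields one
# record, which is why re-runs report "0 records added". This mimics
# scraperwiki.sqlite.save with a unique key, using stdlib sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
# 'd' as primary key is a hypothetical choice for illustration.
conn.execute("CREATE TABLE data (d TEXT PRIMARY KEY, l TEXT, f TEXT)")

def save(row):
    # INSERT OR REPLACE keyed on 'd' overwrites rather than duplicates.
    conn.execute("INSERT OR REPLACE INTO data VALUES (:d, :l, :f)", row)

row = {"d": "doc-1", "l": "http://example.com/doc-1", "f": "2016-04-12"}
save(row)
save(row)  # a re-run saving the same row changes nothing
count = conn.execute("SELECT COUNT(*) FROM data").fetchone()[0]
print(count)  # 1
```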


Scraper code


Repository: sp_E0104_NSC_gov