Contributors: blablupcom

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.6
$ pip install -r requirements.txt
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 1))
  Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to /app/.heroku/src/scraperwiki
Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 2))
  /app/.heroku/python/lib/python2.7/site-packages/pip-9.0.1-py2.7.egg/pip/_vendor/requests/packages/urllib3/util/ssl_.py:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
    SNIMissingWarning
  /app/.heroku/python/lib/python2.7/site-packages/pip-9.0.1-py2.7.egg/pip/_vendor/requests/packages/urllib3/util/ssl_.py:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
    InsecurePlatformWarning
  Downloading lxml-3.4.4.tar.gz (3.5MB)
Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 3))
  Downloading cssselect-0.9.1.tar.gz
Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 4))
  Downloading beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
Collecting python-dateutil (from -r /tmp/build/requirements.txt (line 5))
  Downloading python_dateutil-2.6.1-py2.py3-none-any.whl (194kB)
Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading dumptruck-0.1.6.tar.gz
Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
Collecting six>=1.5 (from python-dateutil->-r /tmp/build/requirements.txt (line 5))
  Downloading six-1.11.0-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading certifi-2017.11.5-py2.py3-none-any.whl (330kB)
Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading idna-2.6-py2.py3-none-any.whl (56kB)
Installing collected packages: dumptruck, chardet, certifi, urllib3, idna, requests, scraperwiki, lxml, cssselect, beautifulsoup4, six, python-dateutil
  Running setup.py install for dumptruck: started
  Running setup.py install for dumptruck: finished with status 'done'
  Running setup.py develop for scraperwiki
  Running setup.py install for lxml: started
  Running setup.py install for lxml: still running...
  Running setup.py install for lxml: finished with status 'done'
  Running setup.py install for cssselect: started
  Running setup.py install for cssselect: finished with status 'done'
Successfully installed beautifulsoup4-4.6.0 certifi-2017.11.5 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.6 lxml-3.4.4 python-dateutil-2.6.1 requests-2.18.4 scraperwiki six-1.11.0 urllib3-1.22
 !     Hello! It looks like your application is using an outdated version of Python.
 !     This caused the security warning you saw above during the 'pip install' step.
 !     We recommend 'python-2.7.13', which you can specify in a 'runtime.txt' file.
 !     -- Much Love, Heroku.
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
E2901_NUA_gov_2017_09 E2901_NUA_gov_2017_08 E2901_NUA_gov_2017_07 E2901_NUA_gov_2017_06 E2901_NUA_gov_2017_05 E2901_NUA_gov_2017_04 E2901_NUA_gov_2017_03 E2901_NUA_gov_2017_02 E2901_NUA_gov_2017_01
E2901_NUA_gov_2016_12 E2901_NUA_gov_2016_11 E2901_NUA_gov_2016_10 E2901_NUA_gov_2016_09 E2901_NUA_gov_2016_08 E2901_NUA_gov_2016_07 E2901_NUA_gov_2016_06 E2901_NUA_gov_2016_06 E2901_NUA_gov_2016_05 E2901_NUA_gov_2016_04 E2901_NUA_gov_2016_03 E2901_NUA_gov_2016_02 E2901_NUA_gov_2016_01
E2901_NUA_gov_2015_12 E2901_NUA_gov_2015_11 E2901_NUA_gov_2015_10 E2901_NUA_gov_2015_09 E2901_NUA_gov_2015_08 E2901_NUA_gov_2015_07 E2901_NUA_gov_2015_06 E2901_NUA_gov_2015_05 E2901_NUA_gov_2015_04 E2901_NUA_gov_2015_03 E2901_NUA_gov_2015_02 E2901_NUA_gov_2015_01
E2901_NUA_gov_2014_12 E2901_NUA_gov_2014_11 E2901_NUA_gov_2014_10 E2901_NUA_gov_2014_09 E2901_NUA_gov_2014_08 E2901_NUA_gov_2014_07 E2901_NUA_gov_2014_06 E2901_NUA_gov_2014_05 E2901_NUA_gov_2014_04 E2901_NUA_gov_2014_03 E2901_NUA_gov_2014_02 E2901_NUA_gov_2014_01
E2901_NUA_gov_2013_12 E2901_NUA_gov_2013_11 E2901_NUA_gov_2013_10 E2901_NUA_gov_2013_09 E2901_NUA_gov_2013_08 E2901_NUA_gov_2013_07 E2901_NUA_gov_2013_06 E2901_NUA_gov_2013_05 E2901_NUA_gov_2013_04 E2901_NUA_gov_2013_03 E2901_NUA_gov_2013_02 E2901_NUA_gov_2013_01
E2901_NUA_gov_2012_12 E2901_NUA_gov_2012_11 E2901_NUA_gov_2012_10 E2901_NUA_gov_2012_09 E2901_NUA_gov_2012_08 E2901_NUA_gov_2012_07 E2901_NUA_gov_2012_06 E2901_NUA_gov_2012_05 E2901_NUA_gov_2012_04 E2901_NUA_gov_2012_03 E2901_NUA_gov_2012_02 E2901_NUA_gov_2012_01
E2901_NUA_gov_2011_12 E2901_NUA_gov_2011_11 E2901_NUA_gov_2011_10 E2901_NUA_gov_2011_09 E2901_NUA_gov_2011_08 E2901_NUA_gov_2011_07 E2901_NUA_gov_2011_06 E2901_NUA_gov_2011_05 E2901_NUA_gov_2011_04 E2901_NUA_gov_2011_03 E2901_NUA_gov_2011_02 E2901_NUA_gov_2011_01
E2901_NUA_gov_2010_12
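Heroku's note in the log can be addressed exactly as it suggests: commit a runtime.txt file at the repository root pinning the newer interpreter, which also silences the SNIMissingWarning and InsecurePlatformWarning emitted during the pip install step.

```
python-2.7.13
```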
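The run prints one identifier per scraped month, counting back from E2901_NUA_gov_2017_09 to E2901_NUA_gov_2010_12 (the log shows E2901_NUA_gov_2016_06 twice). A minimal sketch of how such labels can be generated; the function name and tuple interface are illustrative, not taken from scraper.py:

```python
def month_identifiers(prefix, start, end):
    """Yield '<prefix>_YYYY_MM' labels from start back to end, newest first.

    start and end are (year, month) tuples; tuple comparison keeps the
    loop condition simple.
    """
    y, m = start
    while (y, m) >= end:
        yield "%s_%04d_%02d" % (prefix, y, m)
        m -= 1
        if m == 0:
            y, m = y - 1, 12

# The range seen in the run output above:
ids = list(month_identifiers("E2901_NUA_gov", (2017, 9), (2010, 12)))
```

Unlike the log, the sketch yields each month exactly once, so it produces 82 identifiers for this range.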

Data

Downloaded 4 times by MikeRalphson and woodbine


Download table (as CSV) · Download SQLite database (46 KB) · Use the API

Showing 10 of 122 rows
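morph.io serves scraped data over an HTTP API addressed by the scraper's GitHub path plus a personal API key. A hedged sketch of building such a request URL, assuming this scraper lives at blablupcom/sp_E2901_NUA_gov; the API key is a placeholder, and no request is actually sent:

```python
try:
    from urllib import urlencode          # Python 2, as used by this scraper
except ImportError:
    from urllib.parse import urlencode    # Python 3

def morph_api_url(scraper_path, query, api_key):
    """Build a morph.io data-API URL; does not perform the request."""
    return "https://api.morph.io/%s/data.json?%s" % (
        scraper_path, urlencode({"query": query, "key": api_key}))

url = morph_api_url("blablupcom/sp_E2901_NUA_gov",
                    "select * from data limit 10",
                    "YOUR_MORPH_API_KEY")
```

Fetching that URL (with a real key) returns the query result as JSON; data.csv and the raw SQLite file are served the same way.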

Statistics

Average successful run time: 3 minutes

Total run time: 20 minutes

Total cpu time used: less than 10 seconds

Total disk space used: 79.2 KB

History

  • Manually ran revision ff526a04 and completed successfully.
    nothing changed in the database
  • Manually ran revision 86b6e8e1 and completed successfully.
    69 records added, 69 records removed in the database
    72 pages scraped
  • Manually ran revision 86b6e8e1 and completed successfully.
    69 records added, 3 records removed in the database
    72 pages scraped
  • Manually ran revision 251cc922 and failed.
    nothing changed in the database
    4 pages scraped
  • Manually ran revision 251cc922 and completed successfully.
    56 records added, 3 records removed in the database
    111 pages scraped
  • Manually ran revision 8610b65c and failed.
    3 records added, 3 records removed in the database
    48 pages scraped
  • Manually ran revision 31abcad6 and completed successfully.
    3 records added, 3 records removed in the database
    61 pages scraped
  • Manually ran revision e25a97b8 and completed successfully.
    3 records added in the database
    60 pages scraped
  • Manually ran revision 3f153f31 and failed.
    nothing changed in the database
    2 pages scraped
  • Created on morph.io

Scraper code

Python

sp_E2901_NUA_gov / scraper.py
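scraper.py itself is not reproduced on this page. The contract it must satisfy is small: morph.io runs the script and then reads a table named data out of an SQLite file called data.sqlite (the scraperwiki library from the requirements normally handles this). A stdlib-only sketch of that storage step, with illustrative column names not taken from the real scraper:

```python
import sqlite3

def save_rows(conn, rows):
    """Upsert scraped rows into the 'data' table morph.io reads from."""
    conn.execute("CREATE TABLE IF NOT EXISTS data "
                 "(identifier TEXT PRIMARY KEY, url TEXT)")
    conn.executemany(
        "INSERT OR REPLACE INTO data (identifier, url) VALUES (?, ?)",
        [(r["identifier"], r["url"]) for r in rows])
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM data").fetchone()[0]

# An in-memory database for illustration; morph.io expects the file
# data.sqlite in the repository root instead.
conn = sqlite3.connect(":memory:")
n = save_rows(conn, [{"identifier": "E2901_NUA_gov_2017_09",
                      "url": "http://example.com/sept-2017.pdf"}])
```

Using a primary key with INSERT OR REPLACE is what makes reruns idempotent, which matches the "69 records added, 69 records removed" pattern in the history above.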