Contributors: blablupcom, woodbine

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 2 is python-2.7.14 (you are using python-2.7.6, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-2.7.14).
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-2.7.6
-----> Installing pip
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 1))
       Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 2))
       /app/.heroku/python/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
       SNIMissingWarning
       /app/.heroku/python/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
       InsecurePlatformWarning
       Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 3))
       Downloading cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 4))
       Downloading beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
       Collecting python-dateutil (from -r /tmp/build/requirements.txt (line 5))
       Downloading python_dateutil-2.6.1-py2.py3-none-any.whl (194kB)
       Collecting requests[security] (from -r /tmp/build/requirements.txt (line 6))
       Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading dumptruck-0.1.6.tar.gz
       Collecting six>=1.5 (from python-dateutil->-r /tmp/build/requirements.txt (line 5))
       Downloading six-1.11.0-py2.py3-none-any.whl
       Collecting chardet<3.1.0,>=3.0.2 (from requests[security]->-r /tmp/build/requirements.txt (line 6))
       Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Collecting certifi>=2017.4.17 (from requests[security]->-r /tmp/build/requirements.txt (line 6))
       Downloading certifi-2018.1.18-py2.py3-none-any.whl (151kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests[security]->-r /tmp/build/requirements.txt (line 6))
       Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting idna<2.7,>=2.5 (from requests[security]->-r /tmp/build/requirements.txt (line 6))
       Downloading idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting cryptography>=1.3.4; extra == "security" (from requests[security]->-r /tmp/build/requirements.txt (line 6))
       Downloading cryptography-2.1.4-cp27-cp27m-manylinux1_x86_64.whl (2.2MB)
       Collecting pyOpenSSL>=0.14; extra == "security" (from requests[security]->-r /tmp/build/requirements.txt (line 6))
       Downloading pyOpenSSL-17.5.0-py2.py3-none-any.whl (53kB)
       Collecting cffi>=1.7; platform_python_implementation != "PyPy" (from cryptography>=1.3.4; extra == "security"->requests[security]->-r /tmp/build/requirements.txt (line 6))
       Downloading cffi-1.11.4-cp27-cp27m-manylinux1_x86_64.whl (407kB)
       Collecting enum34; python_version < "3" (from cryptography>=1.3.4; extra == "security"->requests[security]->-r /tmp/build/requirements.txt (line 6))
       Downloading enum34-1.1.6-py2-none-any.whl
       Collecting ipaddress; python_version < "3" (from cryptography>=1.3.4; extra == "security"->requests[security]->-r /tmp/build/requirements.txt (line 6))
       Downloading ipaddress-1.0.19.tar.gz
       Collecting asn1crypto>=0.21.0 (from cryptography>=1.3.4; extra == "security"->requests[security]->-r /tmp/build/requirements.txt (line 6))
       Downloading asn1crypto-0.24.0-py2.py3-none-any.whl (101kB)
       Collecting pycparser (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=1.3.4; extra == "security"->requests[security]->-r /tmp/build/requirements.txt (line 6))
       Downloading pycparser-2.18.tar.gz (245kB)
       Installing collected packages: dumptruck, chardet, certifi, urllib3, idna, pycparser, cffi, enum34, six, ipaddress, asn1crypto, cryptography, pyOpenSSL, requests, scraperwiki, lxml, cssselect, beautifulsoup4, python-dateutil
       Running setup.py install for dumptruck: started
       Running setup.py install for dumptruck: finished with status 'done'
       Running setup.py install for pycparser: started
       Running setup.py install for pycparser: finished with status 'done'
       Running setup.py install for ipaddress: started
       Running setup.py install for ipaddress: finished with status 'done'
       Running setup.py develop for scraperwiki
       Running setup.py install for lxml: started
       Running setup.py install for lxml: still running...
       Running setup.py install for lxml: finished with status 'done'
       Running setup.py install for cssselect: started
       Running setup.py install for cssselect: finished with status 'done'
       Successfully installed asn1crypto-0.24.0 beautifulsoup4-4.6.0 certifi-2018.1.18 cffi-1.11.4 chardet-3.0.4 cryptography-2.1.4 cssselect-0.9.1 dumptruck-0.1.6 enum34-1.1.6 idna-2.6 ipaddress-1.0.19 lxml-3.4.4 pyOpenSSL-17.5.0 pycparser-2.18 python-dateutil-2.6.1 requests-2.18.4 scraperwiki six-1.11.0 urllib3-1.22
 !     Hello! It looks like your application is using an outdated version of Python.
 !     This caused the security warning you saw above during the 'pip install' step.
 !     We recommend 'python-3.6.2', which you can specify in a 'runtime.txt' file.
 !     -- Much Love, Heroku.
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
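Judging from the "(line N)" references in the pip output above, the requirements.txt that morph.io installs for this scraper has six entries. A plausible reconstruction, inferred from the log rather than taken from the repository (the "Obtaining scraperwiki" line implies an editable install):

    -e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
    lxml==3.4.4
    cssselect==0.9.1
    beautifulsoup4
    python-dateutil
    requests[security]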
the scraper needs POST requests to get the spending files
(message repeated many times during the run)
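The repeated message indicates that the council publishes its spending files behind a form, so the scraper has to POST to the site rather than follow plain links. Below is a minimal sketch of such a download using the requests library installed above; the URL and form field are hypothetical placeholders, not values from the actual scraper:

    import requests

    # Hypothetical endpoint and form field -- the real scraper's values are not shown on this page.
    FORM_URL = 'http://www.example.gov.uk/spending/download'

    def fetch_spending_file(period):
        """POST the form that returns the spending file for a period such as '2017_12'."""
        response = requests.post(FORM_URL, data={'period': period}, timeout=60)
        response.raise_for_status()
        return response.content  # raw bytes of the downloaded file

    if __name__ == '__main__':
        payload = fetch_spending_file('2017_12')
        print('downloaded %d bytes' % len(payload))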

Data

Downloaded 166 times by SimKennedy, MikeRalphson

Showing 10 of 85 rows (columns f, l, d; only the f and d values appear in this extract)

f                          d
E4303_SHMBC_gov_2017_12    2018-02-20 02:54:14.798930
E4303_SHMBC_gov_2017_11    2018-02-20 02:54:21.287933
E4303_SHMBC_gov_2017_10    2018-02-20 02:54:27.614510
E4303_SHMBC_gov_2017_09    2018-02-20 02:54:34.731292
E4303_SHMBC_gov_2017_08    2018-02-20 02:54:40.853506
E4303_SHMBC_gov_2017_07    2018-02-20 02:54:47.012204
E4303_SHMBC_gov_2017_06    2018-02-20 02:54:53.455636
E4303_SHMBC_gov_2017_05    2018-02-20 02:54:59.744996
E4303_SHMBC_gov_2017_04    2018-02-20 02:55:06.683804
E4303_SHMBC_gov_2010_12    2018-02-20 02:55:13.294602
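Each row pairs a file label (f) with the timestamp it was recorded (d); the l column, not reproduced here, presumably holds the download link. A rough sketch of how one of these records could be written with the scraperwiki library from the requirements, with the column meanings inferred rather than confirmed by this page:

    import datetime
    import scraperwiki

    def save_record(file_label, link):
        # 'f' acts as the unique key, so re-running the scraper updates existing rows
        # instead of inserting duplicates.
        scraperwiki.sqlite.save(
            unique_keys=['f'],
            data={
                'f': file_label,                    # e.g. 'E4303_SHMBC_gov_2017_12'
                'l': link,                          # hypothetical download URL for the file
                'd': str(datetime.datetime.now()),  # timestamp in the same style as the table above
            },
        )

    save_record('E4303_SHMBC_gov_2017_12', 'http://www.example.gov.uk/spending/december-2017.csv')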

Statistics

Average successful run time: 5 minutes

Total run time: 24 days

Total CPU time used: 34 minutes

Total disk space used: 48 KB

History

  • Auto ran revision 56fa77a2 and completed successfully.
    85 records added, 85 records removed in the database
  • Auto ran revision 56fa77a2 and completed successfully.
    85 records added, 85 records removed in the database
  • Auto ran revision 56fa77a2 and failed.
    nothing changed in the database
  • Auto ran revision 56fa77a2 and completed successfully.
    85 records added, 85 records removed in the database
    95 pages scraped
  • Auto ran revision 56fa77a2 and completed successfully.
    85 records added, 85 records removed in the database
  • ...
  • Created on morph.io

Scraper code

Python

sp_E4303_SHMBC_gov / scraper.py