This is a scraper that runs on Morph. To get started, see the documentation.

Contributors: woodbine, blablupcom

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.12
$ pip install -r requirements.txt
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r requirements.txt (line 1))
  Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to ./.heroku/src/scraperwiki
Collecting requests==2.11.1 (from -r requirements.txt (line 3))
  Downloading requests-2.11.1-py2.py3-none-any.whl (514kB)
Collecting lxml==3.4.4 (from -r requirements.txt (line 4))
  Downloading lxml-3.4.4.tar.gz (3.5MB)
Collecting cssselect==0.9.1 (from -r requirements.txt (line 5))
  Downloading cssselect-0.9.1.tar.gz
Collecting pyopenssl==16.2.0 (from -r requirements.txt (line 6))
  Downloading pyOpenSSL-16.2.0-py2.py3-none-any.whl (43kB)
Collecting ndg-httpsclient (from -r requirements.txt (line 7))
  Downloading ndg_httpsclient-0.4.2.tar.gz
Collecting pyasn1 (from -r requirements.txt (line 8))
  Downloading pyasn1-0.1.9-py2.py3-none-any.whl
Collecting beautifulsoup4 (from -r requirements.txt (line 9))
  Downloading beautifulsoup4-4.5.1-py2-none-any.whl (83kB)
Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 1))
  Downloading dumptruck-0.1.6.tar.gz
Collecting cryptography>=1.3.4 (from pyopenssl==16.2.0->-r requirements.txt (line 6))
  Downloading cryptography-1.5.2.tar.gz (400kB)
Collecting six>=1.5.2 (from pyopenssl==16.2.0->-r requirements.txt (line 6))
  Downloading six-1.10.0-py2.py3-none-any.whl
Collecting idna>=2.0 (from cryptography>=1.3.4->pyopenssl==16.2.0->-r requirements.txt (line 6))
  Downloading idna-2.1-py2.py3-none-any.whl (54kB)
Collecting enum34 (from cryptography>=1.3.4->pyopenssl==16.2.0->-r requirements.txt (line 6))
  Downloading enum34-1.1.6-py2-none-any.whl
Collecting ipaddress (from cryptography>=1.3.4->pyopenssl==16.2.0->-r requirements.txt (line 6))
  Downloading ipaddress-1.0.17-py2-none-any.whl
Collecting cffi>=1.4.1 (from cryptography>=1.3.4->pyopenssl==16.2.0->-r requirements.txt (line 6))
  Downloading cffi-1.8.3-cp27-cp27m-manylinux1_x86_64.whl (388kB)
Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3.4->pyopenssl==16.2.0->-r requirements.txt (line 6))
  Downloading pycparser-2.16.tar.gz (230kB)
Installing collected packages: dumptruck, requests, scraperwiki, lxml, cssselect, idna, pyasn1, six, enum34, ipaddress, pycparser, cffi, cryptography, pyopenssl, ndg-httpsclient, beautifulsoup4
  Running setup.py install for dumptruck: started
    Running setup.py install for dumptruck: finished with status 'done'
  Running setup.py develop for scraperwiki
  Running setup.py install for lxml: started
    Running setup.py install for lxml: still running...
    Running setup.py install for lxml: finished with status 'done'
  Running setup.py install for cssselect: started
    Running setup.py install for cssselect: finished with status 'done'
  Running setup.py install for pycparser: started
    Running setup.py install for pycparser: finished with status 'done'
  Running setup.py install for cryptography: started
    Running setup.py install for cryptography: finished with status 'done'
  Running setup.py install for ndg-httpsclient: started
    Running setup.py install for ndg-httpsclient: finished with status 'done'
Successfully installed beautifulsoup4-4.5.1 cffi-1.8.3 cryptography-1.5.2 cssselect-0.9.1 dumptruck-0.1.6 enum34-1.1.6 idna-2.1 ipaddress-1.0.17 lxml-3.4.4 ndg-httpsclient-0.4.2 pyasn1-0.1.9 pycparser-2.16 pyopenssl-16.2.0 requests-2.11.1 scraperwiki six-1.10.0
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
Traceback (most recent call last):
  File "scraper.py", line 103, in <module>
    html = requests.get(urls)
  File "/app/.heroku/python/lib/python2.7/site-packages/requests/api.py", line 70, in get
    return request('get', url, params=params, **kwargs)
  File "/app/.heroku/python/lib/python2.7/site-packages/requests/api.py", line 56, in request
    return session.request(method=method, url=url, **kwargs)
  File "/app/.heroku/python/lib/python2.7/site-packages/requests/sessions.py", line 475, in request
    resp = self.send(prep, **send_kwargs)
  File "/app/.heroku/python/lib/python2.7/site-packages/requests/sessions.py", line 596, in send
    r = adapter.send(request, **kwargs)
  File "/app/.heroku/python/lib/python2.7/site-packages/requests/adapters.py", line 497, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'SSL23_GET_SERVER_HELLO', 'unknown protocol')],)",)
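The run fails on the requests.get() call at scraper.py line 103 with an SSL "bad handshake ... unknown protocol" error, which typically means the https:// URL being fetched is answered by a server speaking plain HTTP, so the TLS handshake never completes. The sketch below is one possible way to handle this, not the scraper's actual code: it retries the same path over plain HTTP when the handshake fails. The fetch() helper and the example URL are placeholders introduced here for illustration.

# Sketch only: a possible workaround for the "bad handshake ... unknown protocol"
# failure shown in the traceback above. The URL is a placeholder, not the one
# this scraper actually fetches.
import requests
from requests.exceptions import SSLError

def fetch(url):
    """Fetch a page, retrying over plain HTTP if the TLS handshake fails."""
    try:
        return requests.get(url, timeout=30)
    except SSLError:
        if url.startswith('https://'):
            # Hypothetical fallback: retry the same path over plain HTTP.
            return requests.get('http://' + url[len('https://'):], timeout=30)
        raise

if __name__ == '__main__':
    response = fetch('https://example.gov.uk/spend-data')  # placeholder URL
    print(response.status_code)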

Statistics

Total run time: 5 minutes

Total CPU time used: less than 5 seconds

Total disk space used: 37.6 KB

History

  • Manually ran revision 299f804c and failed.
    nothing changed in the database
  • Manually ran revision 299f804c and failed.
    nothing changed in the database
    1 page scraped
  • Manually ran revision 299f804c and failed.
    nothing changed in the database
    1 page scraped
  • Manually ran revision dbb912c2 and failed.
  • Created on morph.io

Scraper code

Python

sp_E5037_ELBC_gov / scraper.py
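The source of scraper.py itself is not reproduced in this page capture. For context only, a minimal morph.io Python scraper built on the packages installed above (requests, beautifulsoup4, scraperwiki) usually has the shape sketched below; the URL, selectors, and field names are placeholders and do not reflect the real sp_E5037_ELBC_gov logic.

# Illustrative skeleton only -- not the contents of sp_E5037_ELBC_gov/scraper.py.
# Shows the usual shape of a morph.io Python scraper: fetch a page, parse it,
# and save rows into morph.io's SQLite database via scraperwiki.
import requests
import scraperwiki
from bs4 import BeautifulSoup

START_URL = 'http://example.gov.uk/spending-downloads'  # placeholder URL

def main():
    html = requests.get(START_URL)
    soup = BeautifulSoup(html.text, 'lxml')
    for link in soup.find_all('a', href=True):
        # One row per document link; 'url' acts as the unique key.
        scraperwiki.sqlite.save(
            unique_keys=['url'],
            data={'url': link['href'], 'title': link.get_text(strip=True)},
        )

if __name__ == '__main__':
    main()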