Contributors: krynens

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-3.6.2
-----> Installing pip
-----> Installing requirements with pip
       Collecting alembic==1.5.5
         Downloading alembic-1.5.5.tar.gz (1.2 MB)
       Collecting beautifulsoup4==4.9.3
         Downloading beautifulsoup4-4.9.3-py3-none-any.whl (115 kB)
       Collecting bs4==0.0.1
         Downloading bs4-0.0.1.tar.gz (1.1 kB)
       Collecting certifi==2020.12.5
         Downloading certifi-2020.12.5-py2.py3-none-any.whl (147 kB)
       Collecting chardet==4.0.0
         Downloading chardet-4.0.0-py2.py3-none-any.whl (178 kB)
       Collecting idna==2.10
         Downloading idna-2.10-py2.py3-none-any.whl (58 kB)
       Collecting lxml==4.6.2
         Downloading lxml-4.6.2-cp36-cp36m-manylinux1_x86_64.whl (5.5 MB)
       Collecting Mako==1.1.4
         Downloading Mako-1.1.4.tar.gz (479 kB)
       Collecting MarkupSafe==1.1.1
         Downloading MarkupSafe-1.1.1-cp36-cp36m-manylinux2010_x86_64.whl (32 kB)
       Collecting python-dateutil==2.8.1
         Downloading python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
       Collecting python-editor==1.0.4
         Downloading python_editor-1.0.4-py3-none-any.whl (4.9 kB)
       Collecting requests==2.25.1
         Downloading requests-2.25.1-py2.py3-none-any.whl (61 kB)
       Collecting scraperwiki==0.5.1
         Downloading scraperwiki-0.5.1.tar.gz (7.7 kB)
       Collecting six==1.15.0
         Downloading six-1.15.0-py2.py3-none-any.whl (10 kB)
       Collecting soupsieve==2.2
         Downloading soupsieve-2.2-py3-none-any.whl (33 kB)
       Collecting SQLAlchemy==1.3.23
         Downloading SQLAlchemy-1.3.23-cp36-cp36m-manylinux2010_x86_64.whl (1.3 MB)
       Collecting urllib3==1.26.3
         Downloading urllib3-1.26.3-py2.py3-none-any.whl (137 kB)
       Building wheels for collected packages: alembic, bs4, Mako, scraperwiki
         Building wheel for alembic (setup.py): started
         Building wheel for alembic (setup.py): finished with status 'done'
         Created wheel for alembic: filename=alembic-1.5.5-py2.py3-none-any.whl size=156597 sha256=ccfa9823e2fc4472514c6449650b4c1ea61f9c5cf8c9960a74714f60edb4b2d1
         Stored in directory: /tmp/pip-ephem-wheel-cache-t76d4kld/wheels/bf/8b/9e/b7b1ebbb295e2116c7d63b2012dae024bb23b910277889c42f
         Building wheel for bs4 (setup.py): started
         Building wheel for bs4 (setup.py): finished with status 'done'
         Created wheel for bs4: filename=bs4-0.0.1-py3-none-any.whl size=1273 sha256=4c3a7fa13912ee3aaf38bdd5c7a51cdb3bf500f7fbc9393d31d0e8ecf97e3ddb
         Stored in directory: /tmp/pip-ephem-wheel-cache-t76d4kld/wheels/19/f5/6d/a97dd4f22376d4472d5f4c76c7646876052ff3166b3cf71050
         Building wheel for Mako (setup.py): started
         Building wheel for Mako (setup.py): finished with status 'done'
         Created wheel for Mako: filename=Mako-1.1.4-py2.py3-none-any.whl size=75675 sha256=f4ef14518c0ffec29bfe5830328002b404d754fa0ad7f07117e556b2f9e6f175
         Stored in directory: /tmp/pip-ephem-wheel-cache-t76d4kld/wheels/3c/ee/c2/9651c6b977f9d2a1bb766970d190f71213e2ca47b36d8dc488
         Building wheel for scraperwiki (setup.py): started
         Building wheel for scraperwiki (setup.py): finished with status 'done'
         Created wheel for scraperwiki: filename=scraperwiki-0.5.1-py3-none-any.whl size=6545 sha256=2429cb6a1b2d4275f81b7c96d80432d634312fa4ef3bc5f28077f1766917ed5c
         Stored in directory: /tmp/pip-ephem-wheel-cache-t76d4kld/wheels/cd/f8/ac/cd66eb1c557ab40d35c1ed852da3e9b37baa3e21b61906a5cf
       Successfully built alembic bs4 Mako scraperwiki
       Installing collected packages: six, MarkupSafe, urllib3, SQLAlchemy, soupsieve, python-editor, python-dateutil, Mako, idna, chardet, certifi, requests, beautifulsoup4, alembic, scraperwiki, lxml, bs4
       Successfully installed Mako-1.1.4 MarkupSafe-1.1.1 SQLAlchemy-1.3.23 alembic-1.5.5 beautifulsoup4-4.9.3 bs4-0.0.1 certifi-2020.12.5 chardet-4.0.0 idna-2.10 lxml-4.6.2 python-dateutil-2.8.1 python-editor-1.0.4 requests-2.25.1 scraperwiki-0.5.1 six-1.15.0 soupsieve-2.2 urllib3-1.26.3
-----> Discovering process types
       Procfile declares types -> scraper

Injecting scraper and running...
Traceback (most recent call last):
  File "/app/.heroku/python/lib/python3.6/site-packages/urllib3/connection.py", line 170, in _new_conn
    (self._dns_host, self.port), self.timeout, **extra_kw
  File "/app/.heroku/python/lib/python3.6/site-packages/urllib3/util/connection.py", line 96, in create_connection
    raise err
  File "/app/.heroku/python/lib/python3.6/site-packages/urllib3/util/connection.py", line 86, in create_connection
    sock.connect(sa)
TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/.heroku/python/lib/python3.6/site-packages/urllib3/connectionpool.py", line 706, in urlopen
    chunked=chunked,
  File "/app/.heroku/python/lib/python3.6/site-packages/urllib3/connectionpool.py", line 382, in _make_request
    self._validate_conn(conn)
  File "/app/.heroku/python/lib/python3.6/site-packages/urllib3/connectionpool.py", line 1010, in _validate_conn
    conn.connect()
  File "/app/.heroku/python/lib/python3.6/site-packages/urllib3/connection.py", line 353, in connect
    conn = self._new_conn()
  File "/app/.heroku/python/lib/python3.6/site-packages/urllib3/connection.py", line 182, in _new_conn
    self, "Failed to establish a new connection: %s" % e
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7fecd1755f28>: Failed to establish a new connection: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/.heroku/python/lib/python3.6/site-packages/requests/adapters.py", line 449, in send
    timeout=timeout
  File "/app/.heroku/python/lib/python3.6/site-packages/urllib3/connectionpool.py", line 756, in urlopen
    method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
  File "/app/.heroku/python/lib/python3.6/site-packages/urllib3/util/retry.py", line 573, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.bawbawshire.vic.gov.au', port=443): Max retries exceeded with url: /Plan-and-Build/Planning-permits/Advertised-Planning-Applications (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fecd1755f28>: Failed to establish a new connection: [Errno 110] Connection timed out',))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "scraper.py", line 12, in <module>
    r = requests.get(url)
  File "/app/.heroku/python/lib/python3.6/site-packages/requests/api.py", line 76, in get
    return request('get', url, params=params, **kwargs)
  File "/app/.heroku/python/lib/python3.6/site-packages/requests/api.py", line 61, in request
    return session.request(method=method, url=url, **kwargs)
  File "/app/.heroku/python/lib/python3.6/site-packages/requests/sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "/app/.heroku/python/lib/python3.6/site-packages/requests/sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)
  File "/app/.heroku/python/lib/python3.6/site-packages/requests/adapters.py", line 516, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='www.bawbawshire.vic.gov.au', port=443): Max retries exceeded with url: /Plan-and-Build/Planning-permits/Advertised-Planning-Applications (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fecd1755f28>: Failed to establish a new connection: [Errno 110] Connection timed out',))
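The root cause is a connection timeout to www.bawbawshire.vic.gov.au: the scraper dies on a bare `requests.get(url)` at scraper.py line 12, with no timeout of its own and no application-level retries. A minimal sketch of a more defensive fetch follows, assuming the URL shown in the traceback; the `make_session` helper and its retry count, backoff factor, and status list are illustrative choices, not part of the original scraper:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# URL taken from the traceback above.
URL = ("https://www.bawbawshire.vic.gov.au"
       "/Plan-and-Build/Planning-permits/Advertised-Planning-Applications")

def make_session(retries=3, backoff=2.0):
    """Build a requests Session that retries failed connections with
    exponential backoff instead of failing on the first timeout."""
    retry = Retry(
        total=retries,
        backoff_factor=backoff,               # sleeps ~2s, 4s, 8s between tries
        status_forcelist=[429, 500, 502, 503, 504],
    )
    adapter = HTTPAdapter(max_retries=retry)
    session = requests.Session()
    session.mount("https://", adapter)
    session.mount("http://", adapter)
    return session

def fetch(url=URL, timeout=30):
    """Fetch the page with a bounded timeout so a hung connection
    fails quickly and visibly rather than stalling the whole run."""
    r = make_session().get(url, timeout=timeout)
    r.raise_for_status()
    return r.text
```

This would not fix an outright outage or an IP block on the council's side (the likeliest causes here, given the Errno 110 timeouts), but it turns transient network failures into retried requests and makes persistent ones fail fast with a clear error.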

Statistics

Total run time: 3 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 25.2 KB

History

  • Manually ran revision 4c74f037 and failed.
  • Created on morph.io

Scraper code

baw_baw_scraper