Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 2 is python-2.7.14 (you are using python-2.7.9, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-2.7.14).
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-2.7.9
-----> Installing pip
-----> Installing requirements with pip
       DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support pip 21.0 will remove support for this functionality.
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
         Running command git clone -q http://github.com/openaustralia/scraperwiki-python.git /app/.heroku/src/scraperwiki
         Running command git checkout -b morph_defaults --track origin/morph_defaults
         Switched to a new branch 'morph_defaults'
         Branch morph_defaults set up to track remote branch morph_defaults from origin.
       Collecting lxml==3.4.4
         Downloading lxml-3.4.4.tar.gz (3.5 MB)
       Collecting cssselect==0.9.1
         Downloading cssselect-0.9.1.tar.gz (32 kB)
       Collecting Scrapy==1.0.3
         Downloading Scrapy-1.0.3-py2-none-any.whl (290 kB)
       Collecting dumptruck>=0.1.2
         Downloading dumptruck-0.1.6.tar.gz (15 kB)
       Collecting requests
         Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB)
       Collecting service-identity
         Downloading service_identity-21.1.0-py2.py3-none-any.whl (12 kB)
       Collecting Twisted>=10.0.0
         Downloading Twisted-20.3.0-cp27-cp27m-manylinux1_x86_64.whl (3.2 MB)
       Collecting queuelib
         Downloading queuelib-1.6.1-py2.py3-none-any.whl (12 kB)
       Collecting w3lib>=1.8.0
         Downloading w3lib-1.22.0-py2.py3-none-any.whl (20 kB)
       Collecting pyOpenSSL
         Downloading pyOpenSSL-21.0.0-py2.py3-none-any.whl (55 kB)
       Collecting six>=1.5.2
         Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
       Collecting idna<3,>=2.5; python_version < "3"
         Downloading idna-2.10-py2.py3-none-any.whl (58 kB)
       Collecting certifi>=2017.4.17
         Downloading certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
       Collecting chardet<5,>=3.0.2; python_version < "3"
         Downloading chardet-4.0.0-py2.py3-none-any.whl (178 kB)
       Collecting urllib3<1.27,>=1.21.1
         Downloading urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
       Collecting pyasn1
         Downloading pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
       Collecting attrs>=19.1.0
         Downloading attrs-21.4.0-py2.py3-none-any.whl (60 kB)
       Collecting pyasn1-modules
         Downloading pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
       Collecting ipaddress; python_version < "3.3"
         Downloading ipaddress-1.0.23-py2.py3-none-any.whl (18 kB)
       Collecting cryptography
         Downloading cryptography-3.3.2-cp27-cp27m-manylinux2010_x86_64.whl (2.6 MB)
       Collecting incremental>=16.10.1
         Downloading incremental-21.3.0-py2.py3-none-any.whl (15 kB)
       Collecting Automat>=0.3.0
         Downloading Automat-20.2.0-py2.py3-none-any.whl (31 kB)
       Collecting zope.interface>=4.4.2
         Downloading zope.interface-5.4.0-cp27-cp27m-manylinux2010_x86_64.whl (247 kB)
       Collecting hyperlink>=17.1.1
         Downloading hyperlink-21.0.0-py2.py3-none-any.whl (74 kB)
       Collecting constantly>=15.1
         Downloading constantly-15.1.0-py2.py3-none-any.whl (7.9 kB)
       Collecting PyHamcrest!=1.10.0,>=1.9.0
         Downloading PyHamcrest-1.10.1.tar.gz (43 kB)
       Collecting enum34; python_version < "3"
         Downloading enum34-1.1.10-py2-none-any.whl (11 kB)
       Collecting cffi>=1.12
         Downloading cffi-1.15.0-cp27-cp27m-manylinux1_x86_64.whl (393 kB)
       Collecting typing; python_version < "3.5"
         Downloading typing-3.10.0.0-py2-none-any.whl (26 kB)
       Collecting pycparser
         Downloading pycparser-2.21-py2.py3-none-any.whl (118 kB)
       Building wheels for collected packages: lxml, cssselect, dumptruck, PyHamcrest
         Building wheel for lxml (setup.py): started
         Building wheel for lxml (setup.py): still running...
         Building wheel for lxml (setup.py): finished with status 'done'
         Created wheel for lxml: filename=lxml-3.4.4-cp27-cp27m-linux_x86_64.whl size=2989844 sha256=0c2cffe0082be6e9197b597baffdf1cc466e3c34322d98e75ffa847b3b9f889d
         Stored in directory: /tmp/pip-ephem-wheel-cache-vgJvCI/wheels/d6/de/81/11ae6edd05c75aac677e67dd154c85da758ba6f3e8e80e962e
         Building wheel for cssselect (setup.py): started
         Building wheel for cssselect (setup.py): finished with status 'done'
         Created wheel for cssselect: filename=cssselect-0.9.1-py2-none-any.whl size=26992 sha256=072aee63873d24e1b820e664819f77bc0566a06732c81293be174364389f3cc9
         Stored in directory: /tmp/pip-ephem-wheel-cache-vgJvCI/wheels/85/fe/00/b94036d8583cec9791d8cda24c184f2d2ac1397822f7f0e8d4
         Building wheel for dumptruck (setup.py): started
         Building wheel for dumptruck (setup.py): finished with status 'done'
         Created wheel for dumptruck: filename=dumptruck-0.1.6-py2-none-any.whl size=11844 sha256=f058193eb322a38a146e0b3045b752be915561b434a55a6c64b1e6075ed68f4f
         Stored in directory: /tmp/pip-ephem-wheel-cache-vgJvCI/wheels/dc/75/e9/1e61c4080c73e7bda99614549591f83b53bcc2d682f26fce62
         Building wheel for PyHamcrest (setup.py): started
         Building wheel for PyHamcrest (setup.py): finished with status 'done'
         Created wheel for PyHamcrest: filename=PyHamcrest-1.10.1-py2-none-any.whl size=48898 sha256=555bf6501421242a317be1c01b1320ab0defac64b8c4e27a1fe4f12e14c828eb
         Stored in directory: /tmp/pip-ephem-wheel-cache-vgJvCI/wheels/f5/8c/e2/f0cea19d340270166bbfd4a2e9d8b8c132e26ef7e1376a0890
       Successfully built lxml cssselect dumptruck PyHamcrest
       Installing collected packages: dumptruck, idna, certifi, chardet, urllib3, requests, scraperwiki, lxml, cssselect, pyasn1, attrs, pyasn1-modules, ipaddress, six, enum34, pycparser, cffi, cryptography, service-identity, incremental, Automat, zope.interface, typing, hyperlink, constantly, PyHamcrest, Twisted, queuelib, w3lib, pyOpenSSL, Scrapy
         Running setup.py develop for scraperwiki
       Successfully installed Automat-20.2.0 PyHamcrest-1.10.1 Scrapy-1.0.3 Twisted-20.3.0 attrs-21.4.0 certifi-2021.10.8 cffi-1.15.0 chardet-4.0.0 constantly-15.1.0 cryptography-3.3.2 cssselect-0.9.1 dumptruck-0.1.6 enum34-1.1.10 hyperlink-21.0.0 idna-2.10 incremental-21.3.0 ipaddress-1.0.23 lxml-3.4.4 pyOpenSSL-21.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 queuelib-1.6.1 requests-2.27.1 scraperwiki service-identity-21.1.0 six-1.16.0 typing-3.10.0.0 urllib3-1.26.9 w3lib-1.22.0 zope.interface-5.4.0
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
/app/.heroku/python/lib/python2.7/site-packages/OpenSSL/crypto.py:14: CryptographyDeprecationWarning: Python 2 is no longer supported by the Python core team. Support for it is now deprecated in cryptography, and will be removed in the next release.
from cryptography import utils, x509
2022-05-19 04:44:30 [scrapy] INFO: Scrapy 1.0.3 started (bot: scrapybot)
2022-05-19 04:44:30 [scrapy] INFO: Optional features available: http11, ssl
2022-05-19 04:44:30 [scrapy] INFO: Overridden settings: {}
2022-05-19 04:44:30 [scrapy] INFO: Enabled extensions: SpiderState, LogStats, TelnetConsole, CoreStats, CloseSpider
Unhandled error in Deferred:
2022-05-19 04:44:30 [twisted] CRITICAL: Unhandled error in Deferred:
Traceback (most recent call last):
File "scraper.py", line 28, in <module>
process.crawl(ConsultationSpider)
File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/crawler.py", line 153, in crawl
d = crawler.crawl(*args, **kwargs)
File "/app/.heroku/python/lib/python2.7/site-packages/twisted/internet/defer.py", line 1613, in unwindGenerator
return _cancellableInlineCallbacks(gen)
File "/app/.heroku/python/lib/python2.7/site-packages/twisted/internet/defer.py", line 1529, in _cancellableInlineCallbacks
_inlineCallbacks(None, g, status)
--- <exception caught here> ---
File "/app/.heroku/python/lib/python2.7/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks
result = g.send(result)
File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/crawler.py", line 71, in crawl
self.engine = self._create_engine()
File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/crawler.py", line 83, in _create_engine
return ExecutionEngine(self, lambda _: self.stop())
File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/core/engine.py", line 64, in __init__
self.scheduler_cls = load_object(self.settings['SCHEDULER'])
File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/utils/misc.py", line 44, in load_object
mod = import_module(module)
File "/app/.heroku/python/lib/python2.7/importlib/__init__.py", line 37, in import_module
__import__(name)
File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/core/scheduler.py", line 6, in <module>
from queuelib import PriorityQueue
File "/app/.heroku/python/lib/python2.7/site-packages/queuelib/__init__.py", line 1, in <module>
from queuelib.queue import FifoDiskQueue, LifoDiskQueue
File "/app/.heroku/python/lib/python2.7/site-packages/queuelib/queue.py", line 7, in <module>
from contextlib import suppress
exceptions.ImportError: cannot import name suppress
2022-05-19 04:44:30 [twisted] CRITICAL:
Traceback (most recent call last):
File "/app/.heroku/python/lib/python2.7/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks
result = g.send(result)
File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/crawler.py", line 71, in crawl
self.engine = self._create_engine()
File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/crawler.py", line 83, in _create_engine
return ExecutionEngine(self, lambda _: self.stop())
File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/core/engine.py", line 64, in __init__
self.scheduler_cls = load_object(self.settings['SCHEDULER'])
File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/utils/misc.py", line 44, in load_object
mod = import_module(module)
File "/app/.heroku/python/lib/python2.7/importlib/__init__.py", line 37, in import_module
__import__(name)
File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/core/scheduler.py", line 6, in <module>
from queuelib import PriorityQueue
File "/app/.heroku/python/lib/python2.7/site-packages/queuelib/__init__.py", line 1, in <module>
from queuelib.queue import FifoDiskQueue, LifoDiskQueue
File "/app/.heroku/python/lib/python2.7/site-packages/queuelib/queue.py", line 7, in <module>
from contextlib import suppress
ImportError: cannot import name suppress
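The root cause of the crash above is visible in the last two frames: `contextlib.suppress` was added to the standard library in Python 3.4, but this app runs on python-2.7.9, and the unpinned `queuelib` requirement resolved to queuelib 1.6.1, a release that assumes Python 3. Pinning a Python 2 compatible release in `requirements.txt` (e.g. `queuelib==1.5.0`; worth verifying against queuelib's changelog) should avoid the failed import. For reference, a minimal sketch of what `contextlib.suppress` does, written with primitives that exist on Python 2.7 as well:

```python
from contextlib import contextmanager

@contextmanager
def suppress(*exceptions):
    # Swallow only the listed exception types, mirroring the behaviour
    # of contextlib.suppress from Python 3.4+. Any other exception
    # raised in the `with` body still propagates normally.
    try:
        yield
    except exceptions:
        pass

# Usage: the KeyError below is silently ignored.
with suppress(KeyError):
    {}['missing']
```

This is an illustration of the missing name, not a suggested patch to queuelib; on morph.io the practical fix is pinning dependency versions that still publish py2 wheels, or migrating the scraper to Python 3.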