PGuyan / AI_Entscheide

Decisions of the courts of the Canton of Appenzell Innerrhoden


Contributors: PGuyan

Last run completed successfully (the process exited cleanly, although the target page could not be downloaded; see the console output below).

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Noticed cffi. Bootstrapping libffi.
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting pytz==2017.3 (from -r /tmp/build/requirements.txt (line 14))
         Downloading pytz-2017.3-py2.py3-none-any.whl (511kB)
       Collecting asn1crypto==0.24.0 (from -r /tmp/build/requirements.txt (line 17))
         Downloading asn1crypto-0.24.0-py2.py3-none-any.whl (101kB)
       Collecting attrs==17.3.0 (from -r /tmp/build/requirements.txt (line 18))
         Downloading attrs-17.3.0-py2.py3-none-any.whl
       Collecting Automat==0.6.0 (from -r /tmp/build/requirements.txt (line 19))
         Downloading Automat-0.6.0-py2.py3-none-any.whl
       Collecting beautifulsoup4==4.4.1 (from -r /tmp/build/requirements.txt (line 20))
         Downloading beautifulsoup4-4.4.1-py2-none-any.whl (81kB)
       Collecting certifi==2017.11.5 (from -r /tmp/build/requirements.txt (line 21))
         Downloading certifi-2017.11.5-py2.py3-none-any.whl (330kB)
       Collecting cffi==1.11.2 (from -r /tmp/build/requirements.txt (line 22))
         Downloading cffi-1.11.2-cp27-cp27mu-manylinux1_x86_64.whl (405kB)
       Collecting chardet==3.0.4 (from -r /tmp/build/requirements.txt (line 23))
         Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Collecting constantly==15.1.0 (from -r /tmp/build/requirements.txt (line 24))
         Downloading constantly-15.1.0-py2.py3-none-any.whl
       Collecting cryptography==2.1.4 (from -r /tmp/build/requirements.txt (line 25))
         Downloading cryptography-2.1.4-cp27-cp27mu-manylinux1_x86_64.whl (2.2MB)
       Collecting cssselect==1.0.1 (from -r /tmp/build/requirements.txt (line 26))
         Downloading cssselect-1.0.1-py2.py3-none-any.whl
       Collecting enum34==1.1.6 (from -r /tmp/build/requirements.txt (line 27))
         Downloading enum34-1.1.6-py2-none-any.whl
       Collecting html5lib==0.999 (from -r /tmp/build/requirements.txt (line 28))
         Downloading html5lib-0.999.tar.gz (885kB)
       Collecting hyperlink==17.3.1 (from -r /tmp/build/requirements.txt (line 29))
         Downloading hyperlink-17.3.1-py2.py3-none-any.whl (73kB)
       Collecting idna==2.6 (from -r /tmp/build/requirements.txt (line 30))
         Downloading idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting incremental==17.5.0 (from -r /tmp/build/requirements.txt (line 31))
         Downloading incremental-17.5.0-py2.py3-none-any.whl
       Collecting ipaddress==1.0.18 (from -r /tmp/build/requirements.txt (line 32))
         Downloading ipaddress-1.0.18-py2-none-any.whl
       Collecting lxml==4.1.1 (from -r /tmp/build/requirements.txt (line 33))
         Downloading lxml-4.1.1-cp27-cp27mu-manylinux1_x86_64.whl (5.6MB)
       Collecting mercurial==3.7.3 (from -r /tmp/build/requirements.txt (line 34))
         Downloading mercurial-3.7.3.tar.gz (4.6MB)
       Collecting parsel==1.2.0 (from -r /tmp/build/requirements.txt (line 35))
         Downloading parsel-1.2.0-py2.py3-none-any.whl
       Collecting Pillow==3.1.2 (from -r /tmp/build/requirements.txt (line 36))
         Downloading Pillow-3.1.2.zip (10.4MB)
       Collecting pisa==3.0.32 (from -r /tmp/build/requirements.txt (line 37))
         Downloading pisa-3.0.32.tar.gz (4.5MB)
       Collecting pyasn1==0.4.2 (from -r /tmp/build/requirements.txt (line 38))
         Downloading pyasn1-0.4.2-py2.py3-none-any.whl (71kB)
       Collecting pyasn1-modules==0.2.1 (from -r /tmp/build/requirements.txt (line 39))
         Downloading pyasn1_modules-0.2.1-py2.py3-none-any.whl (60kB)
       Collecting pycparser==2.18 (from -r /tmp/build/requirements.txt (line 40))
         Downloading pycparser-2.18.tar.gz (245kB)
       Collecting pycurl==7.43.0 (from -r /tmp/build/requirements.txt (line 41))
         Downloading pycurl-7.43.0.tar.gz (182kB)
       Collecting PyDispatcher==2.0.5 (from -r /tmp/build/requirements.txt (line 42))
         Downloading PyDispatcher-2.0.5.tar.gz
       Collecting pyOpenSSL==17.5.0 (from -r /tmp/build/requirements.txt (line 43))
         Downloading pyOpenSSL-17.5.0-py2.py3-none-any.whl (53kB)
       Collecting pyPdf==1.13 (from -r /tmp/build/requirements.txt (line 44))
         Downloading pyPdf-1.13.tar.gz
       Collecting queuelib==1.4.2 (from -r /tmp/build/requirements.txt (line 46))
         Downloading queuelib-1.4.2-py2.py3-none-any.whl
       Collecting reportlab==3.3.0 (from -r /tmp/build/requirements.txt (line 47))
         Downloading reportlab-3.3.0.tar.gz (2.0MB)
       Collecting requests==2.18.4 (from -r /tmp/build/requirements.txt (line 48))
         Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting Scrapy==1.4.0 (from -r /tmp/build/requirements.txt (line 49))
         Downloading Scrapy-1.4.0-py2.py3-none-any.whl (248kB)
       Collecting scrapy-splash==0.7.2 (from -r /tmp/build/requirements.txt (line 50))
         Downloading scrapy_splash-0.7.2-py2.py3-none-any.whl
       Collecting service-identity==17.0.0 (from -r /tmp/build/requirements.txt (line 51))
         Downloading service_identity-17.0.0-py2.py3-none-any.whl
       Collecting six==1.11.0 (from -r /tmp/build/requirements.txt (line 52))
         Downloading six-1.11.0-py2.py3-none-any.whl
       Collecting Twisted==17.9.0 (from -r /tmp/build/requirements.txt (line 53))
         Downloading Twisted-17.9.0.tar.bz2 (3.0MB)
       Collecting urllib3==1.22 (from -r /tmp/build/requirements.txt (line 54))
         Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting virtualenv==15.1.0 (from -r /tmp/build/requirements.txt (line 55))
         Downloading virtualenv-15.1.0-py2.py3-none-any.whl (1.8MB)
       Collecting w3lib==1.18.0 (from -r /tmp/build/requirements.txt (line 56))
         Downloading w3lib-1.18.0-py2.py3-none-any.whl
       Collecting zope.interface==4.4.3 (from -r /tmp/build/requirements.txt (line 57))
         Downloading zope.interface-4.4.3-cp27-cp27mu-manylinux1_x86_64.whl (170kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading dumptruck-0.1.6.tar.gz
       Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, pytz, asn1crypto, attrs, six, Automat, beautifulsoup4, pycparser, cffi, constantly, enum34, ipaddress, cryptography, cssselect, html5lib, hyperlink, incremental, lxml, mercurial, w3lib, parsel, Pillow, pisa, pyasn1, pyasn1-modules, pycurl, PyDispatcher, pyOpenSSL, pyPdf, queuelib, reportlab, zope.interface, Twisted, service-identity, Scrapy, scrapy-splash, virtualenv
         Running setup.py install for dumptruck: started
         Running setup.py install for dumptruck: finished with status 'done'
         Running setup.py develop for scraperwiki
         Running setup.py install for pycparser: started
         Running setup.py install for pycparser: finished with status 'done'
         Running setup.py install for html5lib: started
         Running setup.py install for html5lib: finished with status 'done'
         Running setup.py install for mercurial: started
         Running setup.py install for mercurial: finished with status 'done'
         Running setup.py install for Pillow: started
         Running setup.py install for Pillow: finished with status 'done'
         Running setup.py install for pisa: started
         Running setup.py install for pisa: finished with status 'done'
         Running setup.py install for pycurl: started
         Running setup.py install for pycurl: finished with status 'done'
         Running setup.py install for PyDispatcher: started
         Running setup.py install for PyDispatcher: finished with status 'done'
         Running setup.py install for pyPdf: started
         Running setup.py install for pyPdf: finished with status 'done'
         Running setup.py install for reportlab: started
         Running setup.py install for reportlab: finished with status 'done'
         Running setup.py install for Twisted: started
         Running setup.py install for Twisted: finished with status 'done'
       Successfully installed Automat-0.6.0 Pillow-3.1.2 PyDispatcher-2.0.5 Scrapy-1.4.0 Twisted-17.9.0 asn1crypto-0.24.0 attrs-17.3.0 beautifulsoup4-4.4.1 certifi-2017.11.5 cffi-1.11.2 chardet-3.0.4 constantly-15.1.0 cryptography-2.1.4 cssselect-1.0.1 dumptruck-0.1.6 enum34-1.1.6 html5lib-0.999 hyperlink-17.3.1 idna-2.6 incremental-17.5.0 ipaddress-1.0.18 lxml-4.1.1 mercurial-3.7.3 parsel-1.2.0 pisa-3.0.32 pyOpenSSL-17.5.0 pyPdf-1.13 pyasn1-0.4.2 pyasn1-modules-0.2.1 pycparser-2.18 pycurl-7.43.0 pytz-2017.3 queuelib-1.4.2 reportlab-3.3.0 requests-2.18.4 scraperwiki scrapy-splash-0.7.2 service-identity-17.0.0 six-1.11.0 urllib3-1.22 virtualenv-15.1.0 w3lib-1.18.0 zope.interface-4.4.3
-----> Discovering process types
       Procfile declares types -> scraper

Injecting scraper and running...

2017-12-15 18:40:49 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: scrapybot)
2017-12-15 18:40:49 [scrapy.utils.log] INFO: Overridden settings: {}
2017-12-15 18:40:49 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.memusage.MemoryUsage', 'scrapy.extensions.logstats.LogStats', 'scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole']
2017-12-15 18:40:50 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2017-12-15 18:40:50 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2017-12-15 18:40:50 [scrapy.middleware] INFO: Enabled item pipelines: []
2017-12-15 18:40:50 [scrapy.core.engine] INFO: Spider opened
2017-12-15 18:40:50 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2017-12-15 18:40:50 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
Error during info_callback
Traceback (most recent call last):
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/protocols/tls.py", line 315, in dataReceived
    self._checkHandshakeStatus()
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/protocols/tls.py", line 235, in _checkHandshakeStatus
    self._tlsConnection.do_handshake()
  File "/app/.heroku/python/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1805, in do_handshake
    result = _lib.SSL_do_handshake(self._ssl)
  File "/app/.heroku/python/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1226, in wrapper
    callback(Connection._reverse_mapping[ssl], where, return_code)
--- <exception caught here> ---
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/internet/_sslverify.py", line 1102, in infoCallback
    return wrapped(connection, where, ret)
  File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/core/downloader/tls.py", line 67, in _identityVerifyingInfoCallback
    verifyHostname(connection, self._hostnameASCII)
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/pyopenssl.py", line 47, in verify_hostname
    cert_patterns=extract_ids(connection.get_peer_certificate()),
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/pyopenssl.py", line 75, in extract_ids
    ids.append(DNSPattern(n.getComponent().asOctets()))
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/_common.py", line 156, in __init__
    "Invalid DNS pattern {0!r}.".format(pattern)
service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.
2017-12-15 18:40:51 [twisted] CRITICAL: Error during info_callback
  [traceback identical to the one above]
2017-12-15 18:40:51 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET https://www.ai.ch/themen/staat-und-recht/veroeffentlichungen/verwaltungs-und-gerichtsentscheide> (failed 1 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.>]
Error during info_callback
  [traceback identical to the one above]
2017-12-15 18:40:53 [twisted] CRITICAL: Error during info_callback
  [traceback identical to the one above]
2017-12-15 18:40:53 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET https://www.ai.ch/themen/staat-und-recht/veroeffentlichungen/verwaltungs-und-gerichtsentscheide> (failed 2 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.>]
Error during info_callback
  [traceback identical to the one above]
2017-12-15 18:40:54 [twisted] CRITICAL: Error during info_callback
  [traceback identical to the one above]
2017-12-15 18:40:54 [scrapy.downloadermiddlewares.retry] DEBUG: Gave up retrying <GET https://www.ai.ch/themen/staat-und-recht/veroeffentlichungen/verwaltungs-und-gerichtsentscheide> (failed 3 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.>]
2017-12-15 18:40:54 [scrapy.core.scraper] ERROR: Error downloading <GET https://www.ai.ch/themen/staat-und-recht/veroeffentlichungen/verwaltungs-und-gerichtsentscheide>: [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.>]
2017-12-15 18:40:54 [scrapy.core.engine] INFO: Closing spider (finished)
2017-12-15 18:40:54 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/exception_count': 3,
 'downloader/exception_type_count/twisted.web._newclient.ResponseNeverReceived': 3,
 'downloader/request_bytes': 852,
 'downloader/request_count': 3,
 'downloader/request_method_count/GET': 3,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2017, 12, 15, 18, 40, 54, 415554),
 'log_count/CRITICAL': 3,
 'log_count/DEBUG': 4,
 'log_count/ERROR': 1,
 'log_count/INFO': 7,
 'memusage/max': 47861760,
 'memusage/startup': 47861760,
 'retry/count': 2,
 'retry/max_reached': 1,
 'retry/reason_count/twisted.web._newclient.ResponseNeverReceived': 2,
 'scheduler/dequeued': 3,
 'scheduler/dequeued/memory': 3,
 'scheduler/enqueued': 3,
 'scheduler/enqueued/memory': 3,
 'start_time': datetime.datetime(2017, 12, 15, 18, 40, 50, 71348)}
2017-12-15 18:40:54 [scrapy.core.engine] INFO: Spider closed (finished)
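What the failure means: during the TLS handshake, Scrapy 1.4.0 asks service-identity 17.0.0 to verify the host name www.ai.ch against the server's certificate, and service-identity raises CertificateError because, judging by the traceback, one of the certificate's subjectAltName dNSName entries carries the bare IP address 159.100.250.129 instead of a host name. The sketch below is a hypothetical diagnostic, not part of this scraper: it assumes www.ai.ch:443 is reachable from where it runs and reuses the cryptography package pinned in the build above to print the certificate's subjectAltName entries, which should confirm the stray IP-in-dNSName entry.

    # inspect_san.py -- hypothetical diagnostic helper, not part of the scraper.
    # Downloads the server certificate without verifying it, then prints the
    # subjectAltName entries that service-identity chokes on.
    import ssl

    from cryptography import x509
    from cryptography.hazmat.backends import default_backend

    # get_server_certificate performs no hostname verification, so it works
    # even though Scrapy's verified handshake fails.
    pem = ssl.get_server_certificate(('www.ai.ch', 443))
    cert = x509.load_pem_x509_certificate(pem.encode('ascii'), default_backend())

    # Assumes the certificate has a subjectAltName extension; a certificate
    # without one would raise x509.ExtensionNotFound here.
    san = cert.extensions.get_extension_for_class(x509.SubjectAlternativeName)
    print('dNSName entries:', san.value.get_values_for_type(x509.DNSName))
    print('iPAddress entries:', san.value.get_values_for_type(x509.IPAddress))

If a dNSName entry really does contain the IP address, the clean fixes are a corrected certificate on the server side or a Scrapy/service-identity combination that tolerates the pattern; a scraper-side workaround is sketched under "Scraper code" below.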

Statistics

Average successful run time: 1 minute

Total run time: 41 minutes

Total CPU time used: about 30 seconds

Total disk space used: 126 KB

History

  • Manually ran revision 186bb3d4 and completed successfully.
    Nothing changed in the database.
  • Manually ran revision 186bb3d4 and completed successfully.
    Nothing changed in the database.
  • Manually ran revision b2d94594 and completed successfully.
    Nothing changed in the database.
  • Manually ran revision 0c349729 and completed successfully.
    Nothing changed in the database.
  • Manually ran revision 3f9d0f25 and failed.
    Nothing changed in the database.
  • ...
  • Created on morph.io


Scraper code

AI_Entscheide
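A possible scraper-side workaround for the failed run above, shown as a sketch only: the module name contextfactory.py and both class names are invented here, nothing in this block is the scraper's published code, and it deliberately skips the host-name check, so it should be restricted to a known host like www.ai.ch. It plugs into Scrapy's DOWNLOADER_CLIENTCONTEXTFACTORY setting and swallows the CertificateError in the same callback where the run above failed, roughly what later Scrapy releases do for this class of verification error.

    # contextfactory.py -- hypothetical workaround for Scrapy 1.4.0 as pinned
    # above; NOT the scraper's actual code.
    from scrapy.core.downloader.contextfactory import ScrapyClientContextFactory
    from scrapy.core.downloader.tls import ScrapyClientTLSOptions
    from service_identity.exceptions import CertificateError


    class TolerantTLSOptions(ScrapyClientTLSOptions):
        """TLS options that survive the 'Invalid DNS pattern' certificate."""

        def _identityVerifyingInfoCallback(self, connection, where, ret):
            try:
                return ScrapyClientTLSOptions._identityVerifyingInfoCallback(
                    self, connection, where, ret)
            except CertificateError:
                # The connection stays TLS-encrypted; only the host-name match
                # against the malformed certificate is skipped.
                pass


    class TolerantContextFactory(ScrapyClientContextFactory):
        def creatorForNetloc(self, hostname, port):
            # Same wiring as the base class, but hand out the tolerant options.
            return TolerantTLSOptions(hostname.decode('ascii'), self.getContext())

It would be enabled in the spider's settings with DOWNLOADER_CLIENTCONTEXTFACTORY = 'contextfactory.TolerantContextFactory'. Upgrading Scrapy is the cleaner fix; later releases catch this verification error themselves and log a warning instead of aborting the download.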