PGuyan / AI_Entscheide

Decisions of the courts of the Canton of Appenzell Innerrhoden (Entscheide der Gerichte des Kantons Appenzell Innerrhoden)


Contributors PGuyan

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Noticed cffi. Bootstrapping libffi.
-----> Installing requirements with pip
  Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
    Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to /app/.heroku/src/scraperwiki
  Collecting pytz==2017.3 (from -r /tmp/build/requirements.txt (line 14))
    Downloading pytz-2017.3-py2.py3-none-any.whl (511kB)
  Collecting asn1crypto==0.24.0 (from -r /tmp/build/requirements.txt (line 17))
    Downloading asn1crypto-0.24.0-py2.py3-none-any.whl (101kB)
  Collecting attrs==17.3.0 (from -r /tmp/build/requirements.txt (line 18))
    Downloading attrs-17.3.0-py2.py3-none-any.whl
  Collecting Automat==0.6.0 (from -r /tmp/build/requirements.txt (line 19))
    Downloading Automat-0.6.0-py2.py3-none-any.whl
  Collecting beautifulsoup4==4.4.1 (from -r /tmp/build/requirements.txt (line 20))
    Downloading beautifulsoup4-4.4.1-py2-none-any.whl (81kB)
  Collecting certifi==2017.11.5 (from -r /tmp/build/requirements.txt (line 21))
    Downloading certifi-2017.11.5-py2.py3-none-any.whl (330kB)
  Collecting cffi==1.11.2 (from -r /tmp/build/requirements.txt (line 22))
    Downloading cffi-1.11.2-cp27-cp27mu-manylinux1_x86_64.whl (405kB)
  Collecting chardet==3.0.4 (from -r /tmp/build/requirements.txt (line 23))
    Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
  Collecting constantly==15.1.0 (from -r /tmp/build/requirements.txt (line 24))
    Downloading constantly-15.1.0-py2.py3-none-any.whl
  Collecting cryptography==2.1.4 (from -r /tmp/build/requirements.txt (line 25))
    Downloading cryptography-2.1.4-cp27-cp27mu-manylinux1_x86_64.whl (2.2MB)
  Collecting cssselect==1.0.1 (from -r /tmp/build/requirements.txt (line 26))
    Downloading cssselect-1.0.1-py2.py3-none-any.whl
  Collecting enum34==1.1.6 (from -r /tmp/build/requirements.txt (line 27))
    Downloading enum34-1.1.6-py2-none-any.whl
  Collecting html5lib==0.999 (from -r /tmp/build/requirements.txt (line 28))
    Downloading html5lib-0.999.tar.gz (885kB)
  Collecting hyperlink==17.3.1 (from -r /tmp/build/requirements.txt (line 29))
    Downloading hyperlink-17.3.1-py2.py3-none-any.whl (73kB)
  Collecting idna==2.6 (from -r /tmp/build/requirements.txt (line 30))
    Downloading idna-2.6-py2.py3-none-any.whl (56kB)
  Collecting incremental==17.5.0 (from -r /tmp/build/requirements.txt (line 31))
    Downloading incremental-17.5.0-py2.py3-none-any.whl
  Collecting ipaddress==1.0.18 (from -r /tmp/build/requirements.txt (line 32))
    Downloading ipaddress-1.0.18-py2-none-any.whl
  Collecting lxml==4.1.1 (from -r /tmp/build/requirements.txt (line 33))
    Downloading lxml-4.1.1-cp27-cp27mu-manylinux1_x86_64.whl (5.6MB)
  Collecting mercurial==3.7.3 (from -r /tmp/build/requirements.txt (line 34))
    Downloading mercurial-3.7.3.tar.gz (4.6MB)
  Collecting parsel==1.2.0 (from -r /tmp/build/requirements.txt (line 35))
    Downloading parsel-1.2.0-py2.py3-none-any.whl
  Collecting Pillow==3.1.2 (from -r /tmp/build/requirements.txt (line 36))
    Downloading Pillow-3.1.2.zip (10.4MB)
  Collecting pisa==3.0.32 (from -r /tmp/build/requirements.txt (line 37))
    Downloading pisa-3.0.32.tar.gz (4.5MB)
  Collecting pyasn1==0.4.2 (from -r /tmp/build/requirements.txt (line 38))
    Downloading pyasn1-0.4.2-py2.py3-none-any.whl (71kB)
  Collecting pyasn1-modules==0.2.1 (from -r /tmp/build/requirements.txt (line 39))
    Downloading pyasn1_modules-0.2.1-py2.py3-none-any.whl (60kB)
  Collecting pycparser==2.18 (from -r /tmp/build/requirements.txt (line 40))
    Downloading pycparser-2.18.tar.gz (245kB)
  Collecting pycurl==7.43.0 (from -r /tmp/build/requirements.txt (line 41))
    Downloading pycurl-7.43.0.tar.gz (182kB)
  Collecting PyDispatcher==2.0.5 (from -r /tmp/build/requirements.txt (line 42))
    Downloading PyDispatcher-2.0.5.tar.gz
  Collecting pyOpenSSL==17.5.0 (from -r /tmp/build/requirements.txt (line 43))
    Downloading pyOpenSSL-17.5.0-py2.py3-none-any.whl (53kB)
  Collecting pyPdf==1.13 (from -r /tmp/build/requirements.txt (line 44))
    Downloading pyPdf-1.13.tar.gz
  Collecting queuelib==1.4.2 (from -r /tmp/build/requirements.txt (line 46))
    Downloading queuelib-1.4.2-py2.py3-none-any.whl
  Collecting reportlab==3.3.0 (from -r /tmp/build/requirements.txt (line 47))
    Downloading reportlab-3.3.0.tar.gz (2.0MB)
  Collecting requests==2.18.4 (from -r /tmp/build/requirements.txt (line 48))
    Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
  Collecting Scrapy==1.4.0 (from -r /tmp/build/requirements.txt (line 49))
    Downloading Scrapy-1.4.0-py2.py3-none-any.whl (248kB)
  Collecting scrapy-splash==0.7.2 (from -r /tmp/build/requirements.txt (line 50))
    Downloading scrapy_splash-0.7.2-py2.py3-none-any.whl
  Collecting service-identity==17.0.0 (from -r /tmp/build/requirements.txt (line 51))
    Downloading service_identity-17.0.0-py2.py3-none-any.whl
  Collecting six==1.11.0 (from -r /tmp/build/requirements.txt (line 52))
    Downloading six-1.11.0-py2.py3-none-any.whl
  Collecting Twisted==17.9.0 (from -r /tmp/build/requirements.txt (line 53))
    Downloading Twisted-17.9.0.tar.bz2 (3.0MB)
  Collecting urllib3==1.22 (from -r /tmp/build/requirements.txt (line 54))
    Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
  Collecting virtualenv==15.1.0 (from -r /tmp/build/requirements.txt (line 55))
    Downloading virtualenv-15.1.0-py2.py3-none-any.whl (1.8MB)
  Collecting w3lib==1.18.0 (from -r /tmp/build/requirements.txt (line 56))
    Downloading w3lib-1.18.0-py2.py3-none-any.whl
  Collecting zope.interface==4.4.3 (from -r /tmp/build/requirements.txt (line 57))
    Downloading zope.interface-4.4.3-cp27-cp27mu-manylinux1_x86_64.whl (170kB)
  Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
    Downloading dumptruck-0.1.6.tar.gz
  Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, pytz, asn1crypto, attrs, six, Automat, beautifulsoup4, pycparser, cffi, constantly, enum34, ipaddress, cryptography, cssselect, html5lib, hyperlink, incremental, lxml, mercurial, w3lib, parsel, Pillow, pisa, pyasn1, pyasn1-modules, pycurl, PyDispatcher, pyOpenSSL, pyPdf, queuelib, reportlab, zope.interface, Twisted, service-identity, Scrapy, scrapy-splash, virtualenv
    Running setup.py install for dumptruck: started
    Running setup.py install for dumptruck: finished with status 'done'
    Running setup.py develop for scraperwiki
    Running setup.py install for pycparser: started
    Running setup.py install for pycparser: finished with status 'done'
    Running setup.py install for html5lib: started
    Running setup.py install for html5lib: finished with status 'done'
    Running setup.py install for mercurial: started
    Running setup.py install for mercurial: finished with status 'done'
    Running setup.py install for Pillow: started
    Running setup.py install for Pillow: finished with status 'done'
    Running setup.py install for pisa: started
    Running setup.py install for pisa: finished with status 'done'
    Running setup.py install for pycurl: started
    Running setup.py install for pycurl: finished with status 'done'
    Running setup.py install for PyDispatcher: started
    Running setup.py install for PyDispatcher: finished with status 'done'
    Running setup.py install for pyPdf: started
    Running setup.py install for pyPdf: finished with status 'done'
    Running setup.py install for reportlab: started
    Running setup.py install for reportlab: finished with status 'done'
    Running setup.py install for Twisted: started
    Running setup.py install for Twisted: finished with status 'done'
  Successfully installed Automat-0.6.0 Pillow-3.1.2 PyDispatcher-2.0.5 Scrapy-1.4.0 Twisted-17.9.0 asn1crypto-0.24.0 attrs-17.3.0 beautifulsoup4-4.4.1 certifi-2017.11.5 cffi-1.11.2 chardet-3.0.4 constantly-15.1.0 cryptography-2.1.4 cssselect-1.0.1 dumptruck-0.1.6 enum34-1.1.6 html5lib-0.999 hyperlink-17.3.1 idna-2.6 incremental-17.5.0 ipaddress-1.0.18 lxml-4.1.1 mercurial-3.7.3 parsel-1.2.0 pisa-3.0.32 pyOpenSSL-17.5.0 pyPdf-1.13 pyasn1-0.4.2 pyasn1-modules-0.2.1 pycparser-2.18 pycurl-7.43.0 pytz-2017.3 queuelib-1.4.2 reportlab-3.3.0 requests-2.18.4 scraperwiki scrapy-splash-0.7.2 service-identity-17.0.0 six-1.11.0 urllib3-1.22 virtualenv-15.1.0 w3lib-1.18.0 zope.interface-4.4.3
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
2017-12-15 18:40:49 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: scrapybot)
2017-12-15 18:40:49 [scrapy.utils.log] INFO: Overridden settings: {}
2017-12-15 18:40:49 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.memusage.MemoryUsage', 'scrapy.extensions.logstats.LogStats', 'scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole']
2017-12-15 18:40:50 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2017-12-15 18:40:50 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2017-12-15 18:40:50 [scrapy.middleware] INFO: Enabled item pipelines: []
2017-12-15 18:40:50 [scrapy.core.engine] INFO: Spider opened
2017-12-15 18:40:50 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2017-12-15 18:40:50 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
Error during info_callback
Traceback (most recent call last):
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/protocols/tls.py", line 315, in dataReceived
    self._checkHandshakeStatus()
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/protocols/tls.py", line 235, in _checkHandshakeStatus
    self._tlsConnection.do_handshake()
  File "/app/.heroku/python/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1805, in do_handshake
    result = _lib.SSL_do_handshake(self._ssl)
  File "/app/.heroku/python/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1226, in wrapper
    callback(Connection._reverse_mapping[ssl], where, return_code)
--- <exception caught here> ---
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/internet/_sslverify.py", line 1102, in infoCallback
    return wrapped(connection, where, ret)
  File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/core/downloader/tls.py", line 67, in _identityVerifyingInfoCallback
    verifyHostname(connection, self._hostnameASCII)
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/pyopenssl.py", line 47, in verify_hostname
    cert_patterns=extract_ids(connection.get_peer_certificate()),
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/pyopenssl.py", line 75, in extract_ids
    ids.append(DNSPattern(n.getComponent().asOctets()))
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/_common.py", line 156, in __init__
    "Invalid DNS pattern {0!r}.".format(pattern)
service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.
2017-12-15 18:40:51 [twisted] CRITICAL: Error during info_callback
  (traceback identical to the one above)
  service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.
2017-12-15 18:40:51 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET https://www.ai.ch/themen/staat-und-recht/veroeffentlichungen/verwaltungs-und-gerichtsentscheide> (failed 1 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.>]
Error during info_callback
  (traceback identical to the one above)
  service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.
2017-12-15 18:40:53 [twisted] CRITICAL: Error during info_callback
  (traceback identical to the one above)
  service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.
2017-12-15 18:40:53 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET https://www.ai.ch/themen/staat-und-recht/veroeffentlichungen/verwaltungs-und-gerichtsentscheide> (failed 2 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.>]
Error during info_callback
  (traceback identical to the one above)
  service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.
2017-12-15 18:40:54 [twisted] CRITICAL: Error during info_callback
  (traceback identical to the one above)
  service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.
2017-12-15 18:40:54 [scrapy.downloadermiddlewares.retry] DEBUG: Gave up retrying <GET https://www.ai.ch/themen/staat-und-recht/veroeffentlichungen/verwaltungs-und-gerichtsentscheide> (failed 3 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.>]
2017-12-15 18:40:54 [scrapy.core.scraper] ERROR: Error downloading <GET https://www.ai.ch/themen/staat-und-recht/veroeffentlichungen/verwaltungs-und-gerichtsentscheide>: [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '159.100.250.129'.>]
2017-12-15 18:40:54 [scrapy.core.engine] INFO: Closing spider (finished)
2017-12-15 18:40:54 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/exception_count': 3,
 'downloader/exception_type_count/twisted.web._newclient.ResponseNeverReceived': 3,
 'downloader/request_bytes': 852,
 'downloader/request_count': 3,
 'downloader/request_method_count/GET': 3,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2017, 12, 15, 18, 40, 54, 415554),
 'log_count/CRITICAL': 3,
 'log_count/DEBUG': 4,
 'log_count/ERROR': 1,
 'log_count/INFO': 7,
 'memusage/max': 47861760,
 'memusage/startup': 47861760,
 'retry/count': 2,
 'retry/max_reached': 1,
 'retry/reason_count/twisted.web._newclient.ResponseNeverReceived': 2,
 'scheduler/dequeued': 3,
 'scheduler/dequeued/memory': 3,
 'scheduler/enqueued': 3,
 'scheduler/enqueued/memory': 3,
 'start_time': datetime.datetime(2017, 12, 15, 18, 40, 50, 71348)}
2017-12-15 18:40:54 [scrapy.core.engine] INFO: Spider closed (finished)
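Although the run exits "successfully", the crawl itself downloaded nothing: every request to www.ai.ch died during the TLS handshake. The traceback shows service_identity rejecting the string '159.100.250.129' while extracting DNS patterns from the server certificate, which suggests the certificate presents a raw IP address where a DNS name is expected; an IP literal is not a valid DNS pattern. A minimal sketch of that distinction, using only the standard library (the helper name `is_ip_literal` is my own, not part of service_identity's API):

```python
import ipaddress

def is_ip_literal(san_value):
    """Return True when a certificate name entry is an IP literal rather
    than a DNS name. service_identity raises CertificateError
    ("Invalid DNS pattern ...") for exactly this kind of value, which
    matches the log output above."""
    try:
        ipaddress.ip_address(san_value)
        return True
    except ValueError:
        return False

print(is_ip_literal("159.100.250.129"))  # → True: rejected as a DNS pattern
print(is_ip_literal("www.ai.ch"))        # → False: a normal DNS name
```

If the certificate served by the host really does carry an IP address in a dNSName field, the fix lies on the server side (a certificate with proper DNS names) rather than in the scraper.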

Statistics

Average successful run time: 1 minute

Total run time: 41 minutes

Total cpu time used: half a minute

Total disk space used: 126 KB

History

  • Manually ran revision 186bb3d4 and completed successfully.
    nothing changed in the database
  • Manually ran revision 186bb3d4 and completed successfully.
    nothing changed in the database
  • Manually ran revision b2d94594 and completed successfully.
    nothing changed in the database
  • Manually ran revision 0c349729 and completed successfully.
    nothing changed in the database
  • Manually ran revision 3f9d0f25 and failed.
    nothing changed in the database
  • ...
  • Created on morph.io


Scraper code

AI_Entscheide
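The spider's source is not reproduced on this page. Purely as an illustration of the kind of work such a scraper performs, a hypothetical stdlib-only sketch (not the actual AI_Entscheide code, and all names here are invented) that extracts decision-PDF links from a listing page might look like:

```python
# Hypothetical sketch only -- not the actual AI_Entscheide scraper.
from html.parser import HTMLParser

# The listing page this scraper targets (taken from the run log above).
DECISIONS_URL = ("https://www.ai.ch/themen/staat-und-recht/veroeffentlichungen/"
                 "verwaltungs-und-gerichtsentscheide")

class PdfLinkExtractor(HTMLParser):
    """Collect href targets of <a> tags that point at PDF documents."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if href.lower().endswith(".pdf"):
                self.links.append(href)

# Demonstrate on a tiny made-up HTML fragment.
sample = ('<a href="/media/entscheid1.pdf">Entscheid 1</a>'
          '<a href="/kontakt">Kontakt</a>')
extractor = PdfLinkExtractor()
extractor.feed(sample)
print(extractor.links)  # → ['/media/entscheid1.pdf']
```

The real scraper uses Scrapy (per the run log) and stores results in the morph.io SQLite database via the scraperwiki library; this sketch only shows the link-extraction step.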