PGuyan / LU_morphio

Luzerner Entscheid


Contributors: PGuyan

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting scrapy==1.4.0 (from -r /tmp/build/requirements.txt (line 8))
         Downloading Scrapy-1.4.0-py2.py3-none-any.whl (248kB)
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 9))
         Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 10))
         Downloading cssselect-0.9.1.tar.gz
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting PyDispatcher>=2.0.5 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading PyDispatcher-2.0.5.tar.gz
       Collecting Twisted>=13.1.0 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading Twisted-17.9.0.tar.bz2 (3.0MB)
       Collecting pyOpenSSL (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading pyOpenSSL-17.5.0-py2.py3-none-any.whl (53kB)
       Collecting queuelib (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading queuelib-1.4.2-py2.py3-none-any.whl
       Collecting parsel>=1.1 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading parsel-1.2.0-py2.py3-none-any.whl
       Collecting service-identity (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading service_identity-17.0.0-py2.py3-none-any.whl
       Collecting six>=1.5.2 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading six-1.11.0-py2.py3-none-any.whl
       Collecting w3lib>=1.17.0 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading w3lib-1.18.0-py2.py3-none-any.whl
       Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading certifi-2017.11.5-py2.py3-none-any.whl (330kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Collecting zope.interface>=3.6.0 (from Twisted>=13.1.0->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading zope.interface-4.4.3-cp27-cp27mu-manylinux1_x86_64.whl (170kB)
       Collecting constantly>=15.1 (from Twisted>=13.1.0->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading constantly-15.1.0-py2.py3-none-any.whl
       Collecting incremental>=16.10.1 (from Twisted>=13.1.0->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading incremental-17.5.0-py2.py3-none-any.whl
       Collecting Automat>=0.3.0 (from Twisted>=13.1.0->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading Automat-0.6.0-py2.py3-none-any.whl
       Collecting hyperlink>=17.1.1 (from Twisted>=13.1.0->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading hyperlink-17.3.1-py2.py3-none-any.whl (73kB)
       Collecting cryptography>=2.1.4 (from pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading cryptography-2.1.4-cp27-cp27mu-manylinux1_x86_64.whl (2.2MB)
       Collecting pyasn1 (from service-identity->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading pyasn1-0.4.2-py2.py3-none-any.whl (71kB)
       Collecting pyasn1-modules (from service-identity->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading pyasn1_modules-0.2.1-py2.py3-none-any.whl (60kB)
       Collecting attrs (from service-identity->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading attrs-17.3.0-py2.py3-none-any.whl
       Collecting cffi>=1.7; platform_python_implementation != "PyPy" (from cryptography>=2.1.4->pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading cffi-1.11.2-cp27-cp27mu-manylinux1_x86_64.whl (405kB)
       Collecting enum34; python_version < "3" (from cryptography>=2.1.4->pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading enum34-1.1.6-py2-none-any.whl
       Collecting asn1crypto>=0.21.0 (from cryptography>=2.1.4->pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading asn1crypto-0.24.0-py2.py3-none-any.whl (101kB)
       Collecting ipaddress; python_version < "3" (from cryptography>=2.1.4->pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading ipaddress-1.0.19.tar.gz
       Collecting pycparser (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=2.1.4->pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading pycparser-2.18.tar.gz (245kB)
       Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, lxml, PyDispatcher, zope.interface, constantly, incremental, six, attrs, Automat, hyperlink, Twisted, pycparser, cffi, enum34, asn1crypto, ipaddress, cryptography, pyOpenSSL, queuelib, cssselect, w3lib, parsel, pyasn1, pyasn1-modules, service-identity, scrapy
         Running setup.py install for dumptruck: started
         Running setup.py install for dumptruck: finished with status 'done'
         Running setup.py develop for scraperwiki
         Running setup.py install for lxml: started
         Running setup.py install for lxml: still running...
         Running setup.py install for lxml: finished with status 'done'
         Running setup.py install for PyDispatcher: started
         Running setup.py install for PyDispatcher: finished with status 'done'
         Running setup.py install for Twisted: started
         Running setup.py install for Twisted: finished with status 'done'
         Running setup.py install for pycparser: started
         Running setup.py install for pycparser: finished with status 'done'
         Running setup.py install for ipaddress: started
         Running setup.py install for ipaddress: finished with status 'done'
         Running setup.py install for cssselect: started
         Running setup.py install for cssselect: finished with status 'done'
       Successfully installed Automat-0.6.0 PyDispatcher-2.0.5 Twisted-17.9.0 asn1crypto-0.24.0 attrs-17.3.0 certifi-2017.11.5 cffi-1.11.2 chardet-3.0.4 constantly-15.1.0 cryptography-2.1.4 cssselect-0.9.1 dumptruck-0.1.6 enum34-1.1.6 hyperlink-17.3.1 idna-2.6 incremental-17.5.0 ipaddress-1.0.19 lxml-3.4.4 parsel-1.2.0 pyOpenSSL-17.5.0 pyasn1-0.4.2 pyasn1-modules-0.2.1 pycparser-2.18 queuelib-1.4.2 requests-2.18.4 scraperwiki scrapy-1.4.0 service-identity-17.0.0 six-1.11.0 urllib3-1.22 w3lib-1.18.0 zope.interface-4.4.3
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
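From the pip output above, the pinned lines of the scraper's requirements.txt can be partially reconstructed (a sketch: only the lines pip references as lines 6 and 8-10 are known; the rest of the file is not shown in the log):

```
# requirements.txt (partial reconstruction from the pip log)
-e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki   # line 6
scrapy==1.4.0      # line 8
lxml==3.4.4        # line 9
cssselect==0.9.1   # line 10
```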
2017-12-19 19:38:21 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: scrapybot)
2017-12-19 19:38:21 [scrapy.utils.log] INFO: Overridden settings: {}
2017-12-19 19:38:21 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.logstats.LogStats']
2017-12-19 19:38:21 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2017-12-19 19:38:21 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2017-12-19 19:38:21 [scrapy.middleware] INFO: Enabled item pipelines: []
2017-12-19 19:38:21 [scrapy.core.engine] INFO: Spider opened
2017-12-19 19:38:21 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2017-12-19 19:38:21 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
Error during info_callback
Traceback (most recent call last):
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/protocols/tls.py", line 315, in dataReceived
    self._checkHandshakeStatus()
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/protocols/tls.py", line 235, in _checkHandshakeStatus
    self._tlsConnection.do_handshake()
  File "/app/.heroku/python/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1805, in do_handshake
    result = _lib.SSL_do_handshake(self._ssl)
  File "/app/.heroku/python/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1226, in wrapper
    callback(Connection._reverse_mapping[ssl], where, return_code)
--- <exception caught here> ---
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/internet/_sslverify.py", line 1102, in infoCallback
    return wrapped(connection, where, ret)
  File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/core/downloader/tls.py", line 67, in _identityVerifyingInfoCallback
    verifyHostname(connection, self._hostnameASCII)
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/pyopenssl.py", line 47, in verify_hostname
    cert_patterns=extract_ids(connection.get_peer_certificate()),
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/pyopenssl.py", line 75, in extract_ids
    ids.append(DNSPattern(n.getComponent().asOctets()))
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/_common.py", line 156, in __init__
    "Invalid DNS pattern {0!r}.".format(pattern)
service_identity.exceptions.CertificateError: Invalid DNS pattern '194.40.144.142'.
2017-12-19 19:38:23 [twisted] CRITICAL: Error during info_callback
2017-12-19 19:38:23 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET https://gerichte.lu.ch/recht_sprechung/lgve> (failed 1 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '194.40.144.142'.>]
2017-12-19 19:38:24 [twisted] CRITICAL: Error during info_callback
2017-12-19 19:38:24 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET https://gerichte.lu.ch/recht_sprechung/lgve> (failed 2 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '194.40.144.142'.>]
2017-12-19 19:38:25 [twisted] CRITICAL: Error during info_callback
2017-12-19 19:38:25 [scrapy.downloadermiddlewares.retry] DEBUG: Gave up retrying <GET https://gerichte.lu.ch/recht_sprechung/lgve> (failed 3 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '194.40.144.142'.>]
2017-12-19 19:38:26 [scrapy.core.scraper] ERROR: Error downloading <GET https://gerichte.lu.ch/recht_sprechung/lgve>: [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '194.40.144.142'.>]
2017-12-19 19:38:26 [scrapy.core.engine] INFO: Closing spider (finished)
2017-12-19 19:38:26 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/exception_count': 3,
 'downloader/exception_type_count/twisted.web._newclient.ResponseNeverReceived': 3,
 'downloader/request_bytes': 696,
 'downloader/request_count': 3,
 'downloader/request_method_count/GET': 3,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2017, 12, 19, 19, 38, 26, 108268),
 'log_count/CRITICAL': 3,
 'log_count/DEBUG': 4,
 'log_count/ERROR': 1,
 'log_count/INFO': 7,
 'memusage/max': 51372032,
 'memusage/startup': 51372032,
 'retry/count': 2,
 'retry/max_reached': 1,
 'retry/reason_count/twisted.web._newclient.ResponseNeverReceived': 2,
 'scheduler/dequeued': 3,
 'scheduler/dequeued/memory': 3,
 'scheduler/enqueued': 3,
 'scheduler/enqueued/memory': 3,
 'start_time': datetime.datetime(2017, 12, 19, 19, 38, 21, 879296)}
2017-12-19 19:38:26 [scrapy.core.engine] INFO: Spider closed (finished)
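The crash originates in service_identity: the server's TLS certificate lists the bare IP address 194.40.144.142 as a dNSName entry in its subjectAltName, and service_identity's DNSPattern rejects any pattern that parses as an IP address, so every handshake fails and Scrapy gives up after three retries. The gist of that check can be re-created in a few lines (a deliberate simplification, not service_identity's actual code):

```python
import ipaddress
import re


def is_valid_dns_pattern(pattern: str) -> bool:
    """Simplified version of the check behind 'Invalid DNS pattern'.

    A dNSName entry must be a hostname pattern; anything that parses
    as an IP address is rejected outright.
    """
    try:
        ipaddress.ip_address(pattern)
        return False  # IP addresses are not valid DNS patterns
    except ValueError:
        pass  # not an IP address; fall through to the hostname check
    return re.match(r"^[A-Za-z0-9*.-]+$", pattern) is not None


print(is_valid_dns_pattern("gerichte.lu.ch"))    # → True
print(is_valid_dns_pattern("194.40.144.142"))    # → False
```

Since the certificate itself is at fault here, the usual ways out are asking the site operator to fix the certificate, or upgrading Scrapy/service-identity (later Scrapy releases reportedly tolerate this certificate quirk instead of aborting the crawl).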

Data

Downloaded 0 times


Download table (as CSV) · Download SQLite database (2.49 MB) · Use the API
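The "Use the API" option refers to morph.io's query endpoint, which serves the scraper's SQLite data over HTTP. A sketch of building such a request URL (the endpoint shape `api.morph.io/<owner>/<scraper>/data.<format>` is morph.io's documented pattern; the table name `data` is scraperwiki-python's default and an assumption here):

```python
from urllib.parse import urlencode


def morph_api_url(owner: str, scraper: str, query: str,
                  api_key: str, fmt: str = "json") -> str:
    """Build a morph.io data-API URL for an SQL query against a scraper's data."""
    base = "https://api.morph.io/{}/{}/data.{}".format(owner, scraper, fmt)
    return base + "?" + urlencode({"key": api_key, "query": query})


# Example: the first ten rows of this scraper's table (API key is a placeholder).
url = morph_api_url("PGuyan", "LU_morphio",
                    "select * from data limit 10", "YOUR_API_KEY")
print(url)
```

The resulting URL can then be fetched with any HTTP client; `fmt` may also be `csv` for spreadsheet-friendly output.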

Showing 10 of 3569 rows

Entscheiddatum | Kanton | URL | Referenz | Sprache | Rubrum | Leitentscheid
17.10.2017 | LU | | 5V 17 442 | de | Der Erlass oder die Änderung einer Verordnungsbestimmung durch den Regierungsrat stellt keine individuell-konkrete Anordnung dar und erfüllt damit den Verfügungsbegriff nicht. Soweit mit der Rechtsverweigerungs- bzw. Rechtsverzögerungsbeschwerde ein verspätetes Tätigwerden des Kantons Luzern im Hinblick auf die Schaffung der erforderlichen gesetzlichen oder verordnungsmässigen Grundlagen für die definitive Berechnung der Prämienverbilligungsansprüche geltend gemacht wird, kann auf die Beschwerde deshalb nicht eingetreten werden. Mit Erlass der verlangten Prämienverbilligungsverfügungen besteht kein Rechtsschutzinteresse mehr an der Beurteilung der Rechtsverweigerungs- bzw. Rechtsverzögerungsbeschwerde und das Gerichtsverfahren ist insoweit gegenstandslos geworden. Die summarische Prüfung der Sach- und Rechtslage ergibt, dass die Rechtsverzögerungs- bzw. Rechtsverweigerungsbeschwerde unbegründet war und hätte abgewiesen werden müssen, wenn sie nicht ohnehin gegenstandslos geworden wäre. |
28.09.2017 | LU | | 7H 17 61 | de | Grundsatz der Einheit der Rechtsordnung: Ein Abweichen von tatsächlichen Feststellungen in einem Strafurteil ist nur dann angezeigt, wenn sich aus den Akten oder den konkreten Umständen des Einzelfalls erhebliche Zweifel an deren Richtigkeit ergeben (E. 4.3). In der rechtlichen Würdigung des Sachverhalts ist die Verwaltungsbehörde demgegenüber grundsätzlich frei (E. 4.4). | 2017 IV Nr. 7
21.08.2017 | LU | | 5V 17 157 | de | Der Aufenthalt in einer Wohnung mit Dienstleistung ist nicht als Pflegeheimaufenthalt zu qualifizieren. Die Regelung, wonach der Aufenthalt in einem Pflegeheim keine neue Zuständigkeit der für die Restfinanzierung zuständigen Gemeinde begründet (§ 6 Abs. 2 Satz 1 in der bis 31.1.2017 geltenden Fassung des PFG), kommt folglich nicht zur Anwendung und es bleibt bei der Zuständigkeit der Wohnsitzgemeinde. | 2017 III Nr. 4
29.06.2017 | LU | | 7H 16 303 | de | Sozialhilfeverfahren: Die objektive Beweislast greift erst, wenn es sich als unmöglich erweist, einen Sachverhalt festzustellen, der zumindest die Wahrscheinlichkeit für sich hat, der Wirklichkeit zu entsprechen. Die Behörde ist verpflichtet, die zumutbaren Untersuchungshandlungen vorzunehmen. |
29.05.2017 | LU | | SG 15 2 | de | Ziff. 2 des Anhangs der Verordnung des Bundesrates vom 20. Juni 2014 über die Anpassung von Tarifstrukturen in der Krankenversicherung (SR 832.102.5; in Kraft seit 1.10.2014) in der bis 31. Dezember 2016 geltenden Fassung verletzt das Gebot der Sachgerechtigkeit und der betriebswirtschaftlichen Bemessung nach Art. 43 Abs. 4 KVG und ist insofern gesetzeswidrig (E. 8-10). Die klagende Privatklinik ist demgemäss berechtigt, der Krankenkasse die bis 31. Dezember 2016 erbrachten ambulanten Leistungen gestützt auf TARMED Version 1.08 in Rechnung zu stellen, ohne Ziff. 2 des Anhangs der erwähnten Verordnung zu berücksichtigen (E. 12). | 2017 III Nr. 1
24.05.2017 | LU | | 5V 17 36 | de | Nach konstanter bundesgerichtlicher Rechtsprechung handelt es sich bei einem Arbeitsverhältnis auf Abruf, das nach dem Verlust einer Vollzeitstelle nicht freiwillig – sondern zur Überbrückung der Arbeitslosigkeit – eingegangen wurde, um eine notgedrungene Zwischenlösung. Ob eine Normalität bezüglich des Arbeitsverhältnisses auf Abruf eingetreten ist, beurteilt sich anhand der konkreten Situation. Die Auslegung der Arbeitslosenkasse, dass ohne Weiteres von einer als normal zu qualifizierenden Arbeitszeit auszugehen sei, sobald ein Arbeitsverhältnis auf Abruf länger als ein Jahr dauert, widerspricht der vorgenannten bundesgerichtlichen Rechtsprechung. |
23.05.2017 | LU | | 7H 16 3 | de | Mobilfunk; Pflicht zur Durchführung eines im Bau- und Zonenreglement der Gemeinde vorgesehenen Standortevaluationsverfahrens (E. 2) |
12.05.2017 | LU | | 4O 17 1 | de | Der codierte Vermerk auf dem Führerausweis allein vermag nicht die Geltung einer Auflage herbeizuführen, die nach objektiven Gesichtspunkten nicht notwendig ist, um die Fahreignung einer Person zu gewährleisten. | 2017 II Nr. 5
09.05.2017 | LU | | 7H 16 251 | de | Der Einwand der Schutzwürdigkeit eines Biotops kann auch im Rahmen eines Baubewilligungsverfahrens erhoben werden, sofern diese Rüge ausreichend konkretisiert und glaubhaft gemacht wird. Für die Bejahung der Legitimation genügt es, wenn die Beschwerdeführer ausreichend konkretisiert glaubhaft machen, dass ein nach Art. 18 ff. NHG schützenswertes Biotop vorhanden ist. | 2017 IV Nr. 5
09.05.2017 | LU | | 3B 17 10 | de | Der Betreuungsunterhaltsanspruch des Kinds für den betreuenden Elternteil berechnet sich durch Multiplikation des erweiterten Existenzminimums dieses Elternteils mit dem von ihm geleisteten Betreuungspensum (Betreuungsquotenmethode). Mit einem allfälligen, auch aus dem Betreuungsunterhalt resultierenden Überschuss hat sich der betreuende Elternteil ebenfalls proportional am Barunterhalt der Kinder zu beteiligen. | 2017 II Nr. 4
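The rows above follow a fixed seven-column schema (Entscheiddatum, Kanton, URL, Referenz, Sprache, Rubrum, Leitentscheid). Once the SQLite database is downloaded it can be queried locally; a minimal sketch using an in-memory stand-in (the table name `data` is scraperwiki-python's default and an assumption, and the sample row is taken from the table above):

```python
import sqlite3

# Schema inferred from the table header shown on this page.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE data (
    Entscheiddatum TEXT, Kanton TEXT, URL TEXT, Referenz TEXT,
    Sprache TEXT, Rubrum TEXT, Leitentscheid TEXT)""")
conn.execute(
    "INSERT INTO data VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("28.09.2017", "LU", "", "7H 17 61", "de",
     "Grundsatz der Einheit der Rechtsordnung ...", "2017 IV Nr. 7"),
)

# The same SQL works against the real downloaded database file.
rows = conn.execute(
    "SELECT Referenz, Leitentscheid FROM data WHERE Kanton = 'LU'"
).fetchall()
print(rows)  # → [('7H 17 61', '2017 IV Nr. 7')]
```

For the real file, replace `":memory:"` with the path of the downloaded database.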

Statistics

Average successful run time: 2 minutes

Total run time: 21 minutes

Total cpu time used: less than a minute

Total disk space used: 2.58 MB

History

  • Manually ran revision 91617ed7 and completed successfully.
    nothing changed in the database
  • Manually ran revision 91617ed7 and completed successfully.
    nothing changed in the database
  • Manually ran revision 91617ed7 and completed successfully.
    nothing changed in the database
  • Manually ran revision bfc01c8e and completed successfully.
    nothing changed in the database
  • Manually ran revision bfc01c8e and completed successfully.
    3569 records added, 3569 records removed in the database
  • ...
  • Created on morph.io


Scraper code

LU_morphio