PGuyan / BL_morph

Decisions of the higher cantonal court (Kantonsgericht) of Basel-Landschaft in Switzerland


Contributors PGuyan

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 2 is python-2.7.14 (you are using python-2.7.13, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-2.7.14).
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-2.7.13
-----> Installing pip
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting scrapy==1.4.0 (from -r /tmp/build/requirements.txt (line 8))
         Downloading Scrapy-1.4.0-py2.py3-none-any.whl (248kB)
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 9))
         Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 10))
         Downloading cssselect-0.9.1.tar.gz
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting PyDispatcher>=2.0.5 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading PyDispatcher-2.0.5.tar.gz
       Collecting Twisted>=13.1.0 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading Twisted-17.9.0.tar.bz2 (3.0MB)
       Collecting pyOpenSSL (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading pyOpenSSL-17.5.0-py2.py3-none-any.whl (53kB)
       Collecting queuelib (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading queuelib-1.4.2-py2.py3-none-any.whl
       Collecting parsel>=1.1 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading parsel-1.2.0-py2.py3-none-any.whl
       Collecting service-identity (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading service_identity-17.0.0-py2.py3-none-any.whl
       Collecting six>=1.5.2 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading six-1.11.0-py2.py3-none-any.whl
       Collecting w3lib>=1.17.0 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading w3lib-1.18.0-py2.py3-none-any.whl
       Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading certifi-2017.11.5-py2.py3-none-any.whl (330kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Collecting zope.interface>=3.6.0 (from Twisted>=13.1.0->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading zope.interface-4.4.3-cp27-cp27mu-manylinux1_x86_64.whl (170kB)
       Collecting constantly>=15.1 (from Twisted>=13.1.0->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading constantly-15.1.0-py2.py3-none-any.whl
       Collecting incremental>=16.10.1 (from Twisted>=13.1.0->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading incremental-17.5.0-py2.py3-none-any.whl
       Collecting Automat>=0.3.0 (from Twisted>=13.1.0->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading Automat-0.6.0-py2.py3-none-any.whl
       Collecting hyperlink>=17.1.1 (from Twisted>=13.1.0->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading hyperlink-17.3.1-py2.py3-none-any.whl (73kB)
       Collecting cryptography>=2.1.4 (from pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading cryptography-2.1.4-cp27-cp27mu-manylinux1_x86_64.whl (2.2MB)
       Collecting pyasn1 (from service-identity->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading pyasn1-0.4.2-py2.py3-none-any.whl (71kB)
       Collecting pyasn1-modules (from service-identity->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading pyasn1_modules-0.2.1-py2.py3-none-any.whl (60kB)
       Collecting attrs (from service-identity->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading attrs-17.3.0-py2.py3-none-any.whl
       Collecting cffi>=1.7; platform_python_implementation != "PyPy" (from cryptography>=2.1.4->pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading cffi-1.11.2-cp27-cp27mu-manylinux1_x86_64.whl (405kB)
       Collecting enum34; python_version < "3" (from cryptography>=2.1.4->pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading enum34-1.1.6-py2-none-any.whl
       Collecting asn1crypto>=0.21.0 (from cryptography>=2.1.4->pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading asn1crypto-0.23.0-py2.py3-none-any.whl (99kB)
       Collecting ipaddress; python_version < "3" (from cryptography>=2.1.4->pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading ipaddress-1.0.18-py2-none-any.whl
       Collecting pycparser (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=2.1.4->pyOpenSSL->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading pycparser-2.18.tar.gz (245kB)
       Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, lxml, PyDispatcher, zope.interface, constantly, incremental, six, attrs, Automat, hyperlink, Twisted, pycparser, cffi, enum34, asn1crypto, ipaddress, cryptography, pyOpenSSL, queuelib, cssselect, w3lib, parsel, pyasn1, pyasn1-modules, service-identity, scrapy
         Running setup.py install for dumptruck: started
         Running setup.py install for dumptruck: finished with status 'done'
         Running setup.py develop for scraperwiki
         Running setup.py install for lxml: started
         Running setup.py install for lxml: still running...
         Running setup.py install for lxml: finished with status 'done'
         Running setup.py install for PyDispatcher: started
         Running setup.py install for PyDispatcher: finished with status 'done'
         Running setup.py install for Twisted: started
         Running setup.py install for Twisted: finished with status 'done'
         Running setup.py install for pycparser: started
         Running setup.py install for pycparser: finished with status 'done'
         Running setup.py install for cssselect: started
         Running setup.py install for cssselect: finished with status 'done'
       Successfully installed Automat-0.6.0 PyDispatcher-2.0.5 Twisted-17.9.0 asn1crypto-0.23.0 attrs-17.3.0 certifi-2017.11.5 cffi-1.11.2 chardet-3.0.4 constantly-15.1.0 cryptography-2.1.4 cssselect-0.9.1 dumptruck-0.1.6 enum34-1.1.6 hyperlink-17.3.1 idna-2.6 incremental-17.5.0 ipaddress-1.0.18 lxml-3.4.4 parsel-1.2.0 pyOpenSSL-17.5.0 pyasn1-0.4.2 pyasn1-modules-0.2.1 pycparser-2.18 queuelib-1.4.2 requests-2.18.4 scraperwiki scrapy-1.4.0 service-identity-17.0.0 six-1.11.0 urllib3-1.22 w3lib-1.18.0 zope.interface-4.4.3
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
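The buildpack warning above ("you are using python-2.7.13, which is unsupported") comes from the pinned runtime version. On Heroku-style Python buildpacks, which morph.io appears to use here given the /app/.heroku paths, the interpreter version is pinned in a runtime.txt file at the repository root; the fragment below is a hypothetical fix, not this repo's actual file:

```
# runtime.txt (hypothetical) — bumps the pin to the version the
# buildpack recommends and silences the unsupported-version warning.
python-2.7.14
```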
2017-12-13 20:26:52 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: scrapybot)
2017-12-13 20:26:52 [scrapy.utils.log] INFO: Overridden settings: {}
2017-12-13 20:26:53 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.corestats.CoreStats']
2017-12-13 20:26:53 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2017-12-13 20:26:53 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2017-12-13 20:26:53 [scrapy.middleware] INFO: Enabled item pipelines: []
2017-12-13 20:26:53 [scrapy.core.engine] INFO: Spider opened
2017-12-13 20:26:53 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2017-12-13 20:26:53 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
Error during info_callback
Traceback (most recent call last):
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/protocols/tls.py", line 315, in dataReceived
    self._checkHandshakeStatus()
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/protocols/tls.py", line 235, in _checkHandshakeStatus
    self._tlsConnection.do_handshake()
  File "/app/.heroku/python/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1805, in do_handshake
    result = _lib.SSL_do_handshake(self._ssl)
  File "/app/.heroku/python/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1226, in wrapper
    callback(Connection._reverse_mapping[ssl], where, return_code)
--- <exception caught here> ---
  File "/app/.heroku/python/lib/python2.7/site-packages/twisted/internet/_sslverify.py", line 1102, in infoCallback
    return wrapped(connection, where, ret)
  File "/app/.heroku/python/lib/python2.7/site-packages/scrapy/core/downloader/tls.py", line 67, in _identityVerifyingInfoCallback
    verifyHostname(connection, self._hostnameASCII)
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/pyopenssl.py", line 47, in verify_hostname
    cert_patterns=extract_ids(connection.get_peer_certificate()),
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/pyopenssl.py", line 75, in extract_ids
    ids.append(DNSPattern(n.getComponent().asOctets()))
  File "/app/.heroku/python/lib/python2.7/site-packages/service_identity/_common.py", line 156, in __init__
    "Invalid DNS pattern {0!r}.".format(pattern)
service_identity.exceptions.CertificateError: Invalid DNS pattern '193.47.168.16'.
2017-12-13 20:26:55 [twisted] CRITICAL: Error during info_callback
  [traceback identical to the first occurrence above]
service_identity.exceptions.CertificateError: Invalid DNS pattern '193.47.168.16'.
2017-12-13 20:26:55 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET https://www.baselland.ch/politik-und-behorden/gerichte/rechtsprechung/kantonsgericht/chronologische-anordnung/> (failed 1 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '193.47.168.16'.>]
Error during info_callback
  [traceback identical to the first occurrence above]
service_identity.exceptions.CertificateError: Invalid DNS pattern '193.47.168.16'.
2017-12-13 20:26:58 [twisted] CRITICAL: Error during info_callback
  [traceback identical to the first occurrence above]
service_identity.exceptions.CertificateError: Invalid DNS pattern '193.47.168.16'.
2017-12-13 20:26:58 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET https://www.baselland.ch/politik-und-behorden/gerichte/rechtsprechung/kantonsgericht/chronologische-anordnung/> (failed 2 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '193.47.168.16'.>]
Error during info_callback
  [traceback identical to the first occurrence above]
service_identity.exceptions.CertificateError: Invalid DNS pattern '193.47.168.16'.
2017-12-13 20:27:01 [twisted] CRITICAL: Error during info_callback
  [traceback identical to the first occurrence above]
service_identity.exceptions.CertificateError: Invalid DNS pattern '193.47.168.16'.
2017-12-13 20:27:01 [scrapy.downloadermiddlewares.retry] DEBUG: Gave up retrying <GET https://www.baselland.ch/politik-und-behorden/gerichte/rechtsprechung/kantonsgericht/chronologische-anordnung/> (failed 3 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '193.47.168.16'.>]
2017-12-13 20:27:01 [scrapy.core.scraper] ERROR: Error downloading <GET https://www.baselland.ch/politik-und-behorden/gerichte/rechtsprechung/kantonsgericht/chronologische-anordnung/>: [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern '193.47.168.16'.>]
2017-12-13 20:27:01 [scrapy.core.engine] INFO: Closing spider (finished)
2017-12-13 20:27:01 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/exception_count': 3,
 'downloader/exception_type_count/twisted.web._newclient.ResponseNeverReceived': 3,
 'downloader/request_bytes': 897,
 'downloader/request_count': 3,
 'downloader/request_method_count/GET': 3,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2017, 12, 13, 20, 27, 1, 429187),
 'log_count/CRITICAL': 3,
 'log_count/DEBUG': 4,
 'log_count/ERROR': 1,
 'log_count/INFO': 7,
 'memusage/max': 50999296,
 'memusage/startup': 50999296,
 'retry/count': 2,
 'retry/max_reached': 1,
 'retry/reason_count/twisted.web._newclient.ResponseNeverReceived': 2,
 'scheduler/dequeued': 3,
 'scheduler/dequeued/memory': 3,
 'scheduler/enqueued': 3,
 'scheduler/enqueued/memory': 3,
 'start_time': datetime.datetime(2017, 12, 13, 20, 26, 53, 137913)}
2017-12-13 20:27:01 [scrapy.core.engine] INFO: Spider closed (finished)
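The failure above is deterministic: the certificate served for www.baselland.ch apparently carries the raw IP address 193.47.168.16 in a dNSName (subjectAltName) entry, and service_identity refuses to treat an IP address as a DNS pattern, so the three attempts by RetryMiddleware could never succeed. The rejected check can be approximated with the standard library (a simplified re-implementation for illustration only, not service_identity's actual code):

```python
import ipaddress

def is_valid_dns_pattern(pattern):
    """Simplified sketch of the validation service_identity applies to
    dNSName certificate entries: an IP address is not a DNS pattern."""
    try:
        ipaddress.ip_address(pattern)
        return False  # parses as an IP address -> invalid as a DNS name
    except ValueError:
        pass
    # A DNS pattern must be non-empty labels of letters, digits,
    # hyphens, or a wildcard, joined by dots.
    labels = pattern.split(".")
    return all(label and all(c.isalnum() or c in "-*" for c in label)
               for label in labels)

print(is_valid_dns_pattern("193.47.168.16"))    # False -> CertificateError
print(is_valid_dns_pattern("www.baselland.ch")) # True
```

This explains why the run "completed" from morph.io's point of view while downloading nothing: the spider shut down cleanly after exhausting its retries.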

Statistics

Average successful run time: 3 minutes

Total run time: 14 minutes

Total CPU time used: half a minute

Total disk space used: 1.43 MB

History

  • Manually ran revision c660b7af and completed successfully.
    nothing changed in the database
  • Manually ran revision ccc96796 and completed successfully.
    2960 records added in the database
  • Manually ran revision 0e91466f and failed.
    nothing changed in the database
  • Manually ran revision 0149bbf4 and failed.
    nothing changed in the database
  • Manually ran revision a874326c and completed successfully.
    nothing changed in the database
  • ...
  • Created on morph.io
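morph.io reports per-run database changes in this history, which is why a rerun over unchanged source data shows "nothing changed in the database" while the first complete crawl added 2960 records. morph.io scrapers conventionally write through scraperwiki.sqlite.save() with a unique key, which upserts rather than appends. The effect can be sketched with the stdlib sqlite3 module (the "decisions" table and its columns are hypothetical, not taken from this repo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE decisions (
    url   TEXT PRIMARY KEY,  -- unique key: one row per decision page
    title TEXT,
    date  TEXT)""")

def save(record):
    # INSERT OR REPLACE mirrors scraperwiki's upsert-on-unique-key
    # behaviour: an existing row with the same url is overwritten.
    conn.execute(
        "INSERT OR REPLACE INTO decisions VALUES (:url, :title, :date)",
        record)

row = {"url": "/kantonsgericht/2017-123",
       "title": "KG 2017-123", "date": "2017-11-30"}
save(row)
save(row)  # rerun with identical data: still exactly one row
count = conn.execute("SELECT COUNT(*) FROM decisions").fetchone()[0]
print(count)  # 1
```

Because the unique key deduplicates, a successful rerun that finds no new decisions leaves the row count, and therefore morph.io's change counter, at zero.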

Show complete history

Scraper code

BL_morph
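The spider itself is not reproduced on this page. As a rough, self-contained illustration of the extraction step such a scraper performs, the snippet below pulls decision links out of a listing page using only the standard library rather than Scrapy; the HTML sample is made up, and filtering on "rechtsprechung" is an assumption based on the URL crawled above, not logic taken from the repo:

```python
from html.parser import HTMLParser

class DecisionLinkParser(HTMLParser):
    """Collects href targets of anchors that look like decision pages."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        # Hypothetical filter: decision pages live under the court's
        # "rechtsprechung" section.
        if "rechtsprechung" in href:
            self.links.append(href)

sample = """
<ul>
  <li><a href="/politik-und-behorden/gerichte/rechtsprechung/kantonsgericht/2017-123">KG 2017-123</a></li>
  <li><a href="/impressum">Impressum</a></li>
</ul>
"""
parser = DecisionLinkParser()
parser.feed(sample)
print(parser.links)  # ['/politik-und-behorden/gerichte/rechtsprechung/kantonsgericht/2017-123']
```

In the real scraper this role is played by Scrapy selectors; each extracted link would then be fetched and its decision text stored in the SQLite database morph.io exposes.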