PGuyan / NE_morphio

NE Entscheide (NE court decisions)


Contributors PGuyan

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 3 is python-3.6.2 (you are using python-3.6.3, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-3.6.2).
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-3.6.3
-----> Installing pip
-----> Noticed cffi. Bootstrapping libffi.
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting scrapy==1.4.0 (from -r /tmp/build/requirements.txt (line 8))
         Downloading Scrapy-1.4.0-py2.py3-none-any.whl (248kB)
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 9))
         Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 10))
         Downloading cssselect-0.9.1.tar.gz
       Collecting Twisted==17.9.0 (from -r /tmp/build/requirements.txt (line 11))
         Downloading Twisted-17.9.0.tar.bz2 (3.0MB)
       Collecting pyOpenSSL==17.5.0 (from -r /tmp/build/requirements.txt (line 12))
         Downloading pyOpenSSL-17.5.0-py2.py3-none-any.whl (53kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting queuelib (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading queuelib-1.5.0-py2.py3-none-any.whl
       Collecting service-identity (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading service_identity-17.0.0-py2.py3-none-any.whl
       Collecting w3lib>=1.17.0 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading w3lib-1.19.0-py2.py3-none-any.whl
       Collecting PyDispatcher>=2.0.5 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading PyDispatcher-2.0.5.tar.gz
       Collecting six>=1.5.2 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading six-1.11.0-py2.py3-none-any.whl
       Collecting parsel>=1.1 (from scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading parsel-1.4.0-py2.py3-none-any.whl
       Collecting zope.interface>=4.0.2 (from Twisted==17.9.0->-r /tmp/build/requirements.txt (line 11))
         Downloading zope.interface-4.4.3-cp36-cp36m-manylinux1_x86_64.whl (173kB)
       Collecting constantly>=15.1 (from Twisted==17.9.0->-r /tmp/build/requirements.txt (line 11))
         Downloading constantly-15.1.0-py2.py3-none-any.whl
       Collecting incremental>=16.10.1 (from Twisted==17.9.0->-r /tmp/build/requirements.txt (line 11))
         Downloading incremental-17.5.0-py2.py3-none-any.whl
       Collecting Automat>=0.3.0 (from Twisted==17.9.0->-r /tmp/build/requirements.txt (line 11))
         Downloading Automat-0.6.0-py2.py3-none-any.whl
       Collecting hyperlink>=17.1.1 (from Twisted==17.9.0->-r /tmp/build/requirements.txt (line 11))
         Downloading hyperlink-18.0.0-py2.py3-none-any.whl
       Collecting cryptography>=2.1.4 (from pyOpenSSL==17.5.0->-r /tmp/build/requirements.txt (line 12))
         Downloading cryptography-2.1.4-cp36-cp36m-manylinux1_x86_64.whl (2.2MB)
       Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading certifi-2018.1.18-py2.py3-none-any.whl (151kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Collecting pyasn1-modules (from service-identity->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading pyasn1_modules-0.2.1-py2.py3-none-any.whl (60kB)
       Collecting attrs (from service-identity->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading attrs-17.4.0-py2.py3-none-any.whl
       Collecting pyasn1 (from service-identity->scrapy==1.4.0->-r /tmp/build/requirements.txt (line 8))
         Downloading pyasn1-0.4.2-py2.py3-none-any.whl (71kB)
       Collecting cffi>=1.7; platform_python_implementation != "PyPy" (from cryptography>=2.1.4->pyOpenSSL==17.5.0->-r /tmp/build/requirements.txt (line 12))
         Downloading cffi-1.11.5-cp36-cp36m-manylinux1_x86_64.whl (421kB)
       Collecting asn1crypto>=0.21.0 (from cryptography>=2.1.4->pyOpenSSL==17.5.0->-r /tmp/build/requirements.txt (line 12))
         Downloading asn1crypto-0.24.0-py2.py3-none-any.whl (101kB)
       Collecting pycparser (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=2.1.4->pyOpenSSL==17.5.0->-r /tmp/build/requirements.txt (line 12))
         Downloading pycparser-2.18.tar.gz (245kB)
       Installing collected packages: dumptruck, idna, certifi, urllib3, chardet, requests, scraperwiki, queuelib, pyasn1, pyasn1-modules, pycparser, cffi, six, asn1crypto, cryptography, pyOpenSSL, attrs, service-identity, w3lib, cssselect, lxml, PyDispatcher, zope.interface, constantly, incremental, Automat, hyperlink, Twisted, parsel, scrapy
         Running setup.py install for dumptruck: started
         Running setup.py install for dumptruck: finished with status 'done'
         Running setup.py develop for scraperwiki
         Running setup.py install for pycparser: started
         Running setup.py install for pycparser: finished with status 'done'
         Running setup.py install for cssselect: started
         Running setup.py install for cssselect: finished with status 'done'
         Running setup.py install for lxml: started
         Running setup.py install for lxml: still running...
         Running setup.py install for lxml: finished with status 'done'
         Running setup.py install for PyDispatcher: started
         Running setup.py install for PyDispatcher: finished with status 'done'
         Running setup.py install for Twisted: started
         Running setup.py install for Twisted: finished with status 'done'
       Successfully installed Automat-0.6.0 PyDispatcher-2.0.5 Twisted-17.9.0 asn1crypto-0.24.0 attrs-17.4.0 certifi-2018.1.18 cffi-1.11.5 chardet-3.0.4 constantly-15.1.0 cryptography-2.1.4 cssselect-0.9.1 dumptruck-0.1.6 hyperlink-18.0.0 idna-2.6 incremental-17.5.0 lxml-3.4.4 parsel-1.4.0 pyOpenSSL-17.5.0 pyasn1-0.4.2 pyasn1-modules-0.2.1 pycparser-2.18 queuelib-1.5.0 requests-2.18.4 scraperwiki scrapy-1.4.0 service-identity-17.0.0 six-1.11.0 urllib3-1.22 w3lib-1.19.0 zope.interface-4.4.3
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
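From the "(line N)" annotations pip echoes above, the pinned part of the scraper's /tmp/build/requirements.txt can be reconstructed. This is a sketch inferred from the log only; lines 1-5 and 7 of the file are never echoed by pip and remain unknown:

```text
# Reconstructed from the pip log above (line numbers per pip's output);
# requirements.txt lines 1-5 and 7 are not shown in the log.
-e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki  # line 6
scrapy==1.4.0       # line 8
lxml==3.4.4         # line 9
cssselect==0.9.1    # line 10
Twisted==17.9.0     # line 11
pyOpenSSL==17.5.0   # line 12
```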
2018-03-17 09:45:41 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: scrapybot)
2018-03-17 09:45:41 [scrapy.utils.log] INFO: Overridden settings: {}
2018-03-17 09:45:42 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats']
2018-03-17 09:45:42 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2018-03-17 09:45:42 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2018-03-17 09:45:42 [scrapy.middleware] INFO: Enabled item pipelines: []
2018-03-17 09:45:42 [scrapy.core.engine] INFO: Spider opened
2018-03-17 09:45:42 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-03-17 09:45:42 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
Error during info_callback
Traceback (most recent call last):
  File "/app/.heroku/python/lib/python3.6/site-packages/twisted/protocols/tls.py", line 315, in dataReceived
    self._checkHandshakeStatus()
  File "/app/.heroku/python/lib/python3.6/site-packages/twisted/protocols/tls.py", line 235, in _checkHandshakeStatus
    self._tlsConnection.do_handshake()
  File "/app/.heroku/python/lib/python3.6/site-packages/OpenSSL/SSL.py", line 1805, in do_handshake
    result = _lib.SSL_do_handshake(self._ssl)
  File "/app/.heroku/python/lib/python3.6/site-packages/OpenSSL/SSL.py", line 1226, in wrapper
    callback(Connection._reverse_mapping[ssl], where, return_code)
--- <exception caught here> ---
  File "/app/.heroku/python/lib/python3.6/site-packages/twisted/internet/_sslverify.py", line 1102, in infoCallback
    return wrapped(connection, where, ret)
  File "/app/.heroku/python/lib/python3.6/site-packages/scrapy/core/downloader/tls.py", line 67, in _identityVerifyingInfoCallback
    verifyHostname(connection, self._hostnameASCII)
  File "/app/.heroku/python/lib/python3.6/site-packages/service_identity/pyopenssl.py", line 47, in verify_hostname
    cert_patterns=extract_ids(connection.get_peer_certificate()),
  File "/app/.heroku/python/lib/python3.6/site-packages/service_identity/pyopenssl.py", line 75, in extract_ids
    ids.append(DNSPattern(n.getComponent().asOctets()))
  File "/app/.heroku/python/lib/python3.6/site-packages/service_identity/_common.py", line 156, in __init__
    "Invalid DNS pattern {0!r}.".format(pattern)
service_identity.exceptions.CertificateError: Invalid DNS pattern b'145.232.192.174'.
2018-03-17 09:45:44 [twisted] CRITICAL: Error during info_callback
  [traceback identical to the first "Error during info_callback" traceback above]
2018-03-17 09:45:44 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET https://www.findinfo-tc.vd.ch/justice/findinfo-pub/internet/SimpleSearch.action?showPage=> (failed 1 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern b'145.232.192.174'.>]
Error during info_callback
  [traceback identical to the one above]
2018-03-17 09:45:45 [twisted] CRITICAL: Error during info_callback
  [traceback identical to the one above]
2018-03-17 09:45:45 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET https://www.findinfo-tc.vd.ch/justice/findinfo-pub/internet/SimpleSearch.action?showPage=> (failed 2 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern b'145.232.192.174'.>]
Error during info_callback
  [traceback identical to the one above]
2018-03-17 09:45:46 [twisted] CRITICAL: Error during info_callback
  [traceback identical to the one above]
2018-03-17 09:45:46 [scrapy.downloadermiddlewares.retry] DEBUG: Gave up retrying <GET https://www.findinfo-tc.vd.ch/justice/findinfo-pub/internet/SimpleSearch.action?showPage=> (failed 3 times): [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern b'145.232.192.174'.>]
2018-03-17 09:45:46 [scrapy.core.scraper] ERROR: Error downloading <GET https://www.findinfo-tc.vd.ch/justice/findinfo-pub/internet/SimpleSearch.action?showPage=>: [<twisted.python.failure.Failure service_identity.exceptions.CertificateError: Invalid DNS pattern b'145.232.192.174'.>]
2018-03-17 09:45:46 [scrapy.core.engine] INFO: Closing spider (finished)
2018-03-17 09:45:46 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/exception_count': 3,
 'downloader/exception_type_count/twisted.web._newclient.ResponseNeverReceived': 3,
 'downloader/request_bytes': 834,
 'downloader/request_count': 3,
 'downloader/request_method_count/GET': 3,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2018, 3, 17, 9, 45, 46, 893380),
 'log_count/CRITICAL': 3,
 'log_count/DEBUG': 4,
 'log_count/ERROR': 1,
 'log_count/INFO': 7,
 'memusage/max': 45723648,
 'memusage/startup': 45723648,
 'retry/count': 2,
 'retry/max_reached': 1,
 'retry/reason_count/twisted.web._newclient.ResponseNeverReceived': 2,
 'scheduler/dequeued': 3,
 'scheduler/dequeued/memory': 3,
 'scheduler/enqueued': 3,
 'scheduler/enqueued/memory': 3,
 'start_time': datetime.datetime(2018, 3, 17, 9, 45, 42, 245229)}
2018-03-17 09:45:46 [scrapy.core.engine] INFO: Spider closed (finished)
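The traceback shows why all three fetches of www.findinfo-tc.vd.ch fail: the server's certificate apparently carries the bare IP address 145.232.192.174 inside a dNSName subjectAltName entry, and service-identity 17.0.0 (pinned by the build above) refuses to build a DNSPattern from anything that is not a valid DNS pattern, which aborts Scrapy's hostname verification during the TLS handshake. The gist of that check can be sketched with the standard library alone; `is_valid_dns_pattern` is a hypothetical stand-in for service_identity's internal DNSPattern validation, not its actual code:

```python
import ipaddress

def is_valid_dns_pattern(san_value: str) -> bool:
    """Rough stand-in for service_identity's DNSPattern check: a dNSName
    subjectAltName entry must be a hostname pattern, never a bare IP
    address (IPs belong in an iPAddress SAN entry instead)."""
    try:
        # An IP address stored in a dNSName field is what triggers
        # "Invalid DNS pattern b'145.232.192.174'." in the log above.
        ipaddress.ip_address(san_value)
        return False
    except ValueError:
        return True

print(is_valid_dns_pattern("145.232.192.174"))      # False - the offending SAN entry
print(is_valid_dns_pattern("www.findinfo-tc.vd.ch"))  # True - a normal dNSName
```

The clean fix is on the server side (move the IP into an iPAddress SAN entry); from the scraper side, possible workarounds, untested here, include pinning a newer service-identity release that tolerates such malformed entries, or configuring a custom Scrapy client context factory that relaxes hostname verification for this one host.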

Statistics

Average successful run time: 2 minutes

Total run time: 37 minutes

Total CPU time used: about half a minute

Total disk space used: 81.7 KB

History

  • Manually ran revision 1f8e4d7f and completed successfully.
    nothing changed in the database
  • Manually ran revision 1f8e4d7f and failed.
    nothing changed in the database
  • Manually ran revision 1f8e4d7f and completed successfully.
    nothing changed in the database
  • Manually ran revision a938b23f and completed successfully.
    nothing changed in the database
  • Manually ran revision a938b23f and completed successfully.
    nothing changed in the database
  • ...
  • Created on morph.io


Scraper code

NE_morphio