Contributors

andela-ookoro, Gathondu, DavidLemayian, andela-mabdussalam, RyanSept, tinamorale, celelstine, andela-mmakinde

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-3.6.2
$ pip install -r requirements.txt
Collecting appdirs==1.4.3 (from -r /tmp/build/requirements.txt (line 1))
  Downloading appdirs-1.4.3-py2.py3-none-any.whl
Collecting backports.ssl-match-hostname==3.5.0.1 (from -r /tmp/build/requirements.txt (line 2))
  Downloading backports.ssl_match_hostname-3.5.0.1.tar.gz
Collecting beautifulsoup4==4.5.3 (from -r /tmp/build/requirements.txt (line 3))
  Downloading beautifulsoup4-4.5.3-py3-none-any.whl (85kB)
Collecting boto3==1.4.4 (from -r /tmp/build/requirements.txt (line 4))
  Downloading boto3-1.4.4-py2.py3-none-any.whl (127kB)
Collecting botocore==1.5.27 (from -r /tmp/build/requirements.txt (line 5))
  Downloading botocore-1.5.27-py2.py3-none-any.whl (3.4MB)
Collecting bs4==0.0.1 (from -r /tmp/build/requirements.txt (line 6))
  Downloading bs4-0.0.1.tar.gz
Collecting certifi==2017.4.17 (from -r /tmp/build/requirements.txt (line 7))
  Downloading certifi-2017.4.17-py2.py3-none-any.whl (375kB)
Collecting click==6.7 (from -r /tmp/build/requirements.txt (line 8))
  Downloading click-6.7-py2.py3-none-any.whl (71kB)
Collecting docutils==0.13.1 (from -r /tmp/build/requirements.txt (line 9))
  Downloading docutils-0.13.1-py3-none-any.whl (536kB)
Collecting dumptruck==0.1.6 (from -r /tmp/build/requirements.txt (line 10))
  Downloading dumptruck-0.1.6.tar.gz
Collecting elasticsearch==5.4.0 (from -r /tmp/build/requirements.txt (line 11))
  Downloading elasticsearch-5.4.0-py2.py3-none-any.whl (58kB)
Collecting Flask==0.12.1 (from -r /tmp/build/requirements.txt (line 12))
  Downloading Flask-0.12.1-py2.py3-none-any.whl (82kB)
Collecting futures==3.0.5 (from -r /tmp/build/requirements.txt (line 13))
  Downloading futures-3.0.5.tar.gz
Collecting gunicorn==19.7.1 (from -r /tmp/build/requirements.txt (line 14))
  Downloading gunicorn-19.7.1-py2.py3-none-any.whl (111kB)
Collecting itsdangerous==0.24 (from -r /tmp/build/requirements.txt (line 15))
  Downloading itsdangerous-0.24.tar.gz (46kB)
Collecting Jinja2==2.9.6 (from -r /tmp/build/requirements.txt (line 16))
  Downloading Jinja2-2.9.6-py2.py3-none-any.whl (340kB)
Collecting jmespath==0.9.2 (from -r /tmp/build/requirements.txt (line 17))
  Downloading jmespath-0.9.2-py2.py3-none-any.whl
Collecting MarkupSafe==1.0 (from -r /tmp/build/requirements.txt (line 18))
  Downloading MarkupSafe-1.0.tar.gz
Collecting nose==1.3.7 (from -r /tmp/build/requirements.txt (line 19))
  Downloading nose-1.3.7-py3-none-any.whl (154kB)
Collecting packaging==16.8 (from -r /tmp/build/requirements.txt (line 20))
  Downloading packaging-16.8-py2.py3-none-any.whl
Collecting pyparsing==2.2.0 (from -r /tmp/build/requirements.txt (line 21))
  Downloading pyparsing-2.2.0-py2.py3-none-any.whl (56kB)
Collecting python-dateutil==2.6.0 (from -r /tmp/build/requirements.txt (line 22))
  Downloading python_dateutil-2.6.0-py2.py3-none-any.whl (194kB)
Collecting python3-memcached==1.51 (from -r /tmp/build/requirements.txt (line 23))
  Downloading python3-memcached-1.51.tar.gz
Collecting requests==2.13.0 (from -r /tmp/build/requirements.txt (line 24))
  Downloading requests-2.13.0-py2.py3-none-any.whl (584kB)
Collecting requests-aws4auth==0.9 (from -r /tmp/build/requirements.txt (line 25))
  Downloading requests_aws4auth-0.9-py2.py3-none-any.whl (54kB)
Collecting s3transfer==0.1.10 (from -r /tmp/build/requirements.txt (line 26))
  Downloading s3transfer-0.1.10-py2.py3-none-any.whl (54kB)
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@732dda1982a3b2073f6341a6a24f9df1bda77fa0#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 27))
  Cloning http://github.com/openaustralia/scraperwiki-python.git (to 732dda1982a3b2073f6341a6a24f9df1bda77fa0) to /app/.heroku/src/scraperwiki
  Could not find a tag or branch '732dda1982a3b2073f6341a6a24f9df1bda77fa0', assuming commit.
Collecting six==1.10.0 (from -r /tmp/build/requirements.txt (line 28))
  Downloading six-1.10.0-py2.py3-none-any.whl
Collecting slack-logger==0.2.0 (from -r /tmp/build/requirements.txt (line 29))
  Downloading slack_logger-0.2.0-py3-none-any.whl
Collecting slackclient==1.0.6 (from -r /tmp/build/requirements.txt (line 30))
  Downloading slackclient-1.0.6.tar.gz
Collecting slacker==0.9.42 (from -r /tmp/build/requirements.txt (line 31))
  Downloading slacker-0.9.42.tar.gz
Collecting termcolor==1.1.0 (from -r /tmp/build/requirements.txt (line 32))
  Downloading termcolor-1.1.0.tar.gz
Collecting urllib3==1.21.1 (from -r /tmp/build/requirements.txt (line 33))
  Downloading urllib3-1.21.1-py2.py3-none-any.whl (131kB)
Collecting websocket-client==0.40.0 (from -r /tmp/build/requirements.txt (line 34))
  Downloading websocket_client-0.40.0.tar.gz (196kB)
Collecting Werkzeug==0.12.2 (from -r /tmp/build/requirements.txt (line 35))
  Downloading Werkzeug-0.12.2-py2.py3-none-any.whl (312kB)
Installing collected packages: appdirs, backports.ssl-match-hostname, beautifulsoup4, six, python-dateutil, jmespath, docutils, botocore, s3transfer, boto3, bs4, certifi, click, dumptruck, urllib3, elasticsearch, MarkupSafe, Jinja2, Werkzeug, itsdangerous, Flask, futures, gunicorn, nose, pyparsing, packaging, python3-memcached, requests, requests-aws4auth, scraperwiki, slack-logger, websocket-client, slackclient, slacker, termcolor
  Running setup.py install for backports.ssl-match-hostname: started
  Running setup.py install for backports.ssl-match-hostname: finished with status 'done'
  Running setup.py install for bs4: started
  Running setup.py install for bs4: finished with status 'done'
  Running setup.py install for dumptruck: started
  Running setup.py install for dumptruck: finished with status 'done'
  Running setup.py install for MarkupSafe: started
  Running setup.py install for MarkupSafe: finished with status 'done'
  Running setup.py install for itsdangerous: started
  Running setup.py install for itsdangerous: finished with status 'done'
  Running setup.py install for futures: started
  Running setup.py install for futures: finished with status 'done'
  Running setup.py install for python3-memcached: started
  Running setup.py install for python3-memcached: finished with status 'done'
  Running setup.py develop for scraperwiki
  Running setup.py install for websocket-client: started
  Running setup.py install for websocket-client: finished with status 'done'
  Running setup.py install for slackclient: started
  Running setup.py install for slackclient: finished with status 'done'
  Running setup.py install for slacker: started
  Running setup.py install for slacker: finished with status 'done'
  Running setup.py install for termcolor: started
  Running setup.py install for termcolor: finished with status 'done'
Successfully installed Flask-0.12.1 Jinja2-2.9.6 MarkupSafe-1.0 Werkzeug-0.12.2 appdirs-1.4.3 backports.ssl-match-hostname-3.5.0.1 beautifulsoup4-4.5.3 boto3-1.4.4 botocore-1.5.27 bs4-0.0.1 certifi-2017.4.17 click-6.7 docutils-0.13.1 dumptruck-0.1.6 elasticsearch-5.4.0 futures-3.0.5 gunicorn-19.7.1 itsdangerous-0.24 jmespath-0.9.2 nose-1.3.7 packaging-16.8 pyparsing-2.2.0 python-dateutil-2.6.0 python3-memcached-1.51 requests-2.13.0 requests-aws4auth-0.9 s3transfer-0.1.10 scraperwiki six-1.10.0 slack-logger-0.2.0 slackclient-1.0.6 slacker-0.9.42 termcolor-1.1.0 urllib3-1.21.1 websocket-client-0.40.0
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
[Doctors Scraper] Started Scraper.
[Doctors Scraper] Started Scraper.
[Doctors Scraper] Started Scraper.
[Doctors Scraper] Started Scraper.
Archived: Data has been updated.
POST https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev/doctors/_delete_by_query?_source=true [status:200 request:1.162s]
HEAD https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev [status:200 request:0.206s]
POST https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev/_bulk?refresh=true [status:200 request:5.018s]
Elasticsearch: Index successful.
[2017-11-22 11:52:56] Scraper completed. 7778 documents retrieved.
[Foreign Doctors Scraper] Started Scraper.
Archived: Data has been updated.
POST https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev/doctors/_delete_by_query?_source=true [status:200 request:1.037s]
HEAD https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev [status:200 request:0.155s]
POST https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev/_bulk?refresh=true [status:200 request:1.898s]
Elasticsearch: Index successful.
[2017-11-22 11:53:35] Scraper completed. 1707 documents retrieved.
[Clinical Officers Scraper] Started Scraper.
Scraper: 11650 is running for more than 10 minutes
Archived: Data has been updated.
POST https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev/clinical-officers/_delete_by_query?_source=true [status:200 request:0.697s]
HEAD https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev [status:200 request:0.195s]
POST https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev/_bulk?refresh=true [status:200 request:9.542s]
Elasticsearch: Index successful.
[2017-11-22 11:59:45] Scraper completed. 12906 documents retrieved.
[Nhif Inpatient Scraper] Started Scraper.
- ERROR: NHIF Inpatient: set_site_pages_no()
- SOURCE: url: http://www.nhif.or.ke/healthinsurance/inpatientServices
- MESSAGE: HTTPConnectionPool(host='www.nhif.or.ke', port=80): Max retries exceeded with url: /healthinsurance/inpatientServices (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f09239da588>: Failed to establish a new connection: [Errno -2] Name or service not known',))
- ERROR: scrape_site()
- SOURCE: http://www.nhif.or.ke/healthinsurance/inpatientServices
- MESSAGE: No pages found.
[2017-11-22 11:59:56] Scraper completed. 0 documents retrieved.
[Nhif Outpatient Scraper] Started Scraper.
Archived: Data has been updated.
POST https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev/nhif-outpatient/_delete_by_query?_source=true [status:200 request:0.673s]
HEAD https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev [status:200 request:0.153s]
POST https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev/_bulk?refresh=true [status:200 request:1.520s]
Elasticsearch: Index successful.
[2017-11-22 12:00:40] Scraper completed. 1499 documents retrieved.
[Nhif Outpatient Cs Scraper] Started Scraper.
Archived: Data has been updated.
POST https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev/nhif-outpatient-cs/_delete_by_query?_source=true [status:200 request:0.666s]
HEAD https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev [status:200 request:0.155s]
POST https://search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com:443/cele-dev/_bulk?refresh=true [status:200 request:1.664s]
Elasticsearch: Index successful.
[2017-11-22 12:01:15] Scraper completed. 1736 documents retrieved.
- ERROR: archive_data()
- SOURCE: /app/data/stats.json
- MESSAGE: [Errno 2] No such file or directory: '/app/data/stats/stats-20171122.json'
Scraper: 11650 ran for about 0hr:12min:16sec
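Each successful scraper in the log above follows the same Elasticsearch pattern: delete the old documents for its doc type, check that the cele-dev index exists, then bulk-index the fresh documents with refresh=true. The sketch below is a minimal illustration of that pattern, not the project's actual code; it assumes the pinned elasticsearch==5.4.0 client with requests-aws4auth for AWS request signing. The host, index and doc type are taken from the log, while the credentials, the reindex() helper and the sample documents are placeholders.

import os

from elasticsearch import Elasticsearch, RequestsHttpConnection, helpers
from requests_aws4auth import AWS4Auth

ES_HOST = "search-cfa-htools-fnqfgsmzlye2kdtijchxm5wbcu.eu-west-1.es.amazonaws.com"
INDEX = "cele-dev"
DOC_TYPE = "doctors"

# AWS request signing, as suggested by requests-aws4auth in requirements.txt.
awsauth = AWS4Auth(os.environ["AWS_ACCESS_KEY_ID"],
                   os.environ["AWS_SECRET_ACCESS_KEY"],
                   "eu-west-1", "es")

es = Elasticsearch(hosts=[{"host": ES_HOST, "port": 443}],
                   http_auth=awsauth, use_ssl=True, verify_certs=True,
                   connection_class=RequestsHttpConnection)

def reindex(docs):
    # POST /cele-dev/doctors/_delete_by_query - drop the previous batch.
    es.delete_by_query(index=INDEX, doc_type=DOC_TYPE,
                       body={"query": {"match_all": {}}})
    # HEAD /cele-dev - confirm the index exists before bulk indexing.
    if not es.indices.exists(index=INDEX):
        es.indices.create(index=INDEX)
    # POST /cele-dev/_bulk?refresh=true - index the freshly scraped documents.
    actions = ({"_index": INDEX, "_type": DOC_TYPE, "_source": doc} for doc in docs)
    helpers.bulk(es, actions, refresh=True)

reindex([{"name": "Jane Doe", "reg_no": "A1234"}])  # placeholder documents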
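The archive_data() error at the end of the run reports [Errno 2] for /app/data/stats/stats-20171122.json, which is what open() raises when the stats directory itself is missing. A hypothetical guard is sketched below; the function name archive_stats() and the exact file layout are assumptions, not the scraper's code, and only the path and date format follow the log.

import json
import os
from datetime import datetime

STATS_DIR = "/app/data/stats"  # directory implied by the error message

def archive_stats(stats):
    # Create the directory first; without this, open() raises the
    # "[Errno 2] No such file or directory" error seen in the log.
    os.makedirs(STATS_DIR, exist_ok=True)
    filename = "stats-{}.json".format(datetime.utcnow().strftime("%Y%m%d"))
    with open(os.path.join(STATS_DIR, filename), "w") as stats_file:
        json.dump(stats, stats_file)

archive_stats({"doctors": 7778, "clinical-officers": 12906})  # counts from this run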
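The NHIF Inpatient failure is a name-resolution error raised by requests before any page could be fetched. One common mitigation, sketched here as an assumed approach (not the scraper's actual code) using the pinned requests==2.13.0 and its bundled urllib3, is to retry the fetch with backoff; an outage that persists still fails after the retries and is logged with 0 documents retrieved, as in this run.

import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.retry import Retry

NHIF_URL = "http://www.nhif.or.ke/healthinsurance/inpatientServices"

session = requests.Session()
# Retry connection failures and common 5xx responses with exponential backoff.
retries = Retry(total=5, backoff_factor=2, status_forcelist=(500, 502, 503, 504))
session.mount("http://", HTTPAdapter(max_retries=retries))

try:
    response = session.get(NHIF_URL, timeout=30)
    response.raise_for_status()
except requests.exceptions.RequestException as err:
    # A persistent DNS outage still surfaces here after retries are exhausted;
    # the scraper logs the error and reports 0 documents.
    print("NHIF fetch failed: {}".format(err))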

Statistics

Average successful run time: 18 minutes

Total run time: about 5 hours

Total CPU time used: 12 minutes

Total disk space used: 786 KB

History

  • Manually ran revision d5febf26 and completed successfully.
    nothing changed in the database
  • Manually ran revision 142988a5 and failed.
    nothing changed in the database
  • Manually ran revision e5dc502b and failed.
    nothing changed in the database
  • Manually ran revision 9a979cb1 and failed.
    nothing changed in the database
  • Manually ran revision 9a979cb1 and failed.
    nothing changed in the database
  • ...
  • Created on morph.io


Scraper code

healthtool_cele