sk560 / csgo_hltv_scraperv01

CSGO stats parsed from HLTV


Contributors: sk560

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.9
$ pip install -r requirements.txt
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r requirements.txt (line 6))
  Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to ./.heroku/src/scraperwiki
Collecting lxml==3.4.4 (from -r requirements.txt (line 8))
  Downloading lxml-3.4.4.tar.gz (3.5MB)
Collecting cssselect==0.9.1 (from -r requirements.txt (line 9))
  Downloading cssselect-0.9.1.tar.gz
Collecting beautifulsoup4==4.4.1 (from -r requirements.txt (line 10))
  Downloading beautifulsoup4-4.4.1-py2-none-any.whl (81kB)
Collecting decorator==4.0.6 (from -r requirements.txt (line 11))
  Downloading decorator-4.0.6-py2.py3-none-any.whl
Collecting nltk==3.0.5 (from -r requirements.txt (line 12))
  Downloading nltk-3.0.5.zip (1.2MB)
Collecting oauthlib==1.0.3 (from -r requirements.txt (line 13))
  Downloading oauthlib-1.0.3.tar.gz (109kB)
Collecting praw==3.3.0 (from -r requirements.txt (line 14))
  Downloading praw-3.3.0-py2.py3-none-any.whl (69kB)
Collecting psycopg2==2.6.1 (from -r requirements.txt (line 15))
  Downloading psycopg2-2.6.1.tar.gz (371kB)
Collecting python-twitter==2.2 (from -r requirements.txt (line 16))
  Downloading python_twitter-2.2-py2-none-any.whl (60kB)
Collecting requests==2.8.0 (from -r requirements.txt (line 17))
  Downloading requests-2.8.0-py2.py3-none-any.whl (476kB)
Collecting requests-oauthlib==0.5.0 (from -r requirements.txt (line 18))
  Downloading requests_oauthlib-0.5.0-py2.py3-none-any.whl
Collecting schedule==0.3.2 (from -r requirements.txt (line 19))
  Downloading schedule-0.3.2.tar.gz
Collecting six==1.10.0 (from -r requirements.txt (line 20))
  Downloading six-1.10.0-py2.py3-none-any.whl
Collecting textblob==0.10.0 (from -r requirements.txt (line 21))
  Downloading textblob-0.10.0-py2.py3-none-any.whl (633kB)
Collecting update-checker==0.11 (from -r requirements.txt (line 22))
  Downloading update_checker-0.11-py2.py3-none-any.whl
Collecting virtualenv==14.0.1 (from -r requirements.txt (line 23))
  Downloading virtualenv-14.0.1-py2.py3-none-any.whl (1.8MB)
Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 6))
  Downloading dumptruck-0.1.6.tar.gz
Installing collected packages: dumptruck, requests, scraperwiki, lxml, cssselect, beautifulsoup4, decorator, six, nltk, oauthlib, update-checker, praw, psycopg2, requests-oauthlib, python-twitter, schedule, textblob, virtualenv
  Running setup.py install for dumptruck: started
  Running setup.py install for dumptruck: finished with status 'done'
  Running setup.py develop for scraperwiki
  Running setup.py install for lxml: started
  Running setup.py install for lxml: still running...
  Running setup.py install for lxml: finished with status 'done'
  Running setup.py install for cssselect: started
  Running setup.py install for cssselect: finished with status 'done'
  Running setup.py install for nltk: started
  Running setup.py install for nltk: finished with status 'done'
  Running setup.py install for oauthlib: started
  Running setup.py install for oauthlib: finished with status 'done'
  Running setup.py install for psycopg2: started
  Running setup.py install for psycopg2: finished with status 'done'
  Running setup.py install for schedule: started
  Running setup.py install for schedule: finished with status 'done'
Successfully installed beautifulsoup4-4.4.1 cssselect-0.9.1 decorator-4.0.6 dumptruck-0.1.6 lxml-3.4.4 nltk-3.0.5 oauthlib-1.0.3 praw-3.3.0 psycopg2-2.6.1 python-twitter-2.2 requests-2.8.0 requests-oauthlib-0.5.0 schedule-0.3.2 scraperwiki six-1.10.0 textblob-0.10.0 update-checker-0.11 virtualenv-14.0.1
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
Traceback (most recent call last):
  File "scraper.py", line 11, in <module>
    url = urlparse.urlparse(os.environ["MORPH_DATABASE_URL"])
  File "/app/.heroku/python/lib/python2.7/UserDict.py", line 23, in __getitem__
    raise KeyError(key)
KeyError: 'MORPH_DATABASE_URL'
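The crash happens because scraper.py reads os.environ["MORPH_DATABASE_URL"] unconditionally, which raises KeyError whenever morph.io has not injected that variable into the run environment. A minimal sketch of a safer lookup, assuming a local SQLite fallback is acceptable (the helper name and the default URL are hypothetical, and urllib.parse is the Python 3 spelling of the scraper's Python 2 urlparse module):

```python
import os
from urllib.parse import urlparse  # Python 2 equivalent: import urlparse


def get_database_url(default="sqlite:///data.sqlite"):
    """Parse MORPH_DATABASE_URL, falling back to a local SQLite URL.

    os.environ.get() returns the default instead of raising KeyError,
    so the scraper can still run when morph.io does not provide the
    variable (the exact failure shown in the traceback above).
    """
    raw = os.environ.get("MORPH_DATABASE_URL", default)
    return urlparse(raw)


if __name__ == "__main__":
    url = get_database_url()
    print(url.scheme, url.hostname)
```

Alternatively, the scraper could fail fast with a clear message ("set MORPH_DATABASE_URL in the scraper's settings") instead of a bare KeyError, which is easier to diagnose from the morph.io console.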

Statistics

Average successful run time: less than 5 seconds

Total run time: 2 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 39.5 KB

History

  • Manually ran revision 39d34689 and failed.
    nothing changed in the database
  • Manually ran revision 4413539a and completed successfully.
    nothing changed in the database
  • Created on morph.io

Scraper code

csgo_hltv_scraperv01