Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.9
       $ pip install -r requirements.txt
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to ./.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r requirements.txt (line 8))
         Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r requirements.txt (line 9))
         Downloading cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4==4.4.1 (from -r requirements.txt (line 10))
         Downloading beautifulsoup4-4.4.1-py2-none-any.whl (81kB)
       Collecting decorator==4.0.6 (from -r requirements.txt (line 11))
         Downloading decorator-4.0.6-py2.py3-none-any.whl
       Collecting nltk==3.0.5 (from -r requirements.txt (line 12))
         Downloading nltk-3.0.5.zip (1.2MB)
       Collecting oauthlib==1.0.3 (from -r requirements.txt (line 13))
         Downloading oauthlib-1.0.3.tar.gz (109kB)
       Collecting praw==3.3.0 (from -r requirements.txt (line 14))
         Downloading praw-3.3.0-py2.py3-none-any.whl (69kB)
       Collecting psycopg2==2.6.1 (from -r requirements.txt (line 15))
         Downloading psycopg2-2.6.1.tar.gz (371kB)
       Collecting python-twitter==2.2 (from -r requirements.txt (line 16))
         Downloading python_twitter-2.2-py2-none-any.whl (60kB)
       Collecting requests==2.8.0 (from -r requirements.txt (line 17))
         Downloading requests-2.8.0-py2.py3-none-any.whl (476kB)
       Collecting requests-oauthlib==0.5.0 (from -r requirements.txt (line 18))
         Downloading requests_oauthlib-0.5.0-py2.py3-none-any.whl
       Collecting schedule==0.3.2 (from -r requirements.txt (line 19))
         Downloading schedule-0.3.2.tar.gz
       Collecting six==1.10.0 (from -r requirements.txt (line 20))
         Downloading six-1.10.0-py2.py3-none-any.whl
       Collecting textblob==0.10.0 (from -r requirements.txt (line 21))
         Downloading textblob-0.10.0-py2.py3-none-any.whl (633kB)
       Collecting update-checker==0.11 (from -r requirements.txt (line 22))
         Downloading update_checker-0.11-py2.py3-none-any.whl
       Collecting virtualenv==14.0.1 (from -r requirements.txt (line 23))
         Downloading virtualenv-14.0.1-py2.py3-none-any.whl (1.8MB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 6))
         Downloading dumptruck-0.1.6.tar.gz
       Installing collected packages: dumptruck, requests, scraperwiki, lxml, cssselect, beautifulsoup4, decorator, six, nltk, oauthlib, update-checker, praw, psycopg2, requests-oauthlib, python-twitter, schedule, textblob, virtualenv
         Running setup.py install for dumptruck: started
         Running setup.py install for dumptruck: finished with status 'done'
         Running setup.py develop for scraperwiki
         Running setup.py install for lxml: started
         Running setup.py install for lxml: still running...
         Running setup.py install for lxml: finished with status 'done'
         Running setup.py install for cssselect: started
         Running setup.py install for cssselect: finished with status 'done'
         Running setup.py install for nltk: started
         Running setup.py install for nltk: finished with status 'done'
         Running setup.py install for oauthlib: started
         Running setup.py install for oauthlib: finished with status 'done'
         Running setup.py install for psycopg2: started
         Running setup.py install for psycopg2: finished with status 'done'
         Running setup.py install for schedule: started
         Running setup.py install for schedule: finished with status 'done'
       Successfully installed beautifulsoup4-4.4.1 cssselect-0.9.1 decorator-4.0.6 dumptruck-0.1.6 lxml-3.4.4 nltk-3.0.5 oauthlib-1.0.3 praw-3.3.0 psycopg2-2.6.1 python-twitter-2.2 requests-2.8.0 requests-oauthlib-0.5.0 schedule-0.3.2 scraperwiki six-1.10.0 textblob-0.10.0 update-checker-0.11 virtualenv-14.0.1
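
For reference, the pinned versions and the "(line N)" annotations in the pip output above imply a requirements.txt roughly like the following. Lines 1-5 and 7 of the file never appear in the log, so this is a partial reconstruction, not the actual file; the -e (editable) flag is inferred from pip's "Obtaining" wording and the "Running setup.py develop for scraperwiki" step.

    # lines 1-5 and 7 of the real file are not visible in this log
    -e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
    lxml==3.4.4
    cssselect==0.9.1
    beautifulsoup4==4.4.1
    decorator==4.0.6
    nltk==3.0.5
    oauthlib==1.0.3
    praw==3.3.0
    psycopg2==2.6.1
    python-twitter==2.2
    requests==2.8.0
    requests-oauthlib==0.5.0
    schedule==0.3.2
    six==1.10.0
    textblob==0.10.0
    update-checker==0.11
    virtualenv==14.0.1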

-----> Discovering process types
       Procfile declares types -> scraper
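
The Procfile itself is not shown in the log. For a Python scraper declaring a single "scraper" process type, it would typically be a one-line file along these lines (the exact command is an assumption, not taken from the log):

    scraper: python scraper.py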
Injecting scraper and running...
Traceback (most recent call last):
  File "scraper.py", line 11, in <module>
    url = urlparse.urlparse(os.environ["MORPH_DATABASE_URL"])
  File "/app/.heroku/python/lib/python2.7/UserDict.py", line 23, in __getitem__
    raise KeyError(key)
KeyError: 'MORPH_DATABASE_URL'
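
The KeyError means line 11 of scraper.py reads MORPH_DATABASE_URL unconditionally and the variable is not set in this run's environment; on morph.io such MORPH_-prefixed values are normally supplied through the scraper's secret environment variable settings. A minimal Python 2 sketch of a friendlier lookup, assuming the scraper only needs to stop with a clear message when the variable is missing (the message wording and exit behaviour are choices of this sketch, not part of the original scraper):

    import os
    import urlparse

    # os.environ.get returns None instead of raising KeyError when the
    # variable is absent, so we can explain the problem before exiting.
    database_url = os.environ.get("MORPH_DATABASE_URL")
    if database_url is None:
        raise SystemExit("MORPH_DATABASE_URL is not set; add it to the "
                         "scraper's environment variables before running.")
    url = urlparse.urlparse(database_url)

With this change a run outside morph.io (or one where the variable was never configured) fails immediately with an actionable message instead of the bare KeyError above.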