Contributors: harrisonweber

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.9
       $ pip install -r requirements.txt
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 8))
         Downloading lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 9))
         Downloading cssselect-0.9.1.tar.gz
       Collecting requests==2.18.4 (from -r /tmp/build/requirements.txt (line 10))
         Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 11))
         Downloading beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading dumptruck-0.1.6.tar.gz
       Collecting idna<2.7,>=2.5 (from requests==2.18.4->-r /tmp/build/requirements.txt (line 10))
         Downloading idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests==2.18.4->-r /tmp/build/requirements.txt (line 10))
         Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting certifi>=2017.4.17 (from requests==2.18.4->-r /tmp/build/requirements.txt (line 10))
         Downloading certifi-2017.7.27.1-py2.py3-none-any.whl (349kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests==2.18.4->-r /tmp/build/requirements.txt (line 10))
         Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, lxml, cssselect, beautifulsoup4
         Running setup.py install for dumptruck: started
         Running setup.py install for dumptruck: finished with status 'done'
         Running setup.py develop for scraperwiki
         Running setup.py install for lxml: started
         Running setup.py install for lxml: still running...
         Running setup.py install for lxml: finished with status 'done'
         Running setup.py install for cssselect: started
         Running setup.py install for cssselect: finished with status 'done'
       Successfully installed beautifulsoup4-4.6.0 certifi-2017.7.27.1 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.6 lxml-3.4.4 requests-2.18.4 scraperwiki urllib3-1.22
-----> Discovering process types
       Procfile declares types -> scraper

Injecting scraper and running...
Traceback (most recent call last):
  File "scraper.py", line 35, in <module>
    source_urls_list = read_file('/Users/haweber/Desktop/source_urls.csv')
  File "scraper.py", line 31, in read_file
    with open(file_path) as csvFile:
IOError: [Errno 2] No such file or directory: '/Users/haweber/Desktop/source_urls.csv'
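The traceback shows why the run failed: scraper.py opens source_urls.csv via an absolute path on the developer's laptop (/Users/haweber/Desktop/...), which does not exist on morph.io's servers, where the scraper runs from the repository root. A minimal sketch of a fix is to commit the CSV alongside scraper.py and read it by a path resolved relative to the script. The read_file name comes from the traceback; the CSV layout (one URL per row) and the demo file contents are assumptions:

```python
import csv
import os
import tempfile

def read_file(file_path):
    # Return the first column of each non-empty row in a CSV file.
    with open(file_path) as csv_file:
        return [row[0] for row in csv.reader(csv_file) if row]

# On morph.io, resolve the CSV relative to the script instead of a
# hardcoded local path (hypothetical layout; adjust as needed):
#   base_dir = os.path.dirname(os.path.abspath(__file__))
#   source_urls_list = read_file(os.path.join(base_dir, 'source_urls.csv'))

# Demonstration with a temporary file standing in for source_urls.csv.
with tempfile.NamedTemporaryFile('w', suffix='.csv', delete=False) as f:
    f.write('http://example.com/a\nhttp://example.com/b\n')
    tmp_path = f.name

urls = read_file(tmp_path)
print(urls)  # ['http://example.com/a', 'http://example.com/b']
os.remove(tmp_path)
```

The key design point is that the scraper should never depend on files outside its own repository, since the checkout directory is the only filesystem state morph.io guarantees between runs.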

Statistics

Average successful run time: 2 minutes

Total run time: 3 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 22.3 KB

History

  • Manually ran revision b1f113a3 and failed.
    Nothing changed in the database.
  • Manually ran revision ed3ce684 and completed successfully.
    Nothing changed in the database.
  • Created on morph.io