miasiracusa / chrisguags

ratemyprofessors


This is a scraper that runs on morph.io. To get started, see the documentation.

Contributors miasiracusa

Last run failed with status code 998.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 2 is python-2.7.14 (you are using python-2.7.9, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-2.7.14).
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-2.7.9
-----> Installing pip
-----> Installing requirements with pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
  Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
  Running command git clone -q http://github.com/openaustralia/scraperwiki-python.git /app/.heroku/src/scraperwiki
  Running command git checkout -b morph_defaults --track origin/morph_defaults
  Switched to a new branch 'morph_defaults'
  Branch morph_defaults set up to track remote branch morph_defaults from origin.
Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 8))
  Downloading https://files.pythonhosted.org/packages/63/c7/4f2a2a4ad6c6fa99b14be6b3c1cece9142e2d915aa7c43c908677afc8fa4/lxml-3.4.4.tar.gz (3.5MB)
Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz
Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/15/27/3330a343de80d6849545b6c7723f8c9a08b4b104de964ac366e7e6b318df/dumptruck-0.1.6.tar.gz
Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl (57kB)
Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl (157kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl (150kB)
Collecting idna<2.9,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)
Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
Building wheels for collected packages: lxml, cssselect, dumptruck
  Building wheel for lxml (setup.py): started
  Building wheel for lxml (setup.py): still running...
  Building wheel for lxml (setup.py): finished with status 'done'
  Stored in directory: /tmp/pip-ephem-wheel-cache-2JxxOQ/wheels/f6/df/7b/af9cace9baf95a6e4a2b5790e30da55fc780ddee598314d1ed
  Building wheel for cssselect (setup.py): started
  Building wheel for cssselect (setup.py): finished with status 'done'
  Stored in directory: /tmp/pip-ephem-wheel-cache-2JxxOQ/wheels/45/25/d7/5a3b06d22b1ffb616f868a74729a5a002bcc04d45109b4f223
  Building wheel for dumptruck (setup.py): started
  Building wheel for dumptruck (setup.py): finished with status 'done'
  Stored in directory: /tmp/pip-ephem-wheel-cache-2JxxOQ/wheels/57/df/83/32654ae89119876c7a7db66829bbdb646caa151589dbaf226e
Successfully built lxml cssselect dumptruck
Installing collected packages: dumptruck, certifi, urllib3, idna, chardet, requests, scraperwiki, lxml, cssselect
  Running setup.py develop for scraperwiki
Successfully installed certifi-2019.6.16 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.8 lxml-3.4.4 requests-2.22.0 scraperwiki urllib3-1.25.3
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
Scraper didn't create an SQLite database in your current working directory called data.sqlite. If you've just created your first scraper and not edited the code yet this is to be expected. To fix this make your scraper write to an SQLite database at data.sqlite.
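The build log above warns that python-2.7.9 is unsupported and suggests python-2.7.14. morph.io builds use Heroku-style buildpacks, which read the requested interpreter from a runtime.txt file at the repository root; a sketch of the one-line fix (assuming python-2.7.14 is still the intended target, per the log):

```shell
# Pin the Python runtime the buildpack should install.
# The version string comes from the build log's own recommendation.
echo "python-2.7.14" > runtime.txt
```

Commit runtime.txt alongside scraper.py so the next run picks it up.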
However, this could also be related to an intermittent problem which we're working hard to resolve: https://github.com/openaustralia/morph/issues/1064
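The failure message is morph.io's standard complaint when a run finishes without writing a data.sqlite file to the working directory. The build installs the scraperwiki library (whose scraperwiki.sqlite.save helper is the usual way to do this), but a dependency-free sketch using only the standard-library sqlite3 module shows the minimum required; the table name and columns here are illustrative, not taken from this scraper:

```python
import sqlite3


def save_record(db_path, record):
    """Write one scraped row into an SQLite file (morph.io expects data.sqlite)."""
    conn = sqlite3.connect(db_path)
    # Create the table on first use so repeated runs are idempotent.
    conn.execute("CREATE TABLE IF NOT EXISTS data (name TEXT, rating REAL)")
    conn.execute(
        "INSERT INTO data (name, rating) VALUES (?, ?)",
        (record["name"], record["rating"]),
    )
    conn.commit()
    conn.close()


# Hypothetical row; a real scraper would save what it parsed from the page.
save_record("data.sqlite", {"name": "Example Professor", "rating": 4.5})
```

As long as at least one row lands in data.sqlite before the process exits, this particular error should go away.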

Statistics

Total run time: 4 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 20.3 KB

History

  • Manually ran revision 10cfee20 and failed.
  • Created on morph.io

Scraper code

Python

chrisguags / scraper.py