This is a scraper that runs on morph.io. To get started, see the documentation.

Contributors: mlandauer

Last run failed with status code 998.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Using Python version specified in runtime.txt
 !     Python has released a security update! Please consider upgrading to python-3.7.13
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-3.7.14
-----> Installing pip 22.0.4, setuptools 60.10.0 and wheel 0.37.1
-----> Installing SQLite3
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
       Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
       Running command git clone --filter=blob:none --quiet http://github.com/openaustralia/scraperwiki-python.git /app/.heroku/src/scraperwiki
       warning: redirecting to https://github.com/openaustralia/scraperwiki-python.git/
       Running command git checkout -b morph_defaults --track origin/morph_defaults
       warning: redirecting to https://github.com/openaustralia/scraperwiki-python.git/
       Switched to a new branch 'morph_defaults'
       Branch 'morph_defaults' set up to track remote branch 'morph_defaults' from 'origin'.
       Resolved http://github.com/openaustralia/scraperwiki-python.git to commit 732dda1982a3b2073f6341a6a24f9df1bda77fa0
       Preparing metadata (setup.py): started
       Preparing metadata (setup.py): finished with status 'done'
       Collecting lxml==4.9.1
       Downloading lxml-4.9.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (6.4 MB)
       Collecting cssselect==0.9.1
       Downloading cssselect-0.9.1.tar.gz (32 kB)
       Preparing metadata (setup.py): started
       Preparing metadata (setup.py): finished with status 'done'
       Collecting dumptruck>=0.1.2
       Downloading dumptruck-0.1.6.tar.gz (15 kB)
       Preparing metadata (setup.py): started
       Preparing metadata (setup.py): finished with status 'done'
       Collecting requests
       Downloading requests-2.28.1-py3-none-any.whl (62 kB)
       Collecting certifi>=2017.4.17
       Downloading certifi-2022.9.24-py3-none-any.whl (161 kB)
       Collecting charset-normalizer<3,>=2
       Downloading charset_normalizer-2.1.1-py3-none-any.whl (39 kB)
       Collecting idna<4,>=2.5
       Downloading idna-3.4-py3-none-any.whl (61 kB)
       Collecting urllib3<1.27,>=1.21.1
       Downloading urllib3-1.26.12-py2.py3-none-any.whl (140 kB)
       Building wheels for collected packages: cssselect, dumptruck
       Building wheel for cssselect (setup.py): started
       Building wheel for cssselect (setup.py): finished with status 'done'
       Created wheel for cssselect: filename=cssselect-0.9.1-py3-none-any.whl size=27016 sha256=dd2bf09f4320b58c52953383e966c331a8769c0507579fc09489cfd45cf07216
       Stored in directory: /tmp/pip-ephem-wheel-cache-_qtqay3j/wheels/1c/a2/c8/3536341313e96933c9b4f1c03142fb8452d06b50cfde950353
       Building wheel for dumptruck (setup.py): started
       Building wheel for dumptruck (setup.py): finished with status 'done'
       Created wheel for dumptruck: filename=dumptruck-0.1.6-py3-none-any.whl size=11842 sha256=78e65aa867000d78411f776270b494f6663adfdcbdab33b40c2992af9d7591de
       Stored in directory: /tmp/pip-ephem-wheel-cache-_qtqay3j/wheels/17/63/71/a91825bec93f8a1cd3d294786410c42410fccf8365815de75f
       Successfully built cssselect dumptruck
       Installing collected packages: dumptruck, cssselect, urllib3, lxml, idna, charset-normalizer, certifi, requests, scraperwiki
       Running setup.py develop for scraperwiki
       Successfully installed certifi-2022.9.24 charset-normalizer-2.1.1 cssselect-0.9.1 dumptruck-0.1.6 idna-3.4 lxml-4.9.1 requests-2.28.1 scraperwiki-0.3.7 urllib3-1.26.12
-----> Discovering process types
       Procfile declares types -> scraper

Injecting scraper and running...

Scraper didn't create an SQLite database in your current working directory called data.sqlite. If you've just created your first scraper and not edited the code yet this is to be expected. To fix this make your scraper write to an SQLite database at data.sqlite.
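The failure at the end of the log means the scraper exited without writing the data.sqlite file that morph.io looks for in the working directory. A minimal sketch of a fix, using only Python's standard-library sqlite3 module (the table name `data` follows morph.io convention; the columns and the placeholder row are illustrative):

```python
import sqlite3

# morph.io expects the scraper to leave an SQLite database called
# data.sqlite in the current working directory, conventionally with
# a table named "data" holding the scraped records.
con = sqlite3.connect("data.sqlite")
con.execute(
    "CREATE TABLE IF NOT EXISTS data (name TEXT PRIMARY KEY, value TEXT)"
)
# Placeholder row; a real scraper would insert the records it scraped.
con.execute(
    "INSERT OR REPLACE INTO data (name, value) VALUES (?, ?)",
    ("example", "hello"),
)
con.commit()
con.close()
```

With the scraperwiki library pinned in requirements.txt (the morph_defaults branch cloned above), the usual equivalent is `scraperwiki.sqlite.save(unique_keys=["name"], data={"name": "example", "value": "hello"})`, which on that branch should write to data.sqlite by default.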

Statistics

Total run time: 2 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 12.6 KB

History

  • Manually ran revision 91f3ff2c and failed.
  • Manually ran revision 7acda2ba and failed.
  • Manually ran revision 750972e8 and failed.
  • Manually ran and failed.
  • Manually ran revision 750972e8 and failed.
  • ...
  • Created on morph.io


Scraper code

Python

test11 / scraper.py