This is a scraper that runs on Morph. To get started, see the documentation.

Contributors: slow-mo

Last run failed with status code 998.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-3.6.2
-----> Installing pip
-----> Installing requirements with pip
       Collecting scraperwiki==0.5.1
         Downloading scraperwiki-0.5.1.tar.gz (7.7 kB)
         Preparing metadata (setup.py): started
         Preparing metadata (setup.py): finished with status 'done'
       Collecting requests
         Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB)
       Collecting six
         Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
       Collecting sqlalchemy
         Downloading SQLAlchemy-1.4.35-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
       Collecting alembic
         Downloading alembic-1.7.7-py3-none-any.whl (210 kB)
       Collecting importlib-resources
         Downloading importlib_resources-5.4.0-py3-none-any.whl (28 kB)
       Collecting importlib-metadata
         Downloading importlib_metadata-4.8.3-py3-none-any.whl (17 kB)
       Collecting Mako
         Downloading Mako-1.1.6-py2.py3-none-any.whl (75 kB)
       Collecting greenlet!=0.4.17
         Downloading greenlet-1.1.2-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (147 kB)
       Collecting idna<4,>=2.5
         Downloading idna-3.3-py3-none-any.whl (61 kB)
       Collecting urllib3<1.27,>=1.21.1
         Downloading urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
       Collecting charset-normalizer~=2.0.0
         Downloading charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
       Collecting certifi>=2017.4.17
         Downloading certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
       Collecting zipp>=0.5
         Downloading zipp-3.6.0-py3-none-any.whl (5.3 kB)
       Collecting typing-extensions>=3.6.4
         Downloading typing_extensions-4.1.1-py3-none-any.whl (26 kB)
       Collecting MarkupSafe>=0.9.2
         Downloading MarkupSafe-2.0.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (30 kB)
       Building wheels for collected packages: scraperwiki
         Building wheel for scraperwiki (setup.py): started
         Building wheel for scraperwiki (setup.py): finished with status 'done'
         Created wheel for scraperwiki: filename=scraperwiki-0.5.1-py3-none-any.whl size=6545 sha256=65c90f06409dc92506f9d6edf19dfcfba2da004b30b9c97eae92d95bb842b662
         Stored in directory: /tmp/pip-ephem-wheel-cache-2cd5zvd8/wheels/cd/f8/ac/cd66eb1c557ab40d35c1ed852da3e9b37baa3e21b61906a5cf
       Successfully built scraperwiki
       Installing collected packages: zipp, typing-extensions, MarkupSafe, importlib-metadata, greenlet, urllib3, sqlalchemy, Mako, importlib-resources, idna, charset-normalizer, certifi, six, requests, alembic, scraperwiki
       Successfully installed Mako-1.1.6 MarkupSafe-2.0.1 alembic-1.7.7 certifi-2021.10.8 charset-normalizer-2.0.12 greenlet-1.1.2 idna-3.3 importlib-metadata-4.8.3 importlib-resources-5.4.0 requests-2.27.1 scraperwiki-0.5.1 six-1.16.0 sqlalchemy-1.4.35 typing-extensions-4.1.1 urllib3-1.26.9 zipp-3.6.0
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
/app/.heroku/python/lib/python3.6/site-packages/scraperwiki/sql.py:75: SAWarning: SQLite version (3, 7, 9) is older than 3.7.16, and will not support right nested joins, as are sometimes used in more complex ORM scenarios. SQLAlchemy 1.4 and above no longer tries to rewrite these joins.
  connect_args={'timeout': DATABASE_TIMEOUT})
Scraper didn't create an SQLite database in your current working directory called data.sqlite. If you've just created your first scraper and not edited the code yet this is to be expected. To fix this make your scraper write to an SQLite database at data.sqlite.
However, this could also be related to an intermittent problem which we're working hard to resolve: https://github.com/openaustralia/morph/issues/1064
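The failure message above means the run finished without writing a data.sqlite file in the working directory. Below is a minimal sketch of a scraper that would satisfy that check, using the requests and scraperwiki packages installed in the run above; it assumes morph.io's environment points the scraperwiki library at data.sqlite, and the URL, table fields, and values are placeholders rather than this scraper's real target.

import requests
import scraperwiki

# Fetch a page. The URL is a placeholder; substitute the site this scraper targets.
response = requests.get("https://example.com")
response.raise_for_status()

# Save a record. On morph.io the scraperwiki library writes to data.sqlite in the
# current working directory, which is the file the run checks for at the end.
scraperwiki.sqlite.save(
    unique_keys=["url"],
    data={"url": "https://example.com", "status": response.status_code},
)

Passing unique_keys makes repeated runs update the existing row for that key instead of inserting duplicates.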

Statistics

Total run time: 5 minutes

Total CPU time used: less than 5 seconds

Total disk space used: 26 KB

History

  • Manually ran revision dd760ce5 and failed.
  • Manually ran revision c22e1826 and failed.
  • Manually ran revision ea907980 and failed.
  • Manually ran revision 7640d39e and failed.
  • Created on morph.io

Scraper code

Python

test / scraper.py