This is a scraper that runs on Morph. To get started, see the documentation.

Contributors: slow-mo

Last run failed with status code 998.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-3.6.2
-----> Installing pip
-----> Installing requirements with pip
       Collecting scraperwiki==0.5.1
       Downloading scraperwiki-0.5.1.tar.gz (7.7 kB)
       Preparing metadata ( started
       Preparing metadata ( finished with status 'done'
       Collecting requests
       Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB)
       Collecting six
       Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
       Collecting sqlalchemy
       Downloading SQLAlchemy-1.4.35-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
       Collecting alembic
       Downloading alembic-1.7.7-py3-none-any.whl (210 kB)
       Collecting importlib-resources
       Downloading importlib_resources-5.4.0-py3-none-any.whl (28 kB)
       Collecting importlib-metadata
       Downloading importlib_metadata-4.8.3-py3-none-any.whl (17 kB)
       Collecting Mako
       Downloading Mako-1.1.6-py2.py3-none-any.whl (75 kB)
       Collecting greenlet!=0.4.17
       Downloading greenlet-1.1.2-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (147 kB)
       Collecting idna<4,>=2.5
       Downloading idna-3.3-py3-none-any.whl (61 kB)
       Collecting urllib3<1.27,>=1.21.1
       Downloading urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
       Collecting charset-normalizer~=2.0.0
       Downloading charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
       Collecting certifi>=2017.4.17
       Downloading certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
       Collecting zipp>=0.5
       Downloading zipp-3.6.0-py3-none-any.whl (5.3 kB)
       Collecting typing-extensions>=3.6.4
       Downloading typing_extensions-4.1.1-py3-none-any.whl (26 kB)
       Collecting MarkupSafe>=0.9.2
       Downloading MarkupSafe-2.0.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (30 kB)
       Building wheels for collected packages: scraperwiki
       Building wheel for scraperwiki ( started
       Building wheel for scraperwiki ( finished with status 'done'
       Created wheel for scraperwiki: filename=scraperwiki-0.5.1-py3-none-any.whl size=6545 sha256=65c90f06409dc92506f9d6edf19dfcfba2da004b30b9c97eae92d95bb842b662
       Stored in directory: /tmp/pip-ephem-wheel-cache-2cd5zvd8/wheels/cd/f8/ac/cd66eb1c557ab40d35c1ed852da3e9b37baa3e21b61906a5cf
       Successfully built scraperwiki
       Installing collected packages: zipp, typing-extensions, MarkupSafe, importlib-metadata, greenlet, urllib3, sqlalchemy, Mako, importlib-resources, idna, charset-normalizer, certifi, six, requests, alembic, scraperwiki
       Successfully installed Mako-1.1.6 MarkupSafe-2.0.1 alembic-1.7.7 certifi-2021.10.8 charset-normalizer-2.0.12 greenlet-1.1.2 idna-3.3 importlib-metadata-4.8.3 importlib-resources-5.4.0 requests-2.27.1 scraperwiki-0.5.1 six-1.16.0 sqlalchemy-1.4.35 typing-extensions-4.1.1 urllib3-1.26.9 zipp-3.6.0
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
/app/.heroku/python/lib/python3.6/site-packages/scraperwiki/ SAWarning: SQLite version (3, 7, 9) is older than 3.7.16, and will not support right nested joins, as are sometimes used in more complex ORM scenarios. SQLAlchemy 1.4 and above no longer tries to rewrite these joins.
  connect_args={'timeout': DATABASE_TIMEOUT})
Scraper didn't create an SQLite database in your current working directory called data.sqlite. If you've just created your first scraper and not edited the code yet this is to be expected. To fix this, make your scraper write to an SQLite database at data.sqlite. However, this could also be related to an intermittent problem which we're working hard to resolve:
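The error at the end of the log is the usual cause of a failed run: Morph checks for a file named data.sqlite in the working directory after the scraper finishes, and this run never wrote one. A minimal sketch of a scraper that satisfies that check is below; it uses only the Python standard library rather than the scraperwiki helper, and the table name `data` and the example row are placeholders, not anything required by Morph beyond the file name.

```python
# Minimal sketch: Morph expects the run to leave an SQLite database named
# data.sqlite in the current working directory. Table name "data" and the
# example row are illustrative placeholders.
import sqlite3


def save_rows(rows):
    con = sqlite3.connect("data.sqlite")  # the file Morph looks for
    con.execute(
        "CREATE TABLE IF NOT EXISTS data (url TEXT PRIMARY KEY, title TEXT)"
    )
    con.executemany(
        "INSERT OR REPLACE INTO data (url, title) VALUES (?, ?)", rows
    )
    con.commit()
    con.close()


# In a real scraper these rows would come from the pages being scraped.
save_rows([("https://example.com", "Example page")])
```

Since this repository already installs the scraperwiki library, its `save` helper can serve the same purpose; the key point either way is that the database file must be called data.sqlite.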


Total run time: 5 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 26 KB


  • Manually ran revision dd760ce5 and failed.
  • Manually ran revision c22e1826 and failed.
  • Manually ran revision ea907980 and failed.
  • Manually ran revision 7640d39e and failed.
  • Created on

Scraper code


test /