This is a scraper that runs on morph.io. To get started, see the documentation.

This scraper reads the PM&C (Department of the Prime Minister and Cabinet) Flag Network for all announcements and returns rows in the following format:

| title | URL to announcement | date for action | locality (state or australia-wide) | bool halfMast |
| --- | --- | --- | --- | --- |
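
Each announcement becomes one row in the scraper's SQLite database. As a minimal sketch, assuming the scraperwiki library (which the scraper's requirements pull in) and column names mirroring the table above (the actual schema in scraper.py may differ):

```python
# Illustrative only: column names follow the table above; the real
# scraper.py schema and values may differ.
import scraperwiki

row = {
    "title": "Example announcement",        # announcement title
    "url": "https://www.pmc.gov.au/...",    # URL to announcement (placeholder)
    "date": "2022-05-01",                   # date for action
    "locality": "australia-wide",           # state or australia-wide
    "halfMast": True,                       # True if flags fly at half-mast
}

# Upsert into data.sqlite, keyed on the announcement URL.
scraperwiki.sqlite.save(unique_keys=["url"], data=row)
```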

Contributors: svict4

Last run failed with status code 1.

Console output of last run

```
Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-3.6.2
-----> Installing pip
-----> Installing requirements with pip
Obtaining scraperwiki from git+https://github.com/andylolz/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
  Cloning https://github.com/andylolz/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
  Running command git clone -q https://github.com/andylolz/scraperwiki-python.git /app/.heroku/src/scraperwiki
  Running command git checkout -b morph_defaults --track origin/morph_defaults
  Switched to a new branch 'morph_defaults'
  Branch morph_defaults set up to track remote branch morph_defaults from origin.
  Resolved https://github.com/andylolz/scraperwiki-python.git to commit d03e0fb5e8739b54cd3eff48a91f3a8e2a0af195
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
Collecting lxml==3.4.4
  Downloading lxml-3.4.4.tar.gz (3.5 MB)
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
Collecting cssselect==0.9.1
  Downloading cssselect-0.9.1.tar.gz (32 kB)
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
Collecting python-dateutil==2.6.1
  Downloading python_dateutil-2.6.1-py2.py3-none-any.whl (194 kB)
Collecting requests==2.18.4
  Downloading requests-2.18.4-py2.py3-none-any.whl (88 kB)
Collecting BeautifulSoup4==4.6.0
  Downloading beautifulsoup4-4.6.0-py3-none-any.whl (86 kB)
Collecting six>=1.5
  Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting urllib3<1.23,>=1.21.1
  Downloading urllib3-1.22-py2.py3-none-any.whl (132 kB)
Collecting certifi>=2017.4.17
  Downloading certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting chardet<3.1.0,>=3.0.2
  Downloading chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting idna<2.7,>=2.5
  Downloading idna-2.6-py2.py3-none-any.whl (56 kB)
Collecting sqlalchemy
  Downloading SQLAlchemy-1.4.36-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting alembic
  Downloading alembic-1.7.7-py3-none-any.whl (210 kB)
Collecting importlib-metadata
  Downloading importlib_metadata-4.8.3-py3-none-any.whl (17 kB)
Collecting importlib-resources
  Downloading importlib_resources-5.4.0-py3-none-any.whl (28 kB)
Collecting Mako
  Downloading Mako-1.1.6-py2.py3-none-any.whl (75 kB)
Collecting greenlet!=0.4.17
  Downloading greenlet-1.1.2-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (147 kB)
Collecting zipp>=0.5
  Downloading zipp-3.6.0-py3-none-any.whl (5.3 kB)
Collecting typing-extensions>=3.6.4
  Downloading typing_extensions-4.1.1-py3-none-any.whl (26 kB)
Collecting MarkupSafe>=0.9.2
  Downloading MarkupSafe-2.0.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (30 kB)
Building wheels for collected packages: lxml, cssselect
  Building wheel for lxml (setup.py): started
  Building wheel for lxml (setup.py): still running...
  Building wheel for lxml (setup.py): finished with status 'done'
  Created wheel for lxml: filename=lxml-3.4.4-cp36-cp36m-linux_x86_64.whl size=3300619 sha256=0d2538ca253d79a239d78a8f1a245548214565fb080521e3268fe183095dbe5a
  Stored in directory: /tmp/pip-ephem-wheel-cache-txy829bv/wheels/6d/4f/4c/af39325568e80f4188c8fc7232557540270ee6293e952d3d87
  Building wheel for cssselect (setup.py): started
  Building wheel for cssselect (setup.py): finished with status 'done'
  Created wheel for cssselect: filename=cssselect-0.9.1-py3-none-any.whl size=27016 sha256=931340a34b8c43664306a451a99a02767c4fdf897e554db03cb49aa083033621
  Stored in directory: /tmp/pip-ephem-wheel-cache-txy829bv/wheels/63/71/d5/b5473de5b6bebecb4642ef7ef61b9124f461282297e7db01d0
Successfully built lxml cssselect
Installing collected packages: zipp, typing-extensions, MarkupSafe, importlib-metadata, greenlet, urllib3, sqlalchemy, Mako, importlib-resources, idna, chardet, certifi, six, requests, alembic, scraperwiki, python-dateutil, lxml, cssselect, BeautifulSoup4
  Running setup.py develop for scraperwiki
Successfully installed BeautifulSoup4-4.6.0 Mako-1.1.6 MarkupSafe-2.0.1 alembic-1.7.7 certifi-2021.10.8 chardet-3.0.4 cssselect-0.9.1 greenlet-1.1.2 idna-2.6 importlib-metadata-4.8.3 importlib-resources-5.4.0 lxml-3.4.4 python-dateutil-2.6.1 requests-2.18.4 scraperwiki-0.5.1 six-1.16.0 sqlalchemy-1.4.36 typing-extensions-4.1.1 urllib3-1.22 zipp-3.6.0
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
Traceback (most recent call last):
  File "scraper.py", line 21, in <module>
    pages = int(soup.select("#block-system-main > div > div > div > div.item-list > ul > li.pager-last.last > a")[0].attrs['href'].split("=")[1]) + 1
IndexError: list index out of range
```
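
The pip output pins the build's direct dependencies, so requirements.txt can be reconstructed with reasonable confidence. A sketch (the ordering of the first five lines is a guess; the log only confirms that the scraperwiki entry is on line 6):

```
lxml==3.4.4
cssselect==0.9.1
python-dateutil==2.6.1
requests==2.18.4
BeautifulSoup4==4.6.0
-e git+https://github.com/andylolz/scraperwiki-python.git@morph_defaults#egg=scraperwiki
```

The traceback itself shows an unguarded list index: `soup.select(...)` found no pager "last" link, so `[0]` raised `IndexError`. That happens whenever the page markup changes or the listing fits on a single page. A minimal defensive sketch, assuming a requests/BeautifulSoup setup like the one the traceback implies (the URL and variable names are illustrative):

```python
# Hedged sketch of a fix for scraper.py line 21. The CSS selector comes
# from the traceback; the URL and everything around it are assumptions.
import requests
from bs4 import BeautifulSoup

LIST_URL = "https://www.pmc.gov.au/..."  # placeholder for the Flag Network listing URL

soup = BeautifulSoup(requests.get(LIST_URL).text, "html.parser")

pager_links = soup.select(
    "#block-system-main > div > div > div > div.item-list > ul > li.pager-last.last > a"
)
if pager_links:
    # The "last page" link ends in ?page=N; pages are 0-indexed,
    # so there are N + 1 pages in total.
    pages = int(pager_links[0].attrs["href"].split("=")[1]) + 1
else:
    # No pager found: the markup changed or everything fits on one page.
    # Fall back to a single page rather than crashing with IndexError.
    pages = 1
```

Falling back to `pages = 1` keeps the run alive when pagination disappears; failing fast with a clearer error message would be the stricter alternative.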

Data

Downloaded 0 times


SQLite database: 27 KB, 0 rows

Statistics

Average successful run time: 6 minutes

Total run time: 5 days

Total cpu time used: about 2 hours

Total disk space used: 86.5 KB

History

  • Auto ran revision ae0f71f8 and failed. Nothing changed in the database.
  • Auto ran revision ae0f71f8 and failed. Nothing changed in the database.
  • Auto ran revision ae0f71f8 and failed. Nothing changed in the database.
  • Auto ran revision ae0f71f8 and failed. Nothing changed in the database.
  • Auto ran revision ae0f71f8 and failed. Nothing changed in the database.
  • ...
  • Created on morph.io


Scraper code

australian-flag-half-mast / scraper.py (Python)