This is a scraper that runs on morph.io. To get started, see the documentation.

Contributors: philippdavidpries

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 2 is python-2.7.14 (you are using python-2.7.9, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-2.7.14).
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-2.7.9
-----> Installing pip
-----> Installing requirements with pip
       DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
         Running command git clone -q http://github.com/openaustralia/scraperwiki-python.git /app/.heroku/src/scraperwiki
         Running command git checkout -b morph_defaults --track origin/morph_defaults
         Switched to a new branch 'morph_defaults'
         Branch morph_defaults set up to track remote branch morph_defaults from origin.
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 8))
         Downloading https://files.pythonhosted.org/packages/63/c7/4f2a2a4ad6c6fa99b14be6b3c1cece9142e2d915aa7c43c908677afc8fa4/lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 9))
         Downloading https://files.pythonhosted.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4==4.6.0 (from -r /tmp/build/requirements.txt (line 10))
         Downloading https://files.pythonhosted.org/packages/a6/29/bcbd41a916ad3faf517780a0af7d0254e8d6722ff6414723eedba4334531/beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
       Collecting requests==2.18.4 (from -r /tmp/build/requirements.txt (line 11))
         Downloading https://files.pythonhosted.org/packages/49/df/50aa1999ab9bde74656c2919d9c0c085fd2b3775fd3eca826012bef76d8c/requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
         Downloading https://files.pythonhosted.org/packages/15/27/3330a343de80d6849545b6c7723f8c9a08b4b104de964ac366e7e6b318df/dumptruck-0.1.6.tar.gz
       Collecting idna<2.7,>=2.5 (from requests==2.18.4->-r /tmp/build/requirements.txt (line 11))
         Downloading https://files.pythonhosted.org/packages/27/cc/6dd9a3869f15c2edfab863b992838277279ce92663d334df9ecf5106f5c6/idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests==2.18.4->-r /tmp/build/requirements.txt (line 11))
         Downloading https://files.pythonhosted.org/packages/63/cb/6965947c13a94236f6d4b8223e21beb4d576dc72e8130bd7880f600839b8/urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting certifi>=2017.4.17 (from requests==2.18.4->-r /tmp/build/requirements.txt (line 11))
         Downloading https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl (157kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests==2.18.4->-r /tmp/build/requirements.txt (line 11))
         Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Building wheels for collected packages: lxml, cssselect, dumptruck
         Building wheel for lxml (setup.py): started
         Building wheel for lxml (setup.py): still running...
         Building wheel for lxml (setup.py): finished with status 'done'
         Created wheel for lxml: filename=lxml-3.4.4-cp27-cp27m-linux_x86_64.whl size=2989855 sha256=0a3aa97c94f6076d40bc63808bb0dab1f4948427ab4f5d2427ddfac69713b7e4
         Stored in directory: /tmp/pip-ephem-wheel-cache-UlLfGS/wheels/f6/df/7b/af9cace9baf95a6e4a2b5790e30da55fc780ddee598314d1ed
         Building wheel for cssselect (setup.py): started
         Building wheel for cssselect (setup.py): finished with status 'done'
         Created wheel for cssselect: filename=cssselect-0.9.1-cp27-none-any.whl size=26994 sha256=eb1ef2e25f8f5ab4ad7dea56350d7b635af6d5306548cb46a67050a79c71f5e2
         Stored in directory: /tmp/pip-ephem-wheel-cache-UlLfGS/wheels/45/25/d7/5a3b06d22b1ffb616f868a74729a5a002bcc04d45109b4f223
         Building wheel for dumptruck (setup.py): started
         Building wheel for dumptruck (setup.py): finished with status 'done'
         Created wheel for dumptruck: filename=dumptruck-0.1.6-cp27-none-any.whl size=11845 sha256=e07ba97b6d7c1bcf0180843cd90ea87f1d3a8eb5747114f3310eab83495b9541
         Stored in directory: /tmp/pip-ephem-wheel-cache-UlLfGS/wheels/57/df/83/32654ae89119876c7a7db66829bbdb646caa151589dbaf226e
       Successfully built lxml cssselect dumptruck
       Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, lxml, cssselect, beautifulsoup4
         Running setup.py develop for scraperwiki
       Successfully installed beautifulsoup4-4.6.0 certifi-2019.6.16 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.6 lxml-3.4.4 requests-2.18.4 scraperwiki urllib3-1.22
       DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
Traceback (most recent call last):
  File "scraper.py", line 2, in <module>
    soup = BeautifulSoup(html_doc, 'html.parser')
NameError: name 'html_doc' is not defined
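The traceback shows why every recent run fails: line 2 of scraper.py passes `html_doc` to `BeautifulSoup` before any such variable has been assigned. A minimal sketch of the fix is to define `html_doc` first; the inline markup here is a hypothetical stand-in (a real run would assign the fetched page HTML, e.g. `html_doc = requests.get(url).text`).

```python
from bs4 import BeautifulSoup

# Assign html_doc BEFORE it is used -- this is the step the failing
# scraper.py omitted. The sample markup below is hypothetical.
html_doc = """
<table>
  <tr><th>Hospital</th><th>Trolley Total</th></tr>
  <tr><td>Beaumont Hospital</td><td>11</td></tr>
</table>
"""

soup = BeautifulSoup(html_doc, 'html.parser')  # no NameError now
print(soup.find_all('td')[0].get_text())       # prints "Beaumont Hospital"
```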

Data

Downloaded 0 times


SQLite database: 8 KB

Showing 10 of 69 rows

Columns: Date | Hospital | Region | Trolley Total | Ward Total | Total

Hospitals in the rows shown include:
  • Beaumont Hospital
  • Children's University Hospital, Temple Street
  • Connolly Hospital, Blanchardstown

Statistics

Average successful run time: less than 10 seconds

Total run time: 6 minutes

Total CPU time used: less than 10 seconds

Total disk space used: 47.8 KB

History

  • Manually ran revision 6cd3d5a8 and failed.
    nothing changed in the database
  • Manually ran revision 1bde23a3 and failed.
    nothing changed in the database
  • Manually ran revision 8d2b8331 and failed.
    69 records added in the database
  • Manually ran revision 586032a5 and failed.
    nothing changed in the database
  • Manually ran revision df713c93 and failed.
    69 records added in the database
  • ...
  • Created on morph.io


Scraper code

Python

test / scraper.py
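The listing above ends at the filename without the file's contents. As a sketch of what a working scraper.py for this dataset might look like, using the libraries pinned in the build log (beautifulsoup4, requests, scraperwiki), with a placeholder URL and an assumed table layout matching the columns on the data page:

```python
# Hypothetical sketch of scraper.py -- the URL and the source page's
# table layout are assumptions, not taken from the original repository.
from bs4 import BeautifulSoup

# Column names match the table shown on the morph.io data page.
COLUMNS = ['Date', 'Hospital', 'Region', 'Trolley Total', 'Ward Total', 'Total']

def parse_rows(html_doc):
    """Extract one dict per <tr> whose cell count matches COLUMNS."""
    soup = BeautifulSoup(html_doc, 'html.parser')
    rows = []
    for tr in soup.find_all('tr'):
        cells = [td.get_text(strip=True) for td in tr.find_all('td')]
        if len(cells) == len(COLUMNS):
            rows.append(dict(zip(COLUMNS, cells)))
    return rows

if __name__ == '__main__':
    # requests and scraperwiki are installed by the build above;
    # the URL here is a placeholder, not the scraper's real target.
    import requests
    import scraperwiki
    html_doc = requests.get('https://example.org/trolley-watch').text
    for row in parse_rows(html_doc):
        scraperwiki.sqlite.save(unique_keys=['Date', 'Hospital'], data=row)
```

Defining `html_doc` from the fetched page before it reaches `BeautifulSoup` avoids the `NameError` seen in the console output, and `unique_keys` lets repeated runs update existing rows instead of duplicating them.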