This is a scraper that runs on Morph. To get started, see the documentation.

Contributors: Voknes

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-3.6.2
-----> Installing pip
-----> Installing requirements with pip
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
  Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
  Running command git clone -q http://github.com/openaustralia/scraperwiki-python.git /app/.heroku/src/scraperwiki
  Running command git checkout -b morph_defaults --track origin/morph_defaults
  Switched to a new branch 'morph_defaults'
  Branch morph_defaults set up to track remote branch morph_defaults from origin.
Collecting lxml==4.3.3 (from -r /tmp/build/requirements.txt (line 8))
  Downloading https://files.pythonhosted.org/packages/35/8a/5e066949f2b40caac32c7b2a77da63ad304b5fbe869036cc3fe4a198f724/lxml-4.3.3-cp36-cp36m-manylinux1_x86_64.whl (5.7MB)
Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz
Collecting beautifulsoup4==4.7.1 (from -r /tmp/build/requirements.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/1d/5d/3260694a59df0ec52f8b4883f5d23b130bc237602a1411fa670eae12351e/beautifulsoup4-4.7.1-py3-none-any.whl (94kB)
Collecting requests==2.21.0 (from -r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl (57kB)
Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/15/27/3330a343de80d6849545b6c7723f8c9a08b4b104de964ac366e7e6b318df/dumptruck-0.1.6.tar.gz
Collecting soupsieve>=1.2 (from beautifulsoup4==4.7.1->-r /tmp/build/requirements.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/b9/a5/7ea40d0f8676bde6e464a6435a48bc5db09b1a8f4f06d41dd997b8f3c616/soupsieve-1.9.1-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests==2.21.0->-r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl (118kB)
Collecting certifi>=2017.4.17 (from requests==2.21.0->-r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl (158kB)
Collecting chardet<3.1.0,>=3.0.2 (from requests==2.21.0->-r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
Collecting idna<2.9,>=2.5 (from requests==2.21.0->-r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)
Building wheels for collected packages: cssselect, dumptruck
  Building wheel for cssselect (setup.py): started
  Building wheel for cssselect (setup.py): finished with status 'done'
  Stored in directory: /tmp/pip-ephem-wheel-cache-n1vk5iyp/wheels/45/25/d7/5a3b06d22b1ffb616f868a74729a5a002bcc04d45109b4f223
  Building wheel for dumptruck (setup.py): started
  Building wheel for dumptruck (setup.py): finished with status 'done'
  Stored in directory: /tmp/pip-ephem-wheel-cache-n1vk5iyp/wheels/57/df/83/32654ae89119876c7a7db66829bbdb646caa151589dbaf226e
Successfully built cssselect dumptruck
Installing collected packages: dumptruck, urllib3, certifi, chardet, idna, requests, scraperwiki, lxml, cssselect, soupsieve, beautifulsoup4
  Running setup.py develop for scraperwiki
Successfully installed beautifulsoup4-4.7.1 certifi-2019.3.9 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.8 lxml-4.3.3 requests-2.21.0 scraperwiki soupsieve-1.9.1 urllib3-1.24.3
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
[('16 учебная неделя', '22 мая')]
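The pip messages above each point at a specific line of the scraper's requirements.txt. A sketch of that file reconstructed from the log — only lines 6 and 8–11 of the original are referenced, so anything else in it is unknown and left out here:

```
# requirements.txt (reconstructed from the pip log above; only the lines the
# log references are shown)
-e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
lxml==4.3.3
cssselect==0.9.1
beautifulsoup4==4.7.1
requests==2.21.0
```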

Data

Downloaded 6 times by Voknes


The data is available as a CSV table, as an SQLite database (2 KB), or via the API.

1 row of 1

week: 16 учебная неделя ("16th academic week")
date: 22 мая ("22 May")
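The single row above lives in the scraper's SQLite database. A minimal local sketch of that storage: the table name "data" is the scraperwiki library's usual default and is an assumption here, while the column names and values come from the table above.

```python
import sqlite3

# Emulate the scraper's one-row output table in an in-memory database.
# Table name "data" is assumed (scraperwiki default); columns match the page.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (week TEXT, date TEXT)")
conn.execute("INSERT INTO data VALUES (?, ?)", ("16 учебная неделя", "22 мая"))

row = conn.execute("SELECT week, date FROM data").fetchone()
print(row)  # → ('16 учебная неделя', '22 мая')
```

morph.io's API answers the same kind of SQL (for example `select * from data`) against the hosted copy of this database.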

Statistics

Average successful run time: half a minute

Total run time: 22 minutes

Total cpu time used: less than 20 seconds

Total disk space used: 45.9 KB

History

  • Auto ran revision a09f2401 and completed successfully.
    1 record updated in the database
  • Auto ran revision a09f2401 and completed successfully.
    1 record updated in the database
  • Auto ran revision a09f2401 and completed successfully.
    1 record updated in the database
  • Auto ran revision a09f2401 and completed successfully.
    1 record updated in the database
  • Auto ran revision a09f2401 and completed successfully.
    1 record updated in the database
  • ...
  • Created on morph.io


Scraper code

Python

week / scraper.py
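The source of scraper.py is not reproduced on this page. A minimal sketch of its parsing step, consistent with the installed dependencies (beautifulsoup4) and the run output — the sample markup, CSS class names, and the commented-out save call's field names are illustrative assumptions, not the real scraper:

```python
from bs4 import BeautifulSoup

# Sample markup standing in for the real timetable page; the actual URL and
# HTML structure are not shown on this page, so both are hypothetical.
SAMPLE = ('<div><span class="week">16 учебная неделя</span>'
          '<span class="date">22 мая</span></div>')

def parse_week(html):
    """Return the (week, date) pair the scraper stores."""
    soup = BeautifulSoup(html, "html.parser")
    week = soup.find(class_="week").get_text(strip=True)
    date = soup.find(class_="date").get_text(strip=True)
    return week, date

print([parse_week(SAMPLE)])  # → [('16 учебная неделя', '22 мая')]

# On morph.io the pair would then be persisted with the scraperwiki library,
# along the lines of:
#   scraperwiki.sqlite.save(unique_keys=["week"],
#                           data={"week": week, "date": date})
```

The printed list of tuples matches the shape of the run output shown in the console log above.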