This is a scraper that runs on morph.io. To get started, see the documentation.

Contributors: Voknes

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-3.6.2
-----> Installing pip
-----> Installing requirements with pip
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
  Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
  Running command git clone -q http://github.com/openaustralia/scraperwiki-python.git /app/.heroku/src/scraperwiki
  Running command git checkout -b morph_defaults --track origin/morph_defaults
  Switched to a new branch 'morph_defaults'
  Branch morph_defaults set up to track remote branch morph_defaults from origin.
Collecting lxml==4.3.3 (from -r /tmp/build/requirements.txt (line 8))
  Downloading https://files.pythonhosted.org/packages/35/8a/5e066949f2b40caac32c7b2a77da63ad304b5fbe869036cc3fe4a198f724/lxml-4.3.3-cp36-cp36m-manylinux1_x86_64.whl (5.7MB)
Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz
Collecting beautifulsoup4==4.7.1 (from -r /tmp/build/requirements.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/1d/5d/3260694a59df0ec52f8b4883f5d23b130bc237602a1411fa670eae12351e/beautifulsoup4-4.7.1-py3-none-any.whl (94kB)
Collecting requests==2.21.0 (from -r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl (57kB)
Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/15/27/3330a343de80d6849545b6c7723f8c9a08b4b104de964ac366e7e6b318df/dumptruck-0.1.6.tar.gz
Collecting soupsieve>=1.2 (from beautifulsoup4==4.7.1->-r /tmp/build/requirements.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/0b/44/0474f2207fdd601bb25787671c81076333d2c80e6f97e92790f8887cf682/soupsieve-1.9.3-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests==2.21.0->-r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)
Collecting certifi>=2017.4.17 (from requests==2.21.0->-r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl (157kB)
Collecting urllib3<1.25,>=1.21.1 (from requests==2.21.0->-r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl (118kB)
Collecting chardet<3.1.0,>=3.0.2 (from requests==2.21.0->-r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
Building wheels for collected packages: cssselect, dumptruck
  Building wheel for cssselect (setup.py): started
  Building wheel for cssselect (setup.py): finished with status 'done'
  Created wheel for cssselect: filename=cssselect-0.9.1-cp36-none-any.whl size=26992 sha256=8b81c786c9f2f771c3597504f4478dc312bcc07d85d2677df6bad62f2d79b2ff
  Stored in directory: /tmp/pip-ephem-wheel-cache-x0b1_wxi/wheels/45/25/d7/5a3b06d22b1ffb616f868a74729a5a002bcc04d45109b4f223
  Building wheel for dumptruck (setup.py): started
  Building wheel for dumptruck (setup.py): finished with status 'done'
  Created wheel for dumptruck: filename=dumptruck-0.1.6-cp36-none-any.whl size=11844 sha256=edf9f281b6ab560fea8f31197a0c700603ec449775379400e2b1eab5526bbd33
  Stored in directory: /tmp/pip-ephem-wheel-cache-x0b1_wxi/wheels/57/df/83/32654ae89119876c7a7db66829bbdb646caa151589dbaf226e
Successfully built cssselect dumptruck
Installing collected packages: dumptruck, idna, certifi, urllib3, chardet, requests, scraperwiki, lxml, cssselect, soupsieve, beautifulsoup4
  Running setup.py develop for scraperwiki
Successfully installed beautifulsoup4-4.7.1 certifi-2019.6.16 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.8 lxml-4.3.3 requests-2.21.0 scraperwiki soupsieve-1.9.3 urllib3-1.24.3
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
Traceback (most recent call last):
  File "scraper.py", line 63, in <module>
    main()
  File "scraper.py", line 42, in main
    w = div.split('Идет ')[1]
IndexError: list index out of range
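The run dies on line 42 of scraper.py because the code assumes the marker string 'Идет ' (Russian for "in progress") always appears in the scraped text; when it is absent, split() returns a one-element list and indexing [1] raises IndexError. A minimal, hypothetical guard illustrating the fix (the function name and the None fallback are assumptions; only the marker string comes from the traceback):

```python
def extract_week(text):
    """Return the text after the 'Идет ' marker, or None if the marker is absent.

    scraper.py indexes split(...)[1] unconditionally, which raises
    IndexError whenever the source page lacks the marker (e.g. outside
    term time or after a page-layout change).
    """
    parts = text.split('Идет ')
    if len(parts) < 2:
        return None  # marker missing: nothing to extract
    return parts[1].strip()


print(extract_week('Идет 20 учебная неделя'))  # '20 учебная неделя'
print(extract_week('no marker here'))          # None
```

Returning None (and skipping the database write) lets the scraper exit cleanly with status 0 instead of crashing with status 1.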

Data

Downloaded 6 times by Voknes



rows 1 / 1

week: 20 учебная неделя ("20th academic week")
date: 23 июня ("23 June")
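A row like the one above is what morph.io's scraperwiki helper writes into the scraper's data.sqlite. A dependency-free sketch of the equivalent storage using the stdlib sqlite3 module (the table name "data" and the upsert-on-week behaviour mirror scraperwiki's defaults; whether scraper.py actually calls scraperwiki.sqlite.save this way is an assumption, since its source is not shown here):

```python
import sqlite3

# In-memory stand-in for morph.io's on-disk data.sqlite.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS data (week TEXT PRIMARY KEY, date TEXT)")

# INSERT OR REPLACE mimics save(unique_keys=['week'], ...): re-running the
# scraper overwrites the existing row instead of duplicating it.
conn.execute(
    "INSERT OR REPLACE INTO data (week, date) VALUES (?, ?)",
    ("20 учебная неделя", "23 июня"),
)
conn.commit()

print(conn.execute("SELECT week, date FROM data").fetchone())
```

Because "week" is the primary key, the table holds at most one row per distinct week value, which matches the "rows 1 / 1" shown above.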

Statistics

Average successful run time: less than a minute

Total run time: about 1 hour

Total cpu time used: 1 minute

Total disk space used: 45.9 KB

History

  • Auto ran revision a09f2401 and failed.
    nothing changed in the database
    (this same entry repeats for each of the five most recent runs)
  • ...
  • Created on morph.io


Scraper code

Python

week / scraper.py