residuum / premier_league_table

Downloads the current Premier League table on a daily basis.


This is a scraper that runs on morph.io. To get started, see the documentation.

This scraper downloads the current table of the English Football Premier League and stores it in a database. The data is then accessed via a JSON API for sonification with libPd.

The table is output as an array, where each team is an object with the following format:

{
  pos: <integer, position 1-20>,
  team: <string, team name>,
  pts: <integer, points>,
  gf: <integer, goals for>,
  ga: <integer, goals against>,
  gd: <integer, goal difference>
}
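
For illustration, the rows stored by this scraper can be queried through morph.io's data API with an SQL query against the scraper's SQLite database. The sketch below is not part of the scraper itself: the table name "data" (scraperwiki's default) is an assumption, and you need your own morph.io API key.

import requests

# Hypothetical consumer sketch (not part of this scraper).
# morph.io serves each scraper's SQLite data at api.morph.io;
# the table name "data" is scraperwiki's default and is assumed here.
API_URL = "https://api.morph.io/residuum/premier_league_table/data.json"

def fetch_table(api_key):
    params = {
        "key": api_key,  # your personal morph.io API key
        "query": "select * from data order by pos",
    }
    response = requests.get(API_URL, params=params)
    response.raise_for_status()
    return response.json()  # list of dicts with pos, team, pts, gf, ga, gd

if __name__ == "__main__":
    for row in fetch_table("YOUR_MORPH_API_KEY"):
        print("{pos:2d}  {team:<25} {pts:2d} pts ({gd:+d} gd)".format(**row))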

Contributors: residuum

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 2 is python-2.7.14 (you are using python-2.7.9, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-2.7.14).
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-2.7.9
-----> Installing pip
-----> Installing requirements with pip
       DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support pip 21.0 will remove support for this functionality.
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
         Running command git clone -q http://github.com/openaustralia/scraperwiki-python.git /app/.heroku/src/scraperwiki
         Running command git checkout -b morph_defaults --track origin/morph_defaults
         Switched to a new branch 'morph_defaults'
         Branch morph_defaults set up to track remote branch morph_defaults from origin.
       Collecting lxml==3.4.4
         Downloading lxml-3.4.4.tar.gz (3.5 MB)
       Collecting cssselect==0.9.1
         Downloading cssselect-0.9.1.tar.gz (32 kB)
       Collecting requests
         Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB)
       Collecting dumptruck>=0.1.2
         Downloading dumptruck-0.1.6.tar.gz (15 kB)
       Collecting idna<3,>=2.5; python_version < "3"
         Downloading idna-2.10-py2.py3-none-any.whl (58 kB)
       Collecting certifi>=2017.4.17
         Downloading certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
       Collecting chardet<5,>=3.0.2; python_version < "3"
         Downloading chardet-4.0.0-py2.py3-none-any.whl (178 kB)
       Collecting urllib3<1.27,>=1.21.1
         Downloading urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
       Building wheels for collected packages: lxml, cssselect, dumptruck
         Building wheel for lxml (setup.py): started
         Building wheel for lxml (setup.py): still running...
         Building wheel for lxml (setup.py): finished with status 'done'
         Created wheel for lxml: filename=lxml-3.4.4-cp27-cp27m-linux_x86_64.whl size=2989879 sha256=e267a3753413b88e5e647aa1d277faddcdb0af980d0ea5739c84ac4a154043bb
         Stored in directory: /tmp/pip-ephem-wheel-cache-42jyMW/wheels/d6/de/81/11ae6edd05c75aac677e67dd154c85da758ba6f3e8e80e962e
         Building wheel for cssselect (setup.py): started
         Building wheel for cssselect (setup.py): finished with status 'done'
         Created wheel for cssselect: filename=cssselect-0.9.1-py2-none-any.whl size=26992 sha256=44e8e2bb6935896a6405f173fed2cb2e11d2150fb4958c3c29893cae3534f827
         Stored in directory: /tmp/pip-ephem-wheel-cache-42jyMW/wheels/85/fe/00/b94036d8583cec9791d8cda24c184f2d2ac1397822f7f0e8d4
         Building wheel for dumptruck (setup.py): started
         Building wheel for dumptruck (setup.py): finished with status 'done'
         Created wheel for dumptruck: filename=dumptruck-0.1.6-py2-none-any.whl size=11844 sha256=1753dd862a558e879bcbd3530282dc314236b8ae0a0ba40e85d82738456fe849
         Stored in directory: /tmp/pip-ephem-wheel-cache-42jyMW/wheels/dc/75/e9/1e61c4080c73e7bda99614549591f83b53bcc2d682f26fce62
       Successfully built lxml cssselect dumptruck
       Installing collected packages: dumptruck, idna, certifi, chardet, urllib3, requests, scraperwiki, lxml, cssselect
         Running setup.py develop for scraperwiki
       Successfully installed certifi-2021.10.8 chardet-4.0.0 cssselect-0.9.1 dumptruck-0.1.6 idna-2.10 lxml-3.4.4 requests-2.27.1 scraperwiki urllib3-1.26.9
       DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support pip 21.0 will remove support for this functionality.
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
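
Judging from the packages pip resolves in the log above, the scraper's requirements.txt presumably contains roughly the following top-level pins. This is a reconstruction from the build output, not the actual file; requests, dumptruck, idna, certifi, chardet and urllib3 are pulled in as dependencies of scraperwiki and requests.

# Reconstructed from the build log above; the real requirements.txt may differ.
-e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
lxml==3.4.4
cssselect==0.9.1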

Data

Downloaded 576 times by residuum and MikeRalphson

To download the data, sign in with GitHub.

The table can be downloaded as a CSV file or as an SQLite database (3 KB), or accessed via the API.

Showing 10 of 20 rows:

pos   ga   gf   gd  pts  team
  1   24   96   72   90  Manchester City
  2   25   91   66   89  Liverpool
  3   31   73   42   70  Chelsea
  4   40   64   24   68  Tottenham Hotspur
  5   47   56    9   66  Arsenal
  6   56   57    1   58  Manchester United
  7   48   59   11   56  West Ham United
  8   40   37   -3   51  Wolverhampton Wanderers
  9   57   57    0   48  Leicester City
 10   43   39   -4   48  Brighton and Hove Albion

Statistics

Average successful run time: 2 minutes

Total run time: about 1 month

Total CPU time used: 22 minutes

Total disk space used: 28.1 KB

History

  • Auto ran revision 43e89d05 and completed successfully.
    20 records updated in the database
  • Auto ran revision 43e89d05 and completed successfully.
    20 records updated in the database
  • Auto ran revision 43e89d05 and completed successfully.
    20 records updated in the database
  • Auto ran revision 43e89d05 and completed successfully.
    20 records updated in the database
  • Auto ran revision 43e89d05 and completed successfully.
    20 records updated in the database
  • ...
  • Created on morph.io


Scraper code

Python

premier_league_table / scraper.py
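
The scraper source itself is not reproduced on this page. As orientation only, here is a minimal sketch of what a scraperwiki-based scraper producing the schema above could look like; the source URL, CSS selectors and column order are placeholders, not the actual scraper.py.

# -*- coding: utf-8 -*-
# Hypothetical sketch only; the real scraper.py is not shown on this page.
# The source URL and CSS selectors below are placeholders.
import lxml.html
import requests
import scraperwiki

TABLE_URL = "https://example.com/premier-league/table"  # placeholder URL


def scrape_table():
    html = requests.get(TABLE_URL).text
    root = lxml.html.fromstring(html)
    rows = []
    # Placeholder selector: one <tr> per team in the league table.
    for tr in root.cssselect("table.league-table tbody tr"):
        cells = [td.text_content().strip() for td in tr.cssselect("td")]
        gf, ga = int(cells[3]), int(cells[4])  # assumed column positions
        rows.append({
            "pos": int(cells[0]),
            "team": cells[1],
            "pts": int(cells[2]),
            "gf": gf,
            "ga": ga,
            "gd": gf - ga,
        })
    return rows


if __name__ == "__main__":
    data = scrape_table()
    # Using "pos" as the unique key gives the 20-records-updated-per-run
    # behaviour seen in the history above.
    scraperwiki.sqlite.save(unique_keys=["pos"], data=data)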