leafarenuk / fussballgucken_today

Today's soccer games and their corresponding TV channels


This is a scraper that runs on morph.io. To get started, see the documentation.

Contributors: leafarenuk

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Installing requirements with pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
  Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
  Running command git clone -q http://github.com/openaustralia/scraperwiki-python.git /app/.heroku/src/scraperwiki
  Running command git checkout -b morph_defaults --track origin/morph_defaults
  Switched to a new branch 'morph_defaults'
  Branch morph_defaults set up to track remote branch morph_defaults from origin.
Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 8))
  Downloading https://files.pythonhosted.org/packages/63/c7/4f2a2a4ad6c6fa99b14be6b3c1cece9142e2d915aa7c43c908677afc8fa4/lxml-3.4.4.tar.gz (3.5MB)
Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz
Collecting requests (from -r /tmp/build/requirements.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl (57kB)
Collecting bs4 (from -r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/10/ed/7e8b97591f6f456174139ec089c769f89a94a1a4025fe967691de971f314/bs4-0.0.1.tar.gz
Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/15/27/3330a343de80d6849545b6c7723f8c9a08b4b104de964ac366e7e6b318df/dumptruck-0.1.6.tar.gz
Collecting certifi>=2017.4.17 (from requests->-r /tmp/build/requirements.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl (157kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests->-r /tmp/build/requirements.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl (150kB)
Collecting idna<2.9,>=2.5 (from requests->-r /tmp/build/requirements.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)
Collecting chardet<3.1.0,>=3.0.2 (from requests->-r /tmp/build/requirements.txt (line 10))
  Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
Collecting beautifulsoup4 (from bs4->-r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/39/0e/cfae701dc1143409adf1dc78ebc138f7c507e342e5814b1ead2a727bc900/beautifulsoup4-4.8.0-py2-none-any.whl (97kB)
Collecting soupsieve>=1.2 (from beautifulsoup4->bs4->-r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/0b/44/0474f2207fdd601bb25787671c81076333d2c80e6f97e92790f8887cf682/soupsieve-1.9.3-py2.py3-none-any.whl
Collecting backports.functools-lru-cache; python_version < "3" (from soupsieve>=1.2->beautifulsoup4->bs4->-r /tmp/build/requirements.txt (line 11))
  Downloading https://files.pythonhosted.org/packages/03/8e/2424c0e65c4a066e28f539364deee49b6451f8fcd4f718fefa50cc3dcf48/backports.functools_lru_cache-1.5-py2.py3-none-any.whl
Building wheels for collected packages: lxml, cssselect, bs4, dumptruck
  Building wheel for lxml (setup.py): started
  Building wheel for lxml (setup.py): still running...
  Building wheel for lxml (setup.py): finished with status 'done'
  Created wheel for lxml: filename=lxml-3.4.4-cp27-cp27mu-linux_x86_64.whl size=2987169 sha256=448a10dd3dc3e6ee757829bb7a224d5235023fa93eac8ee63dfc39769fcd4099
  Stored in directory: /tmp/pip-ephem-wheel-cache-YhK3CS/wheels/f6/df/7b/af9cace9baf95a6e4a2b5790e30da55fc780ddee598314d1ed
  Building wheel for cssselect (setup.py): started
  Building wheel for cssselect (setup.py): finished with status 'done'
  Created wheel for cssselect: filename=cssselect-0.9.1-cp27-none-any.whl size=26994 sha256=29690b0cf5e1ce4afda8492ac0cfebd1aab83c05db9cc317280c5abafa2686a6
  Stored in directory: /tmp/pip-ephem-wheel-cache-YhK3CS/wheels/45/25/d7/5a3b06d22b1ffb616f868a74729a5a002bcc04d45109b4f223
  Building wheel for bs4 (setup.py): started
  Building wheel for bs4 (setup.py): finished with status 'done'
  Created wheel for bs4: filename=bs4-0.0.1-cp27-none-any.whl size=1273 sha256=893a826749b4772ec75c9d903807f4f56c966ee517a5b1ba4b547e39ec9a8b11
  Stored in directory: /tmp/pip-ephem-wheel-cache-YhK3CS/wheels/a0/b0/b2/4f80b9456b87abedbc0bf2d52235414c3467d8889be38dd472
  Building wheel for dumptruck (setup.py): started
  Building wheel for dumptruck (setup.py): finished with status 'done'
  Created wheel for dumptruck: filename=dumptruck-0.1.6-cp27-none-any.whl size=11845 sha256=ad7d58c12019d087336af04990c60cdb1816988685ce80a48456d6733ecde64e
  Stored in directory: /tmp/pip-ephem-wheel-cache-YhK3CS/wheels/57/df/83/32654ae89119876c7a7db66829bbdb646caa151589dbaf226e
Successfully built lxml cssselect bs4 dumptruck
Installing collected packages: dumptruck, certifi, urllib3, idna, chardet, requests, scraperwiki, lxml, cssselect, backports.functools-lru-cache, soupsieve, beautifulsoup4, bs4
  Running setup.py develop for scraperwiki
Successfully installed backports.functools-lru-cache-1.5 beautifulsoup4-4.8.0 bs4-0.0.1 certifi-2019.6.16 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.8 lxml-3.4.4 requests-2.22.0 scraperwiki soupsieve-1.9.3 urllib3-1.25.3
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
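The pip output above references specific lines of the project's requirements.txt. A partial reconstruction of that file, hedged to only the entries that actually appear in the log (any lines before the scraperwiki entry are not visible in the output and are omitted; the editable `-e` form is inferred from pip's "Obtaining" / "Running setup.py develop" messages):

-e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
lxml==3.4.4
cssselect==0.9.1
requests
bs4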

Data

Downloaded 5739 times by leafarenuk


SQLite database: 15 KB

Showing 10 of 98 rows. Channel names in the coverages column are truncated as displayed on the page.

home_team           | date | coverages    | away_team                   | leagues                   | kickoff
--------------------|------|--------------|-----------------------------|---------------------------|--------
Orlando City SC     |      | DAZN DAZN    | Atlanta United              | Major League Soccer (MLS) | 02:00
Portland Timbers    |      | DAZN DAZN    | Seattle Sounders            | Major League Soccer (MLS) | 04:00
FK Pardubice        |      | LAOLA1.tv    | FK Banik Sokolov            | Tschechische 2. Liga      | 10:15
VFC Plauen          |      | Sporttotal.t | VfL 05 Hohenstein-Ernstthal | Oberliga Süd (NOFV)       | 13:00
VfL Wolfsburg II    |      | Sporttotal.t | Eintracht Norderstedt       | Regionalliga Nord         | 13:00
VfL Bochum          |      | Sky Sport Bu | SV Wehen Wiesbaden          | 2. Bundesliga             | 13:00
SSV Jahn Regensburg |      | Sky Sport Bu | Arminia Bielefeld           | 2. Bundesliga             | 13:00
Hannover 96         |      | Sky Sport Bu | SpVgg Greuther Fürth        | 2. Bundesliga             | 13:00
Derby County        |      | SPORT1+ SP   | West Bromwich Albion        | Sky Bet Championship      | 13:30
Norwich City        |      | RMC Sport Ac | FC Chelsea                  | Premier League            | 13:30

Statistics

Average successful run time: 1 minute

Total run time: 28 minutes

Total cpu time used: less than a minute

Total disk space used: 53.3 KB

History

  • Auto ran revision 7061c0cf and completed successfully.
    71 records added, 27 records updated in the database
  • Auto ran revision 7061c0cf and completed successfully.
    19 records added, 8 records updated in the database
  • Auto ran revision 7061c0cf and completed successfully.
    4 records added, 4 records updated in the database
  • Auto ran revision 7061c0cf and completed successfully.
    4 records updated in the database
  • Auto ran revision 7061c0cf and completed successfully.
    79 records removed, 4 records updated in the database
  • ...
  • Created on morph.io


Scraper code

Python

fussballgucken_today / scraper.py
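The scraper source itself is not captured on this page. Based on the build log (requests, BeautifulSoup, and the scraperwiki library) and the table schema above, the extract-and-save flow can be sketched as follows. This is a stdlib-only illustration, not the real scraper.py: the markup in SAMPLE, the CSS class names, and the field layout are all hypothetical assumptions about the scraped site.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical listing markup standing in for the real site's HTML.
SAMPLE = """<table>
  <tr><td class="home">VfL Bochum</td><td class="tv">Sky Sport Bu</td>
      <td class="away">SV Wehen Wiesbaden</td><td class="league">2. Bundesliga</td>
      <td class="time">13:00</td></tr>
</table>"""


def parse_matches(html):
    """Extract one dict per match row, keyed like the data table's columns."""
    matches = []
    for tr in ET.fromstring(html).iter("tr"):
        cells = {td.get("class"): (td.text or "").strip() for td in tr.iter("td")}
        matches.append({
            "home_team": cells["home"],
            "coverages": cells["tv"],
            "away_team": cells["away"],
            "leagues": cells["league"],
            "kickoff": cells["time"],
        })
    return matches


def save_matches(conn, matches):
    """Upsert matches into a 'data' table shaped like the one shown above."""
    conn.execute("""CREATE TABLE IF NOT EXISTS data (
        home_team TEXT, date TEXT, coverages TEXT,
        away_team TEXT, leagues TEXT, kickoff TEXT,
        PRIMARY KEY (home_team, away_team, kickoff))""")
    for m in matches:
        conn.execute(
            "INSERT OR REPLACE INTO data "
            "(home_team, coverages, away_team, leagues, kickoff) "
            "VALUES (:home_team, :coverages, :away_team, :leagues, :kickoff)", m)
    conn.commit()
```

In the real scraper, fetching would be done with `requests.get(...)`, parsing with BeautifulSoup, and persistence with `scraperwiki.sqlite.save(...)`, which handles the table creation and upsert shown here.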