Contributors: blablupcom, woodbine

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Installing requirements with pip
  Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 1))
  Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
  Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 2))
  Downloading https://files.pythonhosted.org/packages/63/c7/4f2a2a4ad6c6fa99b14be6b3c1cece9142e2d915aa7c43c908677afc8fa4/lxml-3.4.4.tar.gz (3.5MB)
  Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 3))
  Downloading https://files.pythonhosted.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz
  Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 4))
  Downloading https://files.pythonhosted.org/packages/a6/29/bcbd41a916ad3faf517780a0af7d0254e8d6722ff6414723eedba4334531/beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
  Collecting python-dateutil (from -r /tmp/build/requirements.txt (line 5))
  Downloading https://files.pythonhosted.org/packages/0c/57/19f3a65bcf6d5be570ee8c35a5398496e10a0ddcbc95393b2d17f86aaaf8/python_dateutil-2.7.2-py2.py3-none-any.whl (212kB)
  Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/15/27/3330a343de80d6849545b6c7723f8c9a08b4b104de964ac366e7e6b318df/dumptruck-0.1.6.tar.gz
  Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/49/df/50aa1999ab9bde74656c2919d9c0c085fd2b3775fd3eca826012bef76d8c/requests-2.18.4-py2.py3-none-any.whl (88kB)
  Collecting six>=1.5 (from python-dateutil->-r /tmp/build/requirements.txt (line 5))
  Downloading https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl
  Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/27/cc/6dd9a3869f15c2edfab863b992838277279ce92663d334df9ecf5106f5c6/idna-2.6-py2.py3-none-any.whl (56kB)
  Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/63/cb/6965947c13a94236f6d4b8223e21beb4d576dc72e8130bd7880f600839b8/urllib3-1.22-py2.py3-none-any.whl (132kB)
  Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/7c/e6/92ad559b7192d846975fc916b65f667c7b8c3a32bea7372340bfe9a15fa5/certifi-2018.4.16-py2.py3-none-any.whl (150kB)
  Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
  Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, lxml, cssselect, beautifulsoup4, six, python-dateutil
  Running setup.py install for dumptruck: started
  Running setup.py install for dumptruck: finished with status 'done'
  Running setup.py develop for scraperwiki
  Running setup.py install for lxml: started
  Running setup.py install for lxml: still running...
  Running setup.py install for lxml: finished with status 'done'
  Running setup.py install for cssselect: started
  Running setup.py install for cssselect: finished with status 'done'
  Successfully installed beautifulsoup4-4.6.0 certifi-2018.4.16 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.6 lxml-3.4.4 python-dateutil-2.7.2 requests-2.18.4 scraperwiki six-1.11.0 urllib3-1.22
-----> Discovering process types
  Procfile declares types -> scraper
Injecting scraper and running...
E1702_SCC_gov_2017_Q3
E1702_SCC_gov_2017_Q2
E1702_SCC_gov_2017_Q1
E1702_SCC_gov_2016_Q4
E1702_SCC_gov_2016_Q3
E1702_SCC_gov_2016_Q2
E1702_SCC_gov_2016_Q1
E1702_SCC_gov_2015_Q4
E1702_SCC_gov_2015_Q3
E1702_SCC_gov_2015_Q2
E1702_SCC_gov_2015_Q1
E1702_SCC_gov_2014_Q4
E1702_SCC_gov_2014_Q3
E1702_SCC_gov_2014_Q2
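
The pip output above references the five lines of the scraper's requirements.txt, so the file can be reconstructed directly from the log (the -e editable flag on the first line is inferred from pip's "Obtaining"/"setup.py develop" messages for scraperwiki; the rest is verbatim):

    -e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
    lxml==3.4.4
    cssselect==0.9.1
    beautifulsoup4
    python-dateutil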

Data

Downloaded 510 times by SimKennedy, woodbine and MikeRalphson.


The data is available as a CSV download, as an SQLite database (14 KB), or via the morph.io API.

Showing 10 of 27 rows (the values of the l column are not shown in this preview):

f                        d
E1702_SCC_gov_2015_Q3    2016-01-28 02:26:08.022311
E1702_SCC_gov_2015_Q2    2016-01-28 02:26:17.243666
E1702_SCC_gov_2015_Q1    2016-01-28 02:26:24.647569
E1702_SCC_gov_2014_Q4    2016-01-28 02:26:32.705540
E1702_SCC_gov_2014_Q3    2016-01-28 02:26:43.261093
E1702_SCC_gov_2014_Q2    2016-01-28 02:26:50.392057
E1702_SCC_gov_2014_Q1    2016-01-28 02:26:58.255533
E1702_SCC_gov_2013_Q4    2016-01-28 02:27:06.235025
E1702_SCC_gov_2013_Q3    2016-01-28 02:27:13.586007
E1702_SCC_gov_2014_Q1    2017-04-27 08:37:54.759677
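
The same table can be queried programmatically through morph.io's API. The sketch below is illustrative: it assumes the scraper is published as blablupcom/sp_E1702_SCC_gov (the owner is inferred from the contributor list, not stated on this page), that you have your own morph.io API key, and that the data lives in the default table named data with the f, d and l columns shown above.

    import requests

    MORPH_API_KEY = "YOUR_MORPH_API_KEY"  # found on your morph.io account page

    # Hypothetical owner/scraper path; adjust to wherever the scraper actually lives.
    url = "https://api.morph.io/blablupcom/sp_E1702_SCC_gov/data.json"

    response = requests.get(url, params={
        "key": MORPH_API_KEY,
        "query": "select f, d, l from data order by d desc limit 10",
    })
    response.raise_for_status()

    for row in response.json():
        print(row["f"], row["d"])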

Statistics

Average successful run time: 3 minutes

Total run time: 8 days

Total CPU time used: 9 minutes

Total disk space used: 47.9 KB

History

  • Auto ran revision 87c9561a and completed successfully.
    14 records added, 14 records removed in the database
  • Auto ran revision 87c9561a and failed.
    6 records added, 6 records removed in the database
  • Auto ran revision 87c9561a and completed successfully.
    14 records added, 14 records removed in the database
    30 pages scraped
  • Auto ran revision 87c9561a and completed successfully.
    14 records added, 14 records removed in the database
  • Auto ran revision 87c9561a and completed successfully.
    14 records added, 14 records removed in the database
  • ...
  • Created on morph.io


Scraper code

Python

sp_E1702_SCC_gov / scraper.py
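
The scraper source itself is not reproduced on this page. Based only on what the page does show — the dependencies installed during the build (scraperwiki, lxml, cssselect, python-dateutil on Python 2.7.14), the f/d/l columns in the data table, and the E1702_SCC_gov_YYYY_Qn identifiers printed during the run — a minimal sketch of what such a spending-data scraper typically looks like is given below. The publisher URL, CSS selector and period parsing are assumptions for illustration, not the actual code.

    # -*- coding: utf-8 -*-
    # Sketch only (Python 2, matching the python-2.7.14 build above).
    # Real URL, selector and period parsing are hypothetical.
    import datetime

    import scraperwiki
    import lxml.html

    ENTITY = 'E1702_SCC_gov'  # prefix seen in the run output and data table
    INDEX_URL = 'http://example.gov.uk/spending-data/'  # hypothetical publisher page

    html = scraperwiki.scrape(INDEX_URL)
    root = lxml.html.fromstring(html)

    for link in root.cssselect('a'):
        url = link.get('href')
        if not url or not url.lower().endswith('.csv'):
            continue

        # Hypothetical: derive the period (e.g. 2017_Q3) from the link text.
        period = link.text_content().strip().replace(' ', '_')
        filename = '%s_%s' % (ENTITY, period)
        print filename  # mirrors the identifiers printed in the run log

        scraperwiki.sqlite.save(
            unique_keys=['l'],
            data={'l': url,                            # link to the CSV file
                  'f': filename,                       # file identifier
                  'd': str(datetime.datetime.now())},  # time the row was scraped
        )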