Contributors: blablupcom, woodbine

Last run failed with status code 1.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Installing requirements with pip
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 1))
Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 2))
Downloading https://files.pythonhosted.org/packages/63/c7/4f2a2a4ad6c6fa99b14be6b3c1cece9142e2d915aa7c43c908677afc8fa4/lxml-3.4.4.tar.gz (3.5MB)
Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 3))
Downloading https://files.pythonhosted.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz
Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 4))
Downloading https://files.pythonhosted.org/packages/a6/29/bcbd41a916ad3faf517780a0af7d0254e8d6722ff6414723eedba4334531/beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
Collecting python-dateutil (from -r /tmp/build/requirements.txt (line 5))
Downloading https://files.pythonhosted.org/packages/cf/f5/af2b09c957ace60dcfac112b669c45c8c97e32f94aa8b56da4c6d1682825/python_dateutil-2.7.3-py2.py3-none-any.whl (211kB)
Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/15/27/3330a343de80d6849545b6c7723f8c9a08b4b104de964ac366e7e6b318df/dumptruck-0.1.6.tar.gz
Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/49/df/50aa1999ab9bde74656c2919d9c0c085fd2b3775fd3eca826012bef76d8c/requests-2.18.4-py2.py3-none-any.whl (88kB)
Collecting six>=1.5 (from python-dateutil->-r /tmp/build/requirements.txt (line 5))
Downloading https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl
Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/27/cc/6dd9a3869f15c2edfab863b992838277279ce92663d334df9ecf5106f5c6/idna-2.6-py2.py3-none-any.whl (56kB)
Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/63/cb/6965947c13a94236f6d4b8223e21beb4d576dc72e8130bd7880f600839b8/urllib3-1.22-py2.py3-none-any.whl (132kB)
Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/7c/e6/92ad559b7192d846975fc916b65f667c7b8c3a32bea7372340bfe9a15fa5/certifi-2018.4.16-py2.py3-none-any.whl (150kB)
Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, lxml, cssselect, beautifulsoup4, six, python-dateutil
Running setup.py install for dumptruck: started
Running setup.py install for dumptruck: finished with status 'done'
Running setup.py develop for scraperwiki
Running setup.py install for lxml: started
Running setup.py install for lxml: still running...
Running setup.py install for lxml: finished with status 'done'
Running setup.py install for cssselect: started
Running setup.py install for cssselect: finished with status 'done'
Successfully installed beautifulsoup4-4.6.0 certifi-2018.4.16 chardet-3.0.4 cssselect-0.9.1 dumptruck-0.1.6 idna-2.6 lxml-3.4.4 python-dateutil-2.7.3 requests-2.18.4 scraperwiki six-1.11.0 urllib3-1.22
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
E2302_BBC_gov_2018_04 E2302_BBC_gov_2018_03 E2302_BBC_gov_2018_02 E2302_BBC_gov_2018_01
E2302_BBC_gov_2017_12 E2302_BBC_gov_2017_11 E2302_BBC_gov_2017_10 E2302_BBC_gov_2017_09 E2302_BBC_gov_2017_08 E2302_BBC_gov_2017_07 E2302_BBC_gov_2017_06 E2302_BBC_gov_2017_05 E2302_BBC_gov_2017_04 E2302_BBC_gov_2017_03 E2302_BBC_gov_2017_02 E2302_BBC_gov_2017_01
E2302_BBC_gov_2016_12 E2302_BBC_gov_2016_12 E2302_BBC_gov_2016_11 E2302_BBC_gov_2016_10 E2302_BBC_gov_2016_09 E2302_BBC_gov_2016_08 E2302_BBC_gov_2016_07 E2302_BBC_gov_2016_06 E2302_BBC_gov_2016_05 E2302_BBC_gov_2016_04 E2302_BBC_gov_2016_03 E2302_BBC_gov_2016_02 E2302_BBC_gov_2016_01
E2302_BBC_gov_2015_12 E2302_BBC_gov_2015_11 E2302_BBC_gov_2015_10 E2302_BBC_gov_2015_09 E2302_BBC_gov_2015_08 E2302_BBC_gov_2015_07 E2302_BBC_gov_2015_06 E2302_BBC_gov_2015_05 E2302_BBC_gov_2015_04
Traceback (most recent call last):
  File "scraper.py", line 137, in <module>
    valid = validate(filename, file_url)
  File "scraper.py", line 65, in validate
    print filename, "*Error: Invalid filename*"
UnicodeEncodeError: 'ascii' codec can't encode character u'\xa0' in position 19: ordinal not in range(128)
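The failure is the UnicodeEncodeError at the end of the traceback: under Python 2, the bare print statement in validate() encodes a unicode filename containing a non-breaking space (u'\xa0') with the default ASCII codec and raises. A minimal sketch of a workaround, assuming Python 2.7 and a unicode filename; report_invalid() and the example value are hypothetical, only the failing print statement comes from the traceback above:

# -*- coding: utf-8 -*-
# Minimal sketch (Python 2.7): encode unicode explicitly before printing so the
# default ASCII codec is never used. report_invalid() is a hypothetical helper.

def report_invalid(filename):
    # filename is a unicode object; encode it to UTF-8 bytes for stdout.
    print filename.encode('utf-8'), "*Error: Invalid filename*"

# Hypothetical example: a filename containing the non-breaking space (u'\xa0')
# that triggered the UnicodeEncodeError in the last run.
report_invalid(u'E2302_BBC_gov\xa02018_05')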

Data

Downloaded 507 times by SimKennedy, MikeRalphson, woodbine

Download the table (as CSV), download the SQLite database (34 KB), or use the API.

rows 10 / 74

d                            l    f
2016-03-07 11:25:29.634654        E2302_BBC_gov_2014_03
2016-03-07 11:25:32.862584        E2302_BBC_gov_2014_02
2016-03-07 11:25:35.670537        E2302_BBC_gov_2014_01
2016-03-07 11:25:38.489633        E2302_BBC_gov_2013_12
2016-03-07 11:25:41.421019        E2302_BBC_gov_2013_11
2016-03-07 11:25:44.339154        E2302_BBC_gov_2013_10
2016-03-07 11:25:47.247895        E2302_BBC_gov_2013_09
2016-03-07 11:25:50.161141        E2302_BBC_gov_2013_08
2016-03-07 11:25:53.078121        E2302_BBC_gov_2013_07
2016-03-07 11:25:58.279677        E2302_BBC_gov_2013_06
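The same rows can also be pulled programmatically via the morph.io API mentioned above. A minimal sketch, assuming the standard morph.io endpoint format, the default "data" table with the d/l/f columns shown in the table, a scraper path of <owner>/sp_E2302_BBC_gov (owner is a placeholder), and an API key in the MORPH_API_KEY environment variable:

import os
import requests

# Hypothetical scraper path: replace <owner> with the actual morph.io account.
API_URL = "https://api.morph.io/<owner>/sp_E2302_BBC_gov/data.json"
params = {
    "key": os.environ.get("MORPH_API_KEY", ""),
    "query": "select d, l, f from data order by d desc limit 10",
}

response = requests.get(API_URL, params=params)
response.raise_for_status()
for row in response.json():
    print row["d"], row["f"]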

Statistics

Average successful run time: 4 minutes

Total run time: 2 days

Total cpu time used: 42 minutes

Total disk space used: 64.7 KB

History

  • Auto ran revision 80855b08 and failed.
    38 records added, 38 records removed in the database
    42 pages scraped
  • Auto ran revision 80855b08 and failed.
    38 records added, 38 records removed in the database
  • Auto ran revision 80855b08 and failed.
    38 records added, 38 records removed in the database
    43 pages scraped
  • Auto ran revision 80855b08 and failed.
    38 records added, 38 records removed in the database
  • Auto ran revision 80855b08 and failed.
    38 records added, 38 records removed in the database
  • ...
  • Created on morph.io

Scraper code

Python

sp_E2302_BBC_gov / scraper.py