Contributors: blablupcom, woodbine

The scraper is running. It was queued automatically.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.14
-----> Installing pip
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 1))
       Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 2))
       Downloading https://files.pythonhosted.org/packages/63/c7/4f2a2a4ad6c6fa99b14be6b3c1cece9142e2d915aa7c43c908677afc8fa4/lxml-3.4.4.tar.gz (3.5MB)
       Collecting cssselect==0.9.1 (from -r /tmp/build/requirements.txt (line 3))
       Downloading https://files.pythonhosted.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 4))
       Downloading https://files.pythonhosted.org/packages/a6/29/bcbd41a916ad3faf517780a0af7d0254e8d6722ff6414723eedba4334531/beautifulsoup4-4.6.0-py2-none-any.whl (86kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/15/27/3330a343de80d6849545b6c7723f8c9a08b4b104de964ac366e7e6b318df/dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/49/df/50aa1999ab9bde74656c2919d9c0c085fd2b3775fd3eca826012bef76d8c/requests-2.18.4-py2.py3-none-any.whl (88kB)
       Collecting idna<2.7,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/27/cc/6dd9a3869f15c2edfab863b992838277279ce92663d334df9ecf5106f5c6/idna-2.6-py2.py3-none-any.whl (56kB)
       Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/63/cb/6965947c13a94236f6d4b8223e21beb4d576dc72e8130bd7880f600839b8/urllib3-1.22-py2.py3-none-any.whl (132kB)
       Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/7c/e6/92ad559b7192d846975fc916b65f667c7b8c3a32bea7372340bfe9a15fa5/certifi-2018.4.16-py2.py3-none-any.whl (150kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 1))
       Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Installing collected packages: dumptruck, idna, urllib3, certifi, chardet, requests, scraperwiki, lxml, cssselect, beautifulsoup4
       Running setup.py install for dumptruck: started
       Running setup.py install for dumptruck: finished with status 'done'
       Running setup.py develop for scraperwiki
       Running setup.py install for lxml: started
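The repository's requirements.txt can be read back from this install log. A reconstruction follows; the editable -e form for the scraperwiki line is inferred from pip's "Obtaining" and "setup.py develop" messages, so treat that detail as an assumption:

    # requirements.txt (reconstructed from the build log above)
    -e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
    lxml==3.4.4
    cssselect==0.9.1
    beautifulsoup4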

Data

Downloaded 529 times by SimKennedy, MikeRalphson, and woodbine


The data can be downloaded as a CSV table or as an SQLite database (22 KB), or queried through the API.
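morph.io serves scraper data over an HTTP API that takes an SQL query plus a personal API key. A minimal sketch using requests; the scraper slug sp_NHTRX5NFT_GWASNT_gov comes from this page, while the owner name woodbine and YOUR_API_KEY are assumed placeholders:

    import requests

    # morph.io data API: https://api.morph.io/[owner]/[scraper]/data.[format]
    url = "https://api.morph.io/woodbine/sp_NHTRX5NFT_GWASNT_gov/data.json"
    params = {
        "key": "YOUR_API_KEY",                  # issued after signing in to morph.io
        "query": "select * from data limit 10", # 'data' is morph.io's default table name
    }
    print(requests.get(url, params=params).json())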

Showing 10 of 78 rows

l   d                            f
    2017-04-24 12:32:32.191792   NHTRX5NFT_GWASNT_gov_2012_01
    2017-04-24 12:32:33.191431   NHTRX5NFT_GWASNT_gov_2012_02
    2017-04-24 12:32:34.192912   NHTRX5NFT_GWASNT_gov_2012_03
    2017-04-24 12:32:35.251087   NHTRX5NFT_GWASNT_gov_2012_04
    2017-04-24 12:32:36.444517   NHTRX5NFT_GWASNT_gov_2012_05
    2017-04-24 12:32:37.400655   NHTRX5NFT_GWASNT_gov_2012_06
    2017-04-24 12:32:38.428027   NHTRX5NFT_GWASNT_gov_2012_07
    2017-04-24 12:32:39.441125   NHTRX5NFT_GWASNT_gov_2012_08
    2017-04-24 12:32:40.643104   NHTRX5NFT_GWASNT_gov_2012_09
    2017-04-24 12:32:42.695532   NHTRX5NFT_GWASNT_gov_2012_11
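After downloading the SQLite database, the same rows can be inspected locally. A minimal sketch with Python's built-in sqlite3 module, assuming morph.io's default table name data, its usual download filename data.sqlite, and the column names l, d, f from the header above:

    import sqlite3

    # Open the downloaded database file.
    conn = sqlite3.connect("data.sqlite")

    # Pull the first ten rows, mirroring the preview above.
    for l, d, f in conn.execute("SELECT l, d, f FROM data ORDER BY d LIMIT 10"):
        print(d, f, l)

    conn.close()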

Statistics

Average successful run time: 5 minutes

Total run time: 1 day

Total CPU time used: 8 minutes

Total disk space used: 51.2 KB

History

  • Auto ran revision 6fa7e410 and failed.
    50 records added, 50 records removed in the database
    163 pages scraped
  • Auto ran revision 6fa7e410 and failed.
    50 records added, 50 records removed in the database
    163 pages scraped
  • Auto ran revision 6fa7e410 and failed.
    50 records added, 50 records removed in the database
  • Auto ran revision 6fa7e410 and failed.
    50 records added, 50 records removed in the database
    163 pages scraped
  • Auto ran revision 6fa7e410 and failed.
    50 records added, 50 records removed in the database
  • ...
  • Created on morph.io


Scraper code

Python

sp_NHTRX5NFT_GWASNT_gov / scraper.py
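The scraper.py source itself is not reproduced on this page. As an illustration only, here is a minimal sketch of what a morph.io scraper with these dependencies and this l/d/f schema typically looks like; the target URL, the CSV-link filter, and the label derivation are hypothetical stand-ins, not the author's actual code:

    import datetime

    import requests
    import scraperwiki
    from bs4 import BeautifulSoup

    # Hypothetical publication page; the real URL lives in scraper.py.
    url = "http://example.gov.uk/transparency/spending"

    html = requests.get(url).text
    soup = BeautifulSoup(html, "lxml")

    # Collect links to published data files and save one row per file.
    for a in soup.find_all("a", href=True):
        link = a["href"]
        if not link.lower().endswith(".csv"):
            continue
        record = {
            "l": link,                          # link to the published file
            "d": str(datetime.datetime.now()),  # timestamp of this scrape
            "f": link.rsplit("/", 1)[-1],       # file label; the real scraper
                                                # builds codes like
                                                # NHTRX5NFT_GWASNT_gov_2012_01
        }
        scraperwiki.sqlite.save(unique_keys=["f"], data=record)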