otherchirps / orange_city_council

Orange City Council Development Applications

Scrapes ecouncil.orange.nsw.gov.au


Development applications for PlanningAlerts.

Contributors otherchirps

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.6
$ pip install -r requirements.txt
Collecting lxml==3.4.4 (from -r requirements.txt (line 1))
  /app/.heroku/python/lib/python2.7/site-packages/pip-8.1.2-py2.7.egg/pip/_vendor/requests/packages/urllib3/util/ssl_.py:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.
  SNIMissingWarning
  /app/.heroku/python/lib/python2.7/site-packages/pip-8.1.2-py2.7.egg/pip/_vendor/requests/packages/urllib3/util/ssl_.py:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
  InsecurePlatformWarning
  Downloading lxml-3.4.4.tar.gz (3.5MB)
Collecting scraperwiki==0.4.1 (from -r requirements.txt (line 2))
  Downloading scraperwiki-0.4.1.tar.gz
Collecting Scrapy==0.24.6 (from -r requirements.txt (line 3))
  Downloading Scrapy-0.24.6-py2-none-any.whl (444kB)
Collecting service-identity==14.0.0 (from -r requirements.txt (line 4))
  Downloading service_identity-14.0.0-py2.py3-none-any.whl
Collecting requests (from scraperwiki==0.4.1->-r requirements.txt (line 2))
  Downloading requests-2.11.1-py2.py3-none-any.whl (514kB)
Collecting sqlalchemy (from scraperwiki==0.4.1->-r requirements.txt (line 2))
  Downloading SQLAlchemy-1.1.1.tar.gz (5.1MB)
Collecting alembic (from scraperwiki==0.4.1->-r requirements.txt (line 2))
  Downloading alembic-0.8.8.tar.gz (970kB)
Collecting pyOpenSSL (from Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading pyOpenSSL-16.1.0-py2.py3-none-any.whl (43kB)
Collecting w3lib>=1.8.0 (from Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading w3lib-1.15.0-py2.py3-none-any.whl
Collecting Twisted>=10.0.0 (from Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading Twisted-16.4.1.tar.bz2 (3.0MB)
Collecting queuelib (from Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading queuelib-1.4.2-py2.py3-none-any.whl
Collecting six>=1.5.2 (from Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading six-1.10.0-py2.py3-none-any.whl
Collecting cssselect>=0.9 (from Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading cssselect-0.9.2-py2.py3-none-any.whl
Collecting pyasn1-modules (from service-identity==14.0.0->-r requirements.txt (line 4))
  Downloading pyasn1_modules-0.0.8-py2.py3-none-any.whl
Collecting characteristic>=14.0.0 (from service-identity==14.0.0->-r requirements.txt (line 4))
  Downloading characteristic-14.3.0-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity==14.0.0->-r requirements.txt (line 4))
  Downloading pyasn1-0.1.9-py2.py3-none-any.whl
Collecting Mako (from alembic->scraperwiki==0.4.1->-r requirements.txt (line 2))
  Downloading Mako-1.0.4.tar.gz (574kB)
Collecting python-editor>=0.3 (from alembic->scraperwiki==0.4.1->-r requirements.txt (line 2))
  Downloading python-editor-1.0.1.tar.gz
Collecting cryptography>=1.3.4 (from pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading cryptography-1.5.2.tar.gz (400kB)
Collecting zope.interface>=3.6.0 (from Twisted>=10.0.0->Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading zope.interface-4.3.2.tar.gz (143kB)
Collecting MarkupSafe>=0.9.2 (from Mako->alembic->scraperwiki==0.4.1->-r requirements.txt (line 2))
  Downloading MarkupSafe-0.23.tar.gz
Collecting idna>=2.0 (from cryptography>=1.3.4->pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading idna-2.1-py2.py3-none-any.whl (54kB)
Collecting enum34 (from cryptography>=1.3.4->pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading enum34-1.1.6-py2-none-any.whl
Collecting ipaddress (from cryptography>=1.3.4->pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading ipaddress-1.0.17-py2-none-any.whl
Collecting cffi>=1.4.1 (from cryptography>=1.3.4->pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading cffi-1.8.3-cp27-cp27m-manylinux1_x86_64.whl (388kB)
Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3.4->pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
  Downloading pycparser-2.14.tar.gz (223kB)
Installing collected packages: lxml, requests, sqlalchemy, MarkupSafe, Mako, python-editor, alembic, scraperwiki, six, idna, pyasn1, enum34, ipaddress, pycparser, cffi, cryptography, pyOpenSSL, w3lib, zope.interface, Twisted, queuelib, cssselect, Scrapy, pyasn1-modules, characteristic, service-identity
  Running setup.py install for lxml: started
  Running setup.py install for lxml: still running...
  Running setup.py install for lxml: finished with status 'done'
  Running setup.py install for sqlalchemy: started
  Running setup.py install for sqlalchemy: finished with status 'done'
  Running setup.py install for MarkupSafe: started
  Running setup.py install for MarkupSafe: finished with status 'done'
  Running setup.py install for Mako: started
  Running setup.py install for Mako: finished with status 'done'
  Running setup.py install for python-editor: started
  Running setup.py install for python-editor: finished with status 'done'
  Running setup.py install for alembic: started
  Running setup.py install for alembic: finished with status 'done'
  Running setup.py install for scraperwiki: started
  Running setup.py install for scraperwiki: finished with status 'done'
  Running setup.py install for pycparser: started
  Running setup.py install for pycparser: finished with status 'done'
  Running setup.py install for cryptography: started
  Running setup.py install for cryptography: finished with status 'done'
  Running setup.py install for zope.interface: started
  Running setup.py install for zope.interface: finished with status 'done'
  Running setup.py install for Twisted: started
  Running setup.py install for Twisted: finished with status 'done'
Successfully installed Mako-1.0.4 MarkupSafe-0.23 Scrapy-0.24.6 Twisted-16.4.1 alembic-0.8.8 cffi-1.8.3 characteristic-14.3.0 cryptography-1.5.2 cssselect-0.9.2 enum34-1.1.6 idna-2.1 ipaddress-1.0.17 lxml-3.4.4 pyOpenSSL-16.1.0 pyasn1-0.1.9 pyasn1-modules-0.0.8 pycparser-2.14 python-editor-1.0.1 queuelib-1.4.2 requests-2.11.1 scraperwiki-0.4.1 service-identity-14.0.0 six-1.10.0 sqlalchemy-1.1.1 w3lib-1.15.0 zope.interface-4.3.2
 ! Hello! It looks like your application is using an outdated version of Python.
 ! This caused the security warning you saw above during the 'pip install' step.
 ! We recommend 'python-2.7.12', which you can specify in a 'runtime.txt' file.
 ! -- Much Love, Heroku.
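The buildpack's closing note suggests pinning the interpreter with a runtime.txt file at the repository root; morph.io uses the same Heroku-style buildpack mechanism. A sketch of that fix, with the version string taken from the message above:

```shell
# Pin the Python runtime so the buildpack stops defaulting to the outdated 2.7.6
echo "python-2.7.12" > runtime.txt
cat runtime.txt
```

On the next run the buildpack would install the pinned version instead of its default.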
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
2016-10-13 03:55:13+0000 [planningalerts] INFO: Searching for requests between 2016-10-01 and 2016-10-31
2016-10-13 03:55:21+0000 [planningalerts] INFO: Saved 346/2016
2016-10-13 03:55:21+0000 [planningalerts] INFO: Saved 335/2016
2016-10-13 03:55:21+0000 [planningalerts] INFO: Saved 263/2013
2016-10-13 03:55:21+0000 [planningalerts] INFO: Saved 86/2016
2016-10-13 03:55:21+0000 [planningalerts] INFO: Saved 344/2016
2016-10-13 03:55:21+0000 [planningalerts] INFO: Saved 336/2016
2016-10-13 03:55:21+0000 [planningalerts] INFO: Saved 347/2016
2016-10-13 03:55:21+0000 [planningalerts] INFO: Saved 338/2016
2016-10-13 03:55:23+0000 [planningalerts] INFO: Saved 345/2016
2016-10-13 03:55:23+0000 [planningalerts] INFO: Saved 340/2016
2016-10-13 03:55:24+0000 [planningalerts] INFO: Saved 343/2016
2016-10-13 03:55:24+0000 [planningalerts] INFO: Saved 348/2016
2016-10-13 03:55:24+0000 [planningalerts] INFO: Saved 341/2016
2016-10-13 03:55:24+0000 [planningalerts] INFO: Saved 339/2016
2016-10-13 03:55:24+0000 [planningalerts] INFO: Saved 342/2016
2016-10-13 03:55:24+0000 [planningalerts] INFO: Saved 349/2016
2016-10-13 03:55:25+0000 [planningalerts] INFO: Saved 337/2016
2016-10-13 03:55:25+0000 [planningalerts] INFO: Closing spider (finished)
2016-10-13 03:55:25+0000 [planningalerts] INFO: Dumping Scrapy stats:
    {'downloader/request_bytes': 11909,
     'downloader/request_count': 19,
     'downloader/request_method_count/GET': 19,
     'downloader/response_bytes': 380253,
     'downloader/response_count': 19,
     'downloader/response_status_count/200': 19,
     'finish_reason': 'finished',
     'finish_time': datetime.datetime(2016, 10, 13, 3, 55, 25, 313361),
     'item_scraped_count': 17,
     'memdebug/gc_garbage_count': 0,
     'memdebug/live_refs/PlanningalertsSpider': 1,
     'memusage/max': 50991104,
     'memusage/startup': 50991104,
     'request_depth_max': 2,
     'response_received_count': 19,
     'scheduler/dequeued': 19,
     'scheduler/dequeued/memory': 19,
     'scheduler/enqueued': 19,
     'scheduler/enqueued/memory': 19,
     'start_time': datetime.datetime(2016, 10, 13, 3, 55, 11, 943008)}
2016-10-13 03:55:25+0000 [planningalerts] INFO: Spider closed (finished)
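The run log's first line shows the spider searching the current calendar month (2016-10-01 to 2016-10-31). A small standard-library sketch of computing that window (the helper name is illustrative; the scraper's actual implementation isn't shown on this page):

```python
import calendar
import datetime

def month_window(today=None):
    """Return (first_day, last_day) of today's month, e.g. the
    2016-10-01 .. 2016-10-31 range seen in the run log above."""
    today = today or datetime.date.today()
    # monthrange() gives (weekday_of_first_day, number_of_days)
    last_day = calendar.monthrange(today.year, today.month)[1]
    return today.replace(day=1), today.replace(day=last_day)
```

For instance, `month_window(datetime.date(2016, 10, 13))` yields the October 2016 range logged above.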

Data

Downloaded 2 times, by henare and MikeRalphson


Download the table (as CSV), download the SQLite database (41 KB), or use the API
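The data API mentioned above accepts an SQL query against the scraper's SQLite database. A minimal sketch of building such a request URL (the API key value is a placeholder; morph.io issues a real one per account):

```python
from urllib.parse import urlencode

# morph.io data endpoint for this scraper
BASE = "https://api.morph.io/otherchirps/orange_city_council/data.json"

def api_url(query, key="YOUR_API_KEY"):
    # The API takes an SQL query plus a personal API key; a list of
    # pairs keeps the query-string parameter order stable.
    return BASE + "?" + urlencode([("query", query), ("key", key)])
```

For example, `api_url("select council_reference from data limit 1")` produces a URL fetchable with any HTTP client.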

Showing 10 of 106 rows

info_url external_reference comment_url date_received description date_scraped address council_reference
PR19429
mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20188/2015
2015-06-09
Dwelling and Attached Garage
2015-06-14
24 Nicholls Lane HUNTLEY NSW 2800
188/2015

PR20497
mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20501/2007
2015-06-09
Housing for Seniors or People With a Disability (including a community centre, an indoor pool and a bowling green)
2015-06-14
109 Ploughmans Lane ORANGE NSW 2800
501/2007

PR8762
mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20192/2015
2015-06-10
Subdivision (two lot residential)
2015-06-14
152 Moulder Street ORANGE NSW 2800; 80 Wade Place ORANGE NSW 2800
192/2015

PR11066
mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20186/2015
2015-06-03
Dwelling Alterations & Additions (carport)
2015-06-14
106 Sieben Drive ORANGE NSW 2800
186/2015

PR26881
mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20182/2015
2015-06-02
Dwelling and Attached Garage
2015-06-14
19 Taloumbi Place ORANGE NSW 2800
182/2015

PR7592
mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20193/2015
2015-06-12
Garages/Outbuildings
2015-06-14
96 March Street ORANGE NSW 2800
193/2015

PR26767
mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20194/2015
2015-06-12
Garages/Outbuildings
2015-06-14
62 Valencia Drive ORANGE NSW 2800
194/2015

PR12317
mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20187/2015
2015-06-04
Shed
2015-06-14
19 Wallace Lane CANOBOLAS NSW 2800
187/2015

PR26705
mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20183/2015
2015-06-03
Dwelling and Attached Garage
2015-06-14
11 Japonica Place ORANGE NSW 2800
183/2015

PR26739
mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20191/2015
2015-06-10
Subdivision (two lot residential) and Dwelling Houses (two)
2015-06-14
29 Dimboola Way ORANGE NSW 2800
191/2015
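The comment_url values above are mailto links whose subject line percent-encodes the council reference. A small sketch of how such a link can be produced (the helper name is illustrative, not the scraper's actual code):

```python
from urllib.parse import quote

def comment_mailto(council_reference):
    # quote() leaves "/" unescaped by default, matching the
    # "188/2015"-style references in the table above; spaces
    # become %20 and ":" becomes %3A.
    subject = quote("Development Application Enquiry: %s" % council_reference)
    return "mailto:council@orange.nsw.gov.au?subject=%s" % subject
```

`comment_mailto("188/2015")` reproduces the first record's comment_url.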

Statistics

Average successful run time: 2 minutes

Total run time: 18 minutes

Total cpu time used: half a minute

Total disk space used: 72.9 KB

History

  • Manually ran revision ba090094 and completed successfully.
    17 records added in the database
    17 pages scraped
  • Manually ran revision ba090094 and completed successfully.
    nothing changed in the database
    1 page scraped
  • Manually ran revision ba090094 and completed successfully.
    30 records added in the database
  • Manually ran revision ba090094 and completed successfully.
    3 records added in the database
    41 pages scraped
  • Manually ran revision ba090094 and completed successfully.
    nothing changed in the database
    38 pages scraped
  • Manually ran revision ba090094 and completed successfully.
    36 records added in the database
    38 pages scraped
  • Auto ran revision ba090094 and completed successfully.
    1 record added in the database
    22 pages scraped
  • Manually ran revision ba090094 and completed successfully.
    19 records added in the database
    21 pages scraped
  • Auto ran revision 1af813d3 and completed successfully.
    nothing changed in the database
    21 pages scraped
  • Manually ran revision 1af813d3 and completed successfully.
    19 records added in the database
    21 pages scraped
  • Created on morph.io

Scraper code

Python

orange_city_council / scraper.py
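The scraper's source isn't reproduced in this capture, but the storage side of a morph.io scraper follows a fixed convention: records go into a table named "data" in an SQLite file called data.sqlite. A minimal standard-library sketch of that convention (not the actual scraper.py, which uses scraperwiki and Scrapy per the build log; field names follow the data table above):

```python
import sqlite3

# PlanningAlerts-style fields, as listed in the data table above
FIELDS = ("council_reference", "address", "description",
          "comment_url", "date_received", "date_scraped")

def save_application(conn, record):
    """Upsert one development application into morph.io's 'data' table,
    keyed on council_reference so re-runs don't duplicate rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS data (%s, PRIMARY KEY (council_reference))"
        % ", ".join("%s TEXT" % f for f in FIELDS))
    conn.execute(
        "INSERT OR REPLACE INTO data (%s) VALUES (%s)"
        % (", ".join(FIELDS), ", ".join("?" * len(FIELDS))),
        [record[f] for f in FIELDS])
    conn.commit()
```

In the real scraper the connection would point at data.sqlite on disk; an in-memory database works the same way for experimenting.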