otherchirps / orange_city_council

Orange City Council Development Applications


Development applications for PlanningAlerts.

Contributors: otherchirps

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-2.7.6
$ pip install -r requirements.txt
Collecting lxml==3.4.4 (from -r requirements.txt (line 1))
/app/.heroku/python/lib/python2.7/site-packages/pip-8.1.2-py2.7.egg/pip/_vendor/requests/packages/urllib3/util/ssl_.py:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.
SNIMissingWarning
/app/.heroku/python/lib/python2.7/site-packages/pip-8.1.2-py2.7.egg/pip/_vendor/requests/packages/urllib3/util/ssl_.py:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
Downloading lxml-3.4.4.tar.gz (3.5MB)
Collecting scraperwiki==0.4.1 (from -r requirements.txt (line 2))
Downloading scraperwiki-0.4.1.tar.gz
Collecting Scrapy==0.24.6 (from -r requirements.txt (line 3))
Downloading Scrapy-0.24.6-py2-none-any.whl (444kB)
Collecting service-identity==14.0.0 (from -r requirements.txt (line 4))
Downloading service_identity-14.0.0-py2.py3-none-any.whl
Collecting requests (from scraperwiki==0.4.1->-r requirements.txt (line 2))
Downloading requests-2.13.0-py2.py3-none-any.whl (584kB)
Collecting sqlalchemy (from scraperwiki==0.4.1->-r requirements.txt (line 2))
Downloading SQLAlchemy-1.1.6.tar.gz (5.2MB)
Collecting alembic (from scraperwiki==0.4.1->-r requirements.txt (line 2))
Downloading alembic-0.9.0.tar.gz (998kB)
Collecting cssselect>=0.9 (from Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading cssselect-1.0.1-py2.py3-none-any.whl
Collecting w3lib>=1.8.0 (from Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading w3lib-1.17.0-py2.py3-none-any.whl
Collecting Twisted>=10.0.0 (from Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading Twisted-17.1.0.tar.bz2 (3.0MB)
Collecting queuelib (from Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading queuelib-1.4.2-py2.py3-none-any.whl
Collecting six>=1.5.2 (from Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading six-1.10.0-py2.py3-none-any.whl
Collecting pyOpenSSL (from Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading pyOpenSSL-16.2.0-py2.py3-none-any.whl (43kB)
Collecting pyasn1-modules (from service-identity==14.0.0->-r requirements.txt (line 4))
Downloading pyasn1_modules-0.0.8-py2.py3-none-any.whl
Collecting characteristic>=14.0.0 (from service-identity==14.0.0->-r requirements.txt (line 4))
Downloading characteristic-14.3.0-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity==14.0.0->-r requirements.txt (line 4))
Downloading pyasn1-0.2.3-py2.py3-none-any.whl (53kB)
Collecting Mako (from alembic->scraperwiki==0.4.1->-r requirements.txt (line 2))
Downloading Mako-1.0.6.tar.gz (575kB)
Collecting python-editor>=0.3 (from alembic->scraperwiki==0.4.1->-r requirements.txt (line 2))
Downloading python-editor-1.0.3.tar.gz
Collecting zope.interface>=3.6.0 (from Twisted>=10.0.0->Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading zope.interface-4.3.3.tar.gz (150kB)
Collecting constantly>=15.1 (from Twisted>=10.0.0->Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading constantly-15.1.0-py2.py3-none-any.whl
Collecting incremental>=16.10.1 (from Twisted>=10.0.0->Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading incremental-16.10.1-py2.py3-none-any.whl
Collecting Automat>=0.3.0 (from Twisted>=10.0.0->Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading Automat-0.5.0-py2.py3-none-any.whl
Collecting cryptography>=1.3.4 (from pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading cryptography-1.7.2.tar.gz (420kB)
Collecting MarkupSafe>=0.9.2 (from Mako->alembic->scraperwiki==0.4.1->-r requirements.txt (line 2))
Downloading MarkupSafe-0.23.tar.gz
Collecting attrs (from Automat>=0.3.0->Twisted>=10.0.0->Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading attrs-16.3.0-py2.py3-none-any.whl
Collecting idna>=2.0 (from cryptography>=1.3.4->pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading idna-2.3-py2.py3-none-any.whl (55kB)
Collecting enum34 (from cryptography>=1.3.4->pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading enum34-1.1.6-py2-none-any.whl
Collecting ipaddress (from cryptography>=1.3.4->pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading ipaddress-1.0.18-py2-none-any.whl
Collecting cffi>=1.4.1 (from cryptography>=1.3.4->pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading cffi-1.9.1-cp27-cp27m-manylinux1_x86_64.whl (389kB)
Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3.4->pyOpenSSL->Scrapy==0.24.6->-r requirements.txt (line 3))
Downloading pycparser-2.17.tar.gz (231kB)
Installing collected packages: lxml, requests, sqlalchemy, MarkupSafe, Mako, python-editor, alembic, scraperwiki, cssselect, six, w3lib, zope.interface, constantly, incremental, attrs, Automat, Twisted, queuelib, idna, pyasn1, enum34, ipaddress, pycparser, cffi, cryptography, pyOpenSSL, Scrapy, pyasn1-modules, characteristic, service-identity
Running setup.py install for lxml: started
Running setup.py install for lxml: still running...
Running setup.py install for lxml: finished with status 'done'
Running setup.py install for sqlalchemy: started
Running setup.py install for sqlalchemy: finished with status 'done'
Running setup.py install for MarkupSafe: started
Running setup.py install for MarkupSafe: finished with status 'done'
Running setup.py install for Mako: started
Running setup.py install for Mako: finished with status 'done'
Running setup.py install for python-editor: started
Running setup.py install for python-editor: finished with status 'done'
Running setup.py install for alembic: started
Running setup.py install for alembic: finished with status 'done'
Running setup.py install for scraperwiki: started
Running setup.py install for scraperwiki: finished with status 'done'
Running setup.py install for zope.interface: started
Running setup.py install for zope.interface: finished with status 'done'
Running setup.py install for Twisted: started
Running setup.py install for Twisted: finished with status 'done'
Running setup.py install for pycparser: started
Running setup.py install for pycparser: finished with status 'done'
Running setup.py install for cryptography: started
Running setup.py install for cryptography: finished with status 'done'
Successfully installed Automat-0.5.0 Mako-1.0.6 MarkupSafe-0.23 Scrapy-0.24.6 Twisted-17.1.0 alembic-0.9.0 attrs-16.3.0 cffi-1.9.1 characteristic-14.3.0 constantly-15.1.0 cryptography-1.7.2 cssselect-1.0.1 enum34-1.1.6 idna-2.3 incremental-16.10.1 ipaddress-1.0.18 lxml-3.4.4 pyOpenSSL-16.2.0 pyasn1-0.2.3 pyasn1-modules-0.0.8 pycparser-2.17 python-editor-1.0.3 queuelib-1.4.2 requests-2.13.0 scraperwiki-0.4.1 service-identity-14.0.0 six-1.10.0 sqlalchemy-1.1.6 w3lib-1.17.0 zope.interface-4.3.3
! Hello! It looks like your application is using an outdated version of Python.
! This caused the security warning you saw above during the 'pip install' step.
! We recommend 'python-2.7.12', which you can specify in a 'runtime.txt' file.
! -- Much Love, Heroku.
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
2017-03-01 00:47:41+0000 [planningalerts] INFO: Searching for requests between 2017-03-01 and 2017-03-31
2017-03-01 00:47:47+0000 [planningalerts] INFO: Saved 55/2017
2017-03-01 00:47:47+0000 [planningalerts] INFO: Closing spider (finished)
2017-03-01 00:47:47+0000 [planningalerts] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 1566,
 'downloader/request_count': 3,
 'downloader/request_method_count/GET': 3,
 'downloader/response_bytes': 62178,
 'downloader/response_count': 3,
 'downloader/response_status_count/200': 3,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2017, 3, 1, 0, 47, 47, 649764),
 'item_scraped_count': 1,
 'memdebug/gc_garbage_count': 0,
 'memdebug/live_refs/PlanningalertsSpider': 1,
 'memusage/max': 51879936,
 'memusage/startup': 51879936,
 'request_depth_max': 2,
 'response_received_count': 3,
 'scheduler/dequeued': 3,
 'scheduler/dequeued/memory': 3,
 'scheduler/enqueued': 3,
 'scheduler/enqueued/memory': 3,
 'start_time': datetime.datetime(2017, 3, 1, 0, 47, 39, 472555)}
2017-03-01 00:47:47+0000 [planningalerts] INFO: Spider closed (finished)
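The pip output above pins four top-level dependencies, so the scraper's requirements.txt can be read straight off the log. A sketch of the two build files is below; the runtime.txt line is an assumption based on the buildpack's recommendation (the build above actually installed python-2.7.6):

requirements.txt (as pinned in the log):

    lxml==3.4.4
    scraperwiki==0.4.1
    Scrapy==0.24.6
    service-identity==14.0.0

runtime.txt (assumed, per the buildpack's hint; the file holds only this one line):

    python-2.7.12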

Data

Downloaded 2 times by MikeRalphson and henare


Download the table as CSV, download the SQLite database (41 KB), or use the API.
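For programmatic access, morph.io serves the scraped table over its API. The snippet below is a minimal sketch: the endpoint path follows morph.io's owner/scraper naming for this repository, the table name "data" is the morph.io default, and the API key is a placeholder you replace with your own.

    import requests  # already a dependency of the scraper

    url = "https://api.morph.io/otherchirps/orange_city_council/data.json"
    params = {
        "key": "YOUR_MORPH_API_KEY",  # placeholder
        "query": "select council_reference, address, description from data limit 10",
    }

    # The API returns the query result as a JSON list of row objects.
    for record in requests.get(url, params=params).json():
        print("%(council_reference)s  %(address)s  %(description)s" % record)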

Showing 10 of 106 rows

| info_url | external_reference | comment_url | date_received | description | date_scraped | address | council_reference |
|  | PR19429 | mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20188/2015 | 2015-06-09 | Dwelling and Attached Garage | 2015-06-14 | 24 Nicholls Lane HUNTLEY NSW 2800 | 188/2015 |
|  | PR20497 | mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20501/2007 | 2015-06-09 | Housing for Seniors or People With a Disability (including a community centre, an indoor pool and a bowling green) | 2015-06-14 | 109 Ploughmans Lane ORANGE NSW 2800 | 501/2007 |
|  | PR8762 | mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20192/2015 | 2015-06-10 | Subdivision (two lot residential) | 2015-06-14 | 152 Moulder Street ORANGE NSW 2800, 80 Wade Place ORANGE NSW 2800 | 192/2015 |
|  | PR11066 | mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20186/2015 | 2015-06-03 | Dwelling Alterations & Additions (carport) | 2015-06-14 | 106 Sieben Drive ORANGE NSW 2800 | 186/2015 |
|  | PR26881 | mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20182/2015 | 2015-06-02 | Dwelling and Attached Garage | 2015-06-14 | 19 Taloumbi Place ORANGE NSW 2800 | 182/2015 |
|  | PR7592 | mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20193/2015 | 2015-06-12 | Garages/Outbuildings | 2015-06-14 | 96 March Street ORANGE NSW 2800 | 193/2015 |
|  | PR26767 | mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20194/2015 | 2015-06-12 | Garages/Outbuildings | 2015-06-14 | 62 Valencia Drive ORANGE NSW 2800 | 194/2015 |
|  | PR12317 | mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20187/2015 | 2015-06-04 | Shed | 2015-06-14 | 19 Wallace Lane CANOBOLAS NSW 2800 | 187/2015 |
|  | PR26705 | mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20183/2015 | 2015-06-03 | Dwelling and Attached Garage | 2015-06-14 | 11 Japonica Place ORANGE NSW 2800 | 183/2015 |
|  | PR26739 | mailto:council@orange.nsw.gov.au?subject=Development%20Application%20Enquiry%3A%20191/2015 | 2015-06-10 | Subdivision (two lot residential) and Dwelling Houses (two) | 2015-06-14 | 29 Dimboola Way ORANGE NSW 2800 | 191/2015 |
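The downloaded SQLite database can also be queried locally with Python's standard sqlite3 module. This is a small sketch under two assumptions: the file is saved as data.sqlite, and the records live in the conventional morph.io table named "data".

    import sqlite3

    # Open the downloaded database (assumed filename) and return rows as dict-like objects.
    conn = sqlite3.connect("data.sqlite")
    conn.row_factory = sqlite3.Row

    rows = conn.execute(
        "SELECT council_reference, date_received, description, address "
        "FROM data ORDER BY date_received DESC LIMIT 10"
    )
    for row in rows:
        print("%s  %s  %s" % (row["council_reference"], row["date_received"], row["address"]))

    conn.close()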

Statistics

Average successful run time: 2 minutes

Total run time: 21 minutes

Total CPU time used: half a minute

Total disk space used: 73.1 KB

History

  • Manually ran revision ba090094 and completed successfully.
    nothing changed in the database
  • Manually ran revision ba090094 and completed successfully.
    17 records added in the database
    17 pages scraped
  • Manually ran revision ba090094 and completed successfully.
    nothing changed in the database
    1 page scraped
  • Manually ran revision ba090094 and completed successfully.
    30 records added in the database
  • Manually ran revision ba090094 and completed successfully.
    3 records added in the database
    41 pages scraped
  • Manually ran revision ba090094 and completed successfully.
    nothing changed in the database
    38 pages scraped
  • Manually ran revision ba090094 and completed successfully.
    36 records added in the database
    38 pages scraped
  • Auto ran revision ba090094 and completed successfully.
    1 record added in the database
    22 pages scraped
  • Manually ran revision ba090094 and completed successfully.
    19 records added in the database
    21 pages scraped
  • Auto ran revision 1af813d3 and completed successfully.
    nothing changed in the database
    21 pages scraped
  • Manually ran revision 1af813d3 and completed successfully.
    19 records added in the database
    21 pages scraped
  • Created on morph.io

Scraper code

Python

orange_city_council / scraper.py
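The scraper source itself is not reproduced in this listing. Purely as an illustrative sketch (not the repository's actual scraper.py), a Scrapy 0.24-era spider that searches the current month and upserts each application via scraperwiki might be shaped roughly like this; the search URL, CSS selectors, and page structure are hypothetical placeholders, and only the field names match the data table above.

    # Illustrative sketch only: not the repository's actual scraper.py.
    import calendar
    import datetime

    import scrapy        # Scrapy 0.24-era shortcuts (scrapy.Spider, scrapy.Request)
    import scraperwiki   # scraperwiki.sqlite.save() writes morph.io's "data" table


    class PlanningalertsSpider(scrapy.Spider):
        name = "planningalerts"

        def start_requests(self):
            # Search the current calendar month, as in the console output above.
            today = datetime.date.today()
            first = today.replace(day=1)
            last = today.replace(day=calendar.monthrange(today.year, today.month)[1])
            self.log("Searching for requests between %s and %s" % (first, last))

            # Hypothetical endpoint; the real search URL lives in scraper.py.
            url = ("http://example.orange.nsw.gov.au/applications"
                   "?from=%s&to=%s" % (first, last))
            yield scrapy.Request(url, callback=self.parse)

        def parse(self, response):
            # Hypothetical markup: one table row per development application.
            for row in response.css("table.results tr.application"):
                record = {
                    "council_reference": row.css("td.reference::text").extract()[0],
                    "address": row.css("td.address::text").extract()[0],
                    "description": row.css("td.description::text").extract()[0],
                    "date_received": row.css("td.lodged::text").extract()[0],
                    "date_scraped": datetime.date.today().isoformat(),
                }
                # Upsert keyed on council_reference; the real spider also emits
                # Scrapy items (see item_scraped_count in the stats above).
                scraperwiki.sqlite.save(unique_keys=["council_reference"], data=record)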