planningalerts-scrapers / cairns

Cairns Regional Council Development Applications

Scrapes eservices.cairns.qld.gov.au


Development applications for PlanningAlerts.

Contributors: otherchirps

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 2 is python-2.7.14 (you are using python-2.7.6, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-2.7.14).
 !     Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-2.7.6
-----> Installing pip
-----> Installing requirements with pip
Collecting lxml==3.4.4 (from -r /tmp/build/requirements.txt (line 1))
  /app/.heroku/python/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
    SNIMissingWarning
  /app/.heroku/python/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
    InsecurePlatformWarning
  Downloading lxml-3.4.4.tar.gz (3.5MB)
Collecting scraperwiki==0.4.1 (from -r /tmp/build/requirements.txt (line 2))
  Downloading scraperwiki-0.4.1.tar.gz
Collecting Scrapy==0.24.6 (from -r /tmp/build/requirements.txt (line 3))
  Downloading Scrapy-0.24.6-py2-none-any.whl (444kB)
Collecting service-identity==14.0.0 (from -r /tmp/build/requirements.txt (line 4))
  Downloading service_identity-14.0.0-py2.py3-none-any.whl
Collecting requests (from scraperwiki==0.4.1->-r /tmp/build/requirements.txt (line 2))
  Downloading requests-2.18.4-py2.py3-none-any.whl (88kB)
Collecting sqlalchemy (from scraperwiki==0.4.1->-r /tmp/build/requirements.txt (line 2))
  Downloading SQLAlchemy-1.2.1.tar.gz (5.5MB)
Collecting alembic (from scraperwiki==0.4.1->-r /tmp/build/requirements.txt (line 2))
  Downloading alembic-0.9.7.tar.gz (1.0MB)
Collecting pyOpenSSL (from Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading pyOpenSSL-17.5.0-py2.py3-none-any.whl (53kB)
Collecting w3lib>=1.8.0 (from Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading w3lib-1.18.0-py2.py3-none-any.whl
Collecting Twisted>=10.0.0 (from Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading Twisted-17.9.0.tar.bz2 (3.0MB)
Collecting queuelib (from Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading queuelib-1.4.2-py2.py3-none-any.whl
Collecting six>=1.5.2 (from Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading six-1.11.0-py2.py3-none-any.whl
Collecting cssselect>=0.9 (from Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading cssselect-1.0.3-py2.py3-none-any.whl
Collecting pyasn1-modules (from service-identity==14.0.0->-r /tmp/build/requirements.txt (line 4))
  Downloading pyasn1_modules-0.2.1-py2.py3-none-any.whl (60kB)
Collecting characteristic>=14.0.0 (from service-identity==14.0.0->-r /tmp/build/requirements.txt (line 4))
  Downloading characteristic-14.3.0-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity==14.0.0->-r /tmp/build/requirements.txt (line 4))
  Downloading pyasn1-0.4.2-py2.py3-none-any.whl (71kB)
Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki==0.4.1->-r /tmp/build/requirements.txt (line 2))
  Downloading chardet-3.0.4-py2.py3-none-any.whl (133kB)
Collecting certifi>=2017.4.17 (from requests->scraperwiki==0.4.1->-r /tmp/build/requirements.txt (line 2))
  Downloading certifi-2018.1.18-py2.py3-none-any.whl (151kB)
Collecting urllib3<1.23,>=1.21.1 (from requests->scraperwiki==0.4.1->-r /tmp/build/requirements.txt (line 2))
  Downloading urllib3-1.22-py2.py3-none-any.whl (132kB)
Collecting idna<2.7,>=2.5 (from requests->scraperwiki==0.4.1->-r /tmp/build/requirements.txt (line 2))
  Downloading idna-2.6-py2.py3-none-any.whl (56kB)
Collecting Mako (from alembic->scraperwiki==0.4.1->-r /tmp/build/requirements.txt (line 2))
  Downloading Mako-1.0.7.tar.gz (564kB)
Collecting python-editor>=0.3 (from alembic->scraperwiki==0.4.1->-r /tmp/build/requirements.txt (line 2))
  Downloading python-editor-1.0.3.tar.gz
Collecting python-dateutil (from alembic->scraperwiki==0.4.1->-r /tmp/build/requirements.txt (line 2))
  Downloading python_dateutil-2.6.1-py2.py3-none-any.whl (194kB)
Collecting cryptography>=2.1.4 (from pyOpenSSL->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading cryptography-2.1.4-cp27-cp27m-manylinux1_x86_64.whl (2.2MB)
Collecting zope.interface>=3.6.0 (from Twisted>=10.0.0->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading zope.interface-4.4.3-cp27-cp27m-manylinux1_x86_64.whl (170kB)
Collecting constantly>=15.1 (from Twisted>=10.0.0->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading constantly-15.1.0-py2.py3-none-any.whl
Collecting incremental>=16.10.1 (from Twisted>=10.0.0->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading incremental-17.5.0-py2.py3-none-any.whl
Collecting Automat>=0.3.0 (from Twisted>=10.0.0->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading Automat-0.6.0-py2.py3-none-any.whl
Collecting hyperlink>=17.1.1 (from Twisted>=10.0.0->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading hyperlink-17.3.1-py2.py3-none-any.whl (73kB)
Collecting MarkupSafe>=0.9.2 (from Mako->alembic->scraperwiki==0.4.1->-r /tmp/build/requirements.txt (line 2))
  Downloading MarkupSafe-1.0.tar.gz
Collecting cffi>=1.7; platform_python_implementation != "PyPy" (from cryptography>=2.1.4->pyOpenSSL->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading cffi-1.11.4-cp27-cp27m-manylinux1_x86_64.whl (407kB)
Collecting enum34; python_version < "3" (from cryptography>=2.1.4->pyOpenSSL->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading enum34-1.1.6-py2-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=2.1.4->pyOpenSSL->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading ipaddress-1.0.19.tar.gz
Collecting asn1crypto>=0.21.0 (from cryptography>=2.1.4->pyOpenSSL->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading asn1crypto-0.24.0-py2.py3-none-any.whl (101kB)
Collecting attrs (from Automat>=0.3.0->Twisted>=10.0.0->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading attrs-17.4.0-py2.py3-none-any.whl
Collecting pycparser (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=2.1.4->pyOpenSSL->Scrapy==0.24.6->-r /tmp/build/requirements.txt (line 3))
  Downloading pycparser-2.18.tar.gz (245kB)
Installing collected packages: lxml, chardet, certifi, urllib3, idna, requests, sqlalchemy, MarkupSafe, Mako, python-editor, six, python-dateutil, alembic, scraperwiki, pycparser, cffi, enum34, ipaddress, asn1crypto, cryptography, pyOpenSSL, w3lib, zope.interface, constantly, incremental, attrs, Automat, hyperlink, Twisted, queuelib, cssselect, Scrapy, pyasn1, pyasn1-modules, characteristic, service-identity
  Running setup.py install for lxml: started
  Running setup.py install for lxml: still running...
  Running setup.py install for lxml: finished with status 'done'
  Running setup.py install for sqlalchemy: started
  Running setup.py install for sqlalchemy: finished with status 'done'
  Running setup.py install for MarkupSafe: started
  Running setup.py install for MarkupSafe: finished with status 'done'
  Running setup.py install for Mako: started
  Running setup.py install for Mako: finished with status 'done'
  Running setup.py install for python-editor: started
  Running setup.py install for python-editor: finished with status 'done'
  Running setup.py install for alembic: started
  Running setup.py install for alembic: finished with status 'done'
  Running setup.py install for scraperwiki: started
  Running setup.py install for scraperwiki: finished with status 'done'
  Running setup.py install for pycparser: started
  Running setup.py install for pycparser: finished with status 'done'
  Running setup.py install for ipaddress: started
  Running setup.py install for ipaddress: finished with status 'done'
  Running setup.py install for Twisted: started
  Running setup.py install for Twisted: finished with status 'done'
Successfully installed Automat-0.6.0 Mako-1.0.7 MarkupSafe-1.0 Scrapy-0.24.6 Twisted-17.9.0 alembic-0.9.7 asn1crypto-0.24.0 attrs-17.4.0 certifi-2018.1.18 cffi-1.11.4 characteristic-14.3.0 chardet-3.0.4 constantly-15.1.0 cryptography-2.1.4 cssselect-1.0.3 enum34-1.1.6 hyperlink-17.3.1 idna-2.6 incremental-17.5.0 ipaddress-1.0.19 lxml-3.4.4 pyOpenSSL-17.5.0 pyasn1-0.4.2 pyasn1-modules-0.2.1 pycparser-2.18 python-dateutil-2.6.1 python-editor-1.0.3 queuelib-1.4.2 requests-2.18.4 scraperwiki-0.4.1 service-identity-14.0.0 six-1.11.0 sqlalchemy-1.2.1 urllib3-1.22 w3lib-1.18.0 zope.interface-4.4.3
 !     Hello! It looks like your application is using an outdated version of Python.
 !     This caused the security warning you saw above during the 'pip install' step.
 !     We recommend 'python-3.6.2', which you can specify in a 'runtime.txt' file.
 !     -- Much Love, Heroku.
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
2018-01-20 23:38:14+0000 [planningalerts] INFO: Searching for requests between 2018-01-01 and 2018-01-31
2018-01-20 23:38:18+0000 [planningalerts] INFO: Skipping previously saved application: 8339/2018
2018-01-20 23:38:18+0000 [planningalerts] INFO: Skipping previously saved application: 8352/2018
2018-01-20 23:38:18+0000 [planningalerts] INFO: Skipping previously saved application: 8353/2018
2018-01-20 23:38:18+0000 [planningalerts] INFO: Skipping previously saved application: 8348/2018
2018-01-20 23:38:19+0000 [planningalerts] INFO: Skipping previously saved application: 8343/2018
2018-01-20 23:38:19+0000 [planningalerts] INFO: Skipping previously saved application: 8340/2018
2018-01-20 23:38:19+0000 [planningalerts] INFO: Skipping previously saved application: 8345/2018
2018-01-20 23:38:19+0000 [planningalerts] INFO: Skipping previously saved application: 8338/2018
2018-01-20 23:38:20+0000 [planningalerts] INFO: Skipping previously saved application: 8342/2018
2018-01-20 23:38:20+0000 [planningalerts] INFO: Skipping previously saved application: 8344/2018
2018-01-20 23:38:20+0000 [planningalerts] INFO: Skipping previously saved application: 8347/2018
2018-01-20 23:38:20+0000 [planningalerts] INFO: Skipping previously saved application: 8351/2018
2018-01-20 23:38:20+0000 [planningalerts] INFO: Skipping previously saved application: 8349/2018
2018-01-20 23:38:21+0000 [planningalerts] INFO: Skipping previously saved application: 8341/2018
2018-01-20 23:38:21+0000 [planningalerts] INFO: Skipping previously saved application: 8350/2018
2018-01-20 23:38:21+0000 [planningalerts] INFO: Skipping previously saved application: 8346/2018
2018-01-20 23:38:21+0000 [planningalerts] INFO: Closing spider (finished)
2018-01-20 23:38:21+0000 [planningalerts] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 10317,
 'downloader/request_count': 18,
 'downloader/request_method_count/GET': 18,
 'downloader/response_bytes': 106089,
 'downloader/response_count': 18,
 'downloader/response_status_count/200': 18,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2018, 1, 20, 23, 38, 21, 431969),
 'item_scraped_count': 16,
 'memdebug/gc_garbage_count': 0,
 'memdebug/live_refs/PlanningalertsSpider': 1,
 'memusage/max': 61243392,
 'memusage/startup': 61243392,
 'request_depth_max': 2,
 'response_received_count': 18,
 'scheduler/dequeued': 18,
 'scheduler/dequeued/memory': 18,
 'scheduler/enqueued': 18,
 'scheduler/enqueued/memory': 18,
 'start_time': datetime.datetime(2018, 1, 20, 23, 38, 10, 636624)}
2018-01-20 23:38:21+0000 [planningalerts] INFO: Spider closed (finished)
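The four pinned direct dependencies installed above can be read back from the `Collecting ... (from -r /tmp/build/requirements.txt (line N))` entries in the build log, which implies a requirements.txt of:

```
lxml==3.4.4
scraperwiki==0.4.1
Scrapy==0.24.6
service-identity==14.0.0
```

Everything else in the log (requests, sqlalchemy, Twisted, and so on) is pulled in transitively by these four.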

Data

Downloaded 2848 times by slav123, openaustralia, LoveMyData, TheZepto, MikeRalphson, peteclowes



Showing 10 of 1599 rows

Columns: address, description, comment_url, date_received, council_reference, date_scraped, external_reference, info_url

Row 1
  address: 318-324 Mulgrave Road WESTCOURT QLD 4870
  description: CP-Indoor Sport & Entertainment
  comment_url: mailto:townplanner@cairns.qld.gov.au?subject=Development%20Application%20Enquiry%3A%206389/2015
  date_received: 2015-06-02
  council_reference: 6389/2015
  date_scraped: 2015-06-19
  external_reference: 8/7/3282 SEDA

Row 2
  address: 13-15 Peakviews Close REDLYNCH QLD 4870
  description: Material Change of Use (Code) House Extension (Shed)
  comment_url: mailto:townplanner@cairns.qld.gov.au?subject=Development%20Application%20Enquiry%3A%206402/2015
  date_received: 2015-06-11
  council_reference: 6402/2015
  date_scraped: 2015-06-19
  external_reference: 8/7/3290 SEDA

Row 3
  address: 7 Mendi Close TRINITY BEACH QLD 4879
  description: Reconfiguring a Lot 1 Lot into 2 Lots
  comment_url: mailto:townplanner@cairns.qld.gov.au?subject=Development%20Application%20Enquiry%3A%206409/2015
  date_received: 2015-06-12
  council_reference: 6409/2015
  date_scraped: 2015-06-19
  external_reference: 8/13/1901 SEDA

Row 4
  address: 315 Pease Street EDGE HILL QLD 4870
  description: Reconfiguring a Lot and Material Change of Use Reconfiguring a Lot, -Dual Occupancy and Business Facilities
  comment_url: mailto:townplanner@cairns.qld.gov.au?subject=Development%20Application%20Enquiry%3A%206399/2015
  date_received: 2015-06-10
  council_reference: 6399/2015
  date_scraped: 2015-06-19
  external_reference: 8/30/198 SEDA

Row 5
  address: 35-37 Montalbion Avenue TRINITY PARK QLD 4879
  description: Preliminary Approval for Building Work Assessable Against the Scheme Building Setbacks
  comment_url: mailto:townplanner@cairns.qld.gov.au?subject=Development%20Application%20Enquiry%3A%206407/2015
  date_received: 2015-06-12
  council_reference: 6407/2015
  date_scraped: 2015-06-19
  external_reference: 8/7/3292

Row 6
  address: 900L Petersen Road EDMONTON QLD 4869
  description: Operational Work Operational Works Mountainview Estate Stage 11 Bulk Earthworks
  comment_url: mailto:townplanner@cairns.qld.gov.au?subject=Development%20Application%20Enquiry%3A%206401/2015
  date_received: 2015-06-12
  council_reference: 6401/2015
  date_scraped: 2015-06-19
  external_reference: 8/10/344 SEDA

Row 7
  address: 700L Bel-Air Drive WHITFIELD QLD 4870
  description: Operational Work Operational Works Green Arrow public walking track construction Mt Whitfield
  comment_url: mailto:townplanner@cairns.qld.gov.au?subject=Development%20Application%20Enquiry%3A%206400/2015
  date_received: 2015-06-11
  council_reference: 6400/2015
  date_scraped: 2015-06-19
  external_reference: 8/10/342 SEDA

Row 8
  address: 586 Mulgrave Road WOREE QLD 4868; 586 Mulgrave Road EARLVILLE QLD 4870
  description: CP-Multi-Unit Housing (3 - 5 Units)
  comment_url: mailto:townplanner@cairns.qld.gov.au?subject=Development%20Application%20Enquiry%3A%206394/2015
  date_received: 2015-06-04
  council_reference: 6394/2015
  date_scraped: 2015-06-19
  external_reference: 8/7/3284

Row 9
  address: 4L Mac Peak Crescent SMITHFIELD QLD 4878
  description: Material Change of Use (Code Assessment) Industry Class A (Self Storage Shed)
  comment_url: mailto:townplanner@cairns.qld.gov.au?subject=Development%20Application%20Enquiry%3A%206388/2015
  date_received: 2015-06-01
  council_reference: 6388/2015
  date_scraped: 2015-06-19
  external_reference: 8/7/3280

Row 10
  address: 37-51 Lyons Street PORTSMITH QLD 4870
  description: Material Change of Use Environmentally Relevant Activity
  comment_url: mailto:townplanner@cairns.qld.gov.au?subject=Development%20Application%20Enquiry%3A%206387/2015
  date_received: 2015-06-01
  council_reference: 6387/2015
  date_scraped: 2015-06-19
  external_reference: 8/7/3281

(No info_url values appear in this excerpt.)
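The downloadable SQLite database holds one row per application with the columns shown above. A minimal sketch of querying such a table locally, using an in-memory stand-in populated with two rows from the excerpt (the table name `data` is the usual scraperwiki/morph.io convention, an assumption here, not something this page states):

```python
import sqlite3

# Build an in-memory stand-in for the scraper's SQLite database.
# The table name "data" and these column names mirror the table above;
# the real downloaded file may differ.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE data (
        council_reference TEXT PRIMARY KEY,
        address TEXT,
        description TEXT,
        date_received TEXT
    )"""
)
conn.executemany(
    "INSERT INTO data VALUES (?, ?, ?, ?)",
    [
        ("6389/2015", "318-324 Mulgrave Road WESTCOURT QLD 4870",
         "CP-Indoor Sport & Entertainment", "2015-06-02"),
        ("6402/2015", "13-15 Peakviews Close REDLYNCH QLD 4870",
         "Material Change of Use (Code) House Extension (Shed)", "2015-06-11"),
    ],
)

# Most recent applications first; ISO dates sort correctly as text.
rows = conn.execute(
    "SELECT council_reference, address FROM data ORDER BY date_received DESC"
).fetchall()
print(rows[0][0])  # → 6402/2015
```

The same query works against a downloaded copy by replacing `":memory:"` with the path to the SQLite file.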

Statistics

Average successful run time: 3 minutes

Total run time: 18 days

Total CPU time used: 44 minutes

Total disk space used: 658 KB

History

  • Auto ran revision 1db3f193 and completed successfully.
    Nothing changed in the database; 18 pages scraped.
  • Auto ran revision 1db3f193 and completed successfully.
    Nothing changed in the database.
  • Auto ran revision 1db3f193 and completed successfully.
    9 records added to the database.
  • Auto ran revision 1db3f193 and failed.
    Nothing changed in the database.
  • Auto ran revision 1db3f193 and failed.
    Nothing changed in the database.
  • ...
  • Created on morph.io


Scraper code

Python

cairns / scraper.py
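The scraper source itself is not reproduced on this page. From the run log above we only know it is a Scrapy spider named "planningalerts" that searches a date window and skips applications already saved. The dedup step behind those "Skipping previously saved application" lines might look roughly like this sketch; the function names and schema here are illustrative assumptions, not the contents of scraper.py:

```python
import sqlite3


def already_saved(conn, council_reference):
    """Return True if this application is already in the data table."""
    row = conn.execute(
        "SELECT 1 FROM data WHERE council_reference = ?",
        (council_reference,),
    ).fetchone()
    return row is not None


def save_application(conn, app):
    """Insert an application unless it was saved by an earlier run.

    Mirrors the "Skipping previously saved application: NNNN/YYYY"
    log lines above. Returns True if a row was written.
    """
    if already_saved(conn, app["council_reference"]):
        print("Skipping previously saved application: %s"
              % app["council_reference"])
        return False
    conn.execute(
        "INSERT INTO data (council_reference, address) VALUES (?, ?)",
        (app["council_reference"], app["address"]),
    )
    return True


conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE data (council_reference TEXT PRIMARY KEY, address TEXT)"
)
app = {"council_reference": "8339/2018", "address": "example address"}
save_application(conn, app)  # first run: saved, returns True
save_application(conn, app)  # second run: logs a skip, returns False
```

This idempotent insert is what lets the scraper re-run the same month's search window daily and report "nothing changed in the database" when no new applications appear.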