This is a scraper that runs on morph.io. To get started, see the documentation.

Contributors: krynens

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Installing python-3.6.2
-----> Installing pip
-----> Installing requirements with pip
Collecting alembic==1.5.5
  Downloading alembic-1.5.5-py2.py3-none-any.whl (156 kB)
Collecting beautifulsoup4==4.9.3
  Downloading beautifulsoup4-4.9.3-py3-none-any.whl (115 kB)
Collecting bs4==0.0.1
  Downloading bs4-0.0.1.tar.gz (1.1 kB)
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
Collecting certifi==2020.12.5
  Downloading certifi-2020.12.5-py2.py3-none-any.whl (147 kB)
Collecting chardet==4.0.0
  Downloading chardet-4.0.0-py2.py3-none-any.whl (178 kB)
Collecting idna==2.10
  Downloading idna-2.10-py2.py3-none-any.whl (58 kB)
Collecting lxml==4.6.2
  Downloading lxml-4.6.2-cp36-cp36m-manylinux1_x86_64.whl (5.5 MB)
Collecting Mako==1.1.4
  Downloading Mako-1.1.4-py2.py3-none-any.whl (75 kB)
Collecting MarkupSafe==1.1.1
  Downloading MarkupSafe-1.1.1-cp36-cp36m-manylinux2010_x86_64.whl (32 kB)
Collecting python-dateutil==2.8.1
  Downloading python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting python-editor==1.0.4
  Downloading python_editor-1.0.4-py3-none-any.whl (4.9 kB)
Collecting requests==2.25.1
  Downloading requests-2.25.1-py2.py3-none-any.whl (61 kB)
Collecting scraperwiki==0.5.1
  Downloading scraperwiki-0.5.1.tar.gz (7.7 kB)
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
Collecting six==1.15.0
  Downloading six-1.15.0-py2.py3-none-any.whl (10 kB)
Collecting soupsieve==2.2
  Downloading soupsieve-2.2-py3-none-any.whl (33 kB)
Collecting SQLAlchemy==1.3.23
  Downloading SQLAlchemy-1.3.23-cp36-cp36m-manylinux2010_x86_64.whl (1.3 MB)
Collecting urllib3==1.26.3
  Downloading urllib3-1.26.3-py2.py3-none-any.whl (137 kB)
Building wheels for collected packages: bs4, scraperwiki
  Building wheel for bs4 (setup.py): started
  Building wheel for bs4 (setup.py): finished with status 'done'
  Created wheel for bs4: filename=bs4-0.0.1-py3-none-any.whl size=1272 sha256=de558931e24d49730f7529ffecaee61866a39a366838d2bf453599844be0598b
  Stored in directory: /tmp/pip-ephem-wheel-cache-oxv3pj6a/wheels/19/f5/6d/a97dd4f22376d4472d5f4c76c7646876052ff3166b3cf71050
  Building wheel for scraperwiki (setup.py): started
  Building wheel for scraperwiki (setup.py): finished with status 'done'
  Created wheel for scraperwiki: filename=scraperwiki-0.5.1-py3-none-any.whl size=6545 sha256=03e27bd14244aaa98fbbe182d4723f7a7b9eecf9b7ee05b20d3276da439b7ea4
  Stored in directory: /tmp/pip-ephem-wheel-cache-oxv3pj6a/wheels/cd/f8/ac/cd66eb1c557ab40d35c1ed852da3e9b37baa3e21b61906a5cf
Successfully built bs4 scraperwiki
Installing collected packages: six, MarkupSafe, urllib3, SQLAlchemy, soupsieve, python-editor, python-dateutil, Mako, idna, chardet, certifi, requests, beautifulsoup4, alembic, scraperwiki, lxml, bs4
Successfully installed Mako-1.1.4 MarkupSafe-1.1.1 SQLAlchemy-1.3.23 alembic-1.5.5 beautifulsoup4-4.9.3 bs4-0.0.1 certifi-2020.12.5 chardet-4.0.0 idna-2.10 lxml-4.6.2 python-dateutil-2.8.1 python-editor-1.0.4 requests-2.25.1 scraperwiki-0.5.1 six-1.15.0 soupsieve-2.2 urllib3-1.26.3
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
Getting page 1
Getting page 2
Getting page 3
Getting page 4
Getting page 5
Getting page 6
Scraper finished.
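The "Procfile declares types -> scraper" line means the buildpack found a process type named scraper to run after installing the pinned requirements. The file itself is not shown in this dump; as an assumption, a morph.io Python scraper's Procfile is typically the single line:

    scraper: python scraper.py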

Data

Downloaded 0 times


The data can be downloaded as a CSV table or as the full SQLite database (252 KB), or queried through the morph.io API.
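For programmatic access, the morph.io API takes an SQL query and returns the results as JSON or CSV. A minimal sketch using requests, assuming a hypothetical API key and the scraper slug krynens/melbourne_city_scraper (the owner is inferred from the contributors list; morph.io scrapers write to a table named "data" by default):

    import requests

    # Pull the most recent rows from this scraper via the morph.io API.
    MORPH_API_KEY = "YOUR_MORPH_API_KEY"             # personal key from your morph.io account
    SCRAPER_SLUG = "krynens/melbourne_city_scraper"  # assumed owner/repo slug

    response = requests.get(
        "https://api.morph.io/{}/data.json".format(SCRAPER_SLUG),
        params={
            "key": MORPH_API_KEY,
            "query": "select * from data order by date_scraped desc limit 10",
        },
        timeout=30,
    )
    response.raise_for_status()

    for row in response.json():
        print(row["council_reference"], row["address"], row["date_received"])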

Showing 10 of 382 rows

address | date_received | date_scraped | description | council_reference | info_url
State Netball Hockey Centre 10 Brens Drive Parkville 3052 | 2021-01-13 | 2021-03-11 | Buildings and works to an existing telecommunications facility | TP-2021-13 |
Rear 15 Union Street North Melbourne 3051 | 2020-12-04 | 2021-03-11 | Use of the land as a Restricted Retail Premises (bicycle sales), construction and display of business identification signage and waiver of the car parking requirement | TP-2020-774 |
403-405 Lygon Street Carlton 3053 | 2020-11-13 | 2021-03-18 | Partial demolition, external alterations, and buildings and works to the existing dwelling | TP-2020-731 |
296-304 Macaulay Road North Melbourne 3051 | 2020-09-22 | 2021-03-18 | Use of Unit 2 and 5 for the purpose of Education Centres and reduction of the statutory car parking and bicycle facilities requirements | TP-2020-622 |
18 Wolseley Parade Kensington 3031 | 2020-08-05 | 2021-03-18 | Partial demolition and the construction of buildings and works to construct a double storey addition to the existing dwelling and rear garage | TP-2020-527 |
137-157 Adderley Street West Melbourne 3003 | 2020-06-25 | 2021-03-18 | Amend the permit preamble to include a supermarket and retail tenancy use and some changes to the permit conditions, also amend the plans in accordance with the statement of changes prepared by Buchan | TP-2017-395/A |
258 Lygon Street Carlton 3053 | 2021-01-19 | 2021-03-22 | Sale and consumption of liquor (restaurant and cafe licence) | TP-2021-20 |
1-3 Rankins Lane Melbourne 3000 | 2021-02-15 | 2021-03-24 | Change of use to Yoga and Pilates studio (restricted recreation facility) | TP-2021-80 |
140-142 Jolimont Road East Melbourne 3002 | 2020-11-13 | 2021-03-24 | Partial demolition; alterations; buildings and works including the construction of a multi storey addition; reduction in the car parking requirement (Food and Drink Premises); exceedance of the maximu | TP-2020-736 |
226 Clarendon Street East Melbourne 3002 | 2020-10-22 | 2021-03-24 | Partial demolition, alterations and additions, and the construction and display of signage in a Heritage Overlay; together with a reduction in the car parking and bicycle facilities required associate | TP-2020-688 |

Statistics

Average successful run time: 1 minute

Total run time: about 10 hours

Total cpu time used: about 2 hours

Total disk space used: 273 KB

History

  • Auto ran revision f5ceac7f and completed successfully.
    39 records added, 39 records removed in the database
  • Auto ran revision f5ceac7f and completed successfully.
    43 records added, 42 records removed in the database
  • Auto ran revision f5ceac7f and completed successfully.
    42 records added, 38 records removed in the database
  • Auto ran revision f5ceac7f and completed successfully.
    41 records added, 41 records removed in the database
  • Auto ran revision f5ceac7f and completed successfully.
    41 records added, 41 records removed in the database
  • ...
  • Created on morph.io


Scraper code

Python

melbourne_city_scraper / scraper.py
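The scraper source itself is not reproduced in this page dump. As a rough, hypothetical sketch only (assuming a paginated HTML planning register, since the run log shows pages 1-6, and using a placeholder URL and CSS selectors rather than the real ones), a morph.io scraper producing the table above could look like this:

    # Illustrative sketch only -- not the real scraper.py.
    # Assumptions: a paginated HTML register; placeholder URL, selectors, and field parsing.
    import datetime

    import requests
    import scraperwiki
    from bs4 import BeautifulSoup

    BASE_URL = "https://example.com/planning-register"  # placeholder, not the real source


    def scrape_page(page_number):
        print("Getting page {}".format(page_number))
        response = requests.get(BASE_URL, params={"page": page_number})
        response.raise_for_status()
        soup = BeautifulSoup(response.text, "lxml")

        records = []
        for row in soup.select("table.applications tr.application"):  # placeholder selector
            records.append({
                "council_reference": row.select_one(".reference").get_text(strip=True),
                "address": row.select_one(".address").get_text(strip=True),
                "description": row.select_one(".description").get_text(strip=True),
                "info_url": row.select_one("a")["href"],
                "date_received": row.select_one(".received").get_text(strip=True),
                "date_scraped": datetime.date.today().isoformat(),
            })
        return records


    def main():
        for page in range(1, 7):  # the last run fetched pages 1-6
            for record in scrape_page(page):
                # Upsert into the scraper's SQLite database, keyed on the
                # council reference (assumed to be unique per application).
                scraperwiki.sqlite.save(unique_keys=["council_reference"], data=record)
        print("Scraper finished.")


    if __name__ == "__main__":
        main()

scraperwiki.sqlite.save upserts each record into the scraper's data table, which is what morph.io serves through the CSV, SQLite, and API downloads above.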