nickoneill / us-governors

US Governors Contact Info

Scrapes www.nga.org



This is a scraper that runs on morph.io.

You can also run it locally with python scraper.py after installing the dependencies with pip install -r requirements.txt.
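
The full requirements.txt is not reproduced on this page, but the console output below references its dependency lines. Based on that log, the file contains at least the following entries (anything before line 6 of the file is not visible and is omitted here):

    # Reconstructed from the build log below; the actual requirements.txt may differ.
    -e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
    beautifulsoup4
    html5lib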

Contributors: jlev

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
 !     The latest version of Python 2 is python-2.7.14 (you are using python-2.7.9, which is unsupported).
 !     We recommend upgrading by specifying the latest version (python-2.7.14).
       Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Installing python-2.7.9
-----> Installing pip
-----> Installing requirements with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r /tmp/build/requirements.txt (line 6))
       Cloning http://github.com/openaustralia/scraperwiki-python.git (to revision morph_defaults) to /app/.heroku/src/scraperwiki
       Switched to a new branch 'morph_defaults'
       Branch morph_defaults set up to track remote branch morph_defaults from origin.
       Collecting beautifulsoup4 (from -r /tmp/build/requirements.txt (line 7))
       Downloading https://files.pythonhosted.org/packages/8b/0e/048a2f88bc4be5e3697df9dc1f7b9d5c9c75be62676feeeb91d2e896c5ea/beautifulsoup4-4.7.1-py2-none-any.whl (94kB)
       Collecting html5lib (from -r /tmp/build/requirements.txt (line 8))
       Downloading https://files.pythonhosted.org/packages/a5/62/bbd2be0e7943ec8504b517e62bab011b4946e1258842bc159e5dfde15b96/html5lib-1.0.1-py2.py3-none-any.whl (117kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
       Downloading https://files.pythonhosted.org/packages/15/27/3330a343de80d6849545b6c7723f8c9a08b4b104de964ac366e7e6b318df/dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r /tmp/build/requirements.txt (line 6))
       Downloading https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl (57kB)
       Collecting soupsieve>=1.2 (from beautifulsoup4->-r /tmp/build/requirements.txt (line 7))
       Downloading https://files.pythonhosted.org/packages/ef/06/53edcae4edea76b38a325980dd35aed3b39f9bd0ef27b9d33f2e6dc4c7f6/soupsieve-1.6.2-py2.py3-none-any.whl
       Collecting six>=1.9 (from html5lib->-r /tmp/build/requirements.txt (line 8))
       Downloading https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
       Collecting webencodings (from html5lib->-r /tmp/build/requirements.txt (line 8))
       Downloading https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl
       Collecting urllib3<1.25,>=1.21.1 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
       Downloading https://files.pythonhosted.org/packages/62/00/ee1d7de624db8ba7090d1226aebefab96a2c71cd5cfa7629d6ad3f61b79e/urllib3-1.24.1-py2.py3-none-any.whl (118kB)
       Collecting certifi>=2017.4.17 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
       Downloading https://files.pythonhosted.org/packages/9f/e0/accfc1b56b57e9750eba272e24c4dddeac86852c2bebd1236674d7887e8a/certifi-2018.11.29-py2.py3-none-any.whl (154kB)
       Collecting idna<2.9,>=2.5 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
       Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)
       Collecting chardet<3.1.0,>=3.0.2 (from requests->scraperwiki->-r /tmp/build/requirements.txt (line 6))
       Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
       Collecting backports.functools-lru-cache; python_version < "3" (from soupsieve>=1.2->beautifulsoup4->-r /tmp/build/requirements.txt (line 7))
       Downloading https://files.pythonhosted.org/packages/03/8e/2424c0e65c4a066e28f539364deee49b6451f8fcd4f718fefa50cc3dcf48/backports.functools_lru_cache-1.5-py2.py3-none-any.whl
       Installing collected packages: dumptruck, urllib3, certifi, idna, chardet, requests, scraperwiki, backports.functools-lru-cache, soupsieve, beautifulsoup4, six, webencodings, html5lib
       Running setup.py install for dumptruck: started
       Running setup.py install for dumptruck: finished with status 'done'
       Running setup.py develop for scraperwiki
       Successfully installed backports.functools-lru-cache-1.5 beautifulsoup4-4.7.1 certifi-2018.11.29 chardet-3.0.4 dumptruck-0.1.6 html5lib-1.0.1 idna-2.8 requests-2.21.0 scraperwiki six-1.12.0 soupsieve-1.6.2 urllib3-1.24.1 webencodings-0.5.1
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
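
The run log shows the scraper's stack: scraperwiki (the openaustralia morph_defaults branch), beautifulsoup4, html5lib and requests. Purely as an illustrative sketch of how a morph.io scraper on that stack typically works (this is not the actual scraper.py), the snippet below fetches a page, parses it with BeautifulSoup, and saves rows in the shape of the data table further down; the URL and CSS selectors are hypothetical placeholders, not the real nga.org markup.

    # Illustrative sketch only -- not the actual scraper.py from this repository.
    import requests
    import scraperwiki
    from bs4 import BeautifulSoup

    # Hypothetical entry page; the real scraper's URL(s) may differ.
    LISTING_URL = "https://www.nga.org/governors/addresses/"

    html = requests.get(LISTING_URL).text
    soup = BeautifulSoup(html, "html5lib")

    records = []
    for block in soup.select(".governor-address"):  # hypothetical selector
        records.append({
            "state_name": block.select_one(".state").get_text(strip=True),
            "first_name": block.select_one(".first-name").get_text(strip=True),
            "last_name": block.select_one(".last-name").get_text(strip=True),
            "address1": block.select_one(".address1").get_text(strip=True),
            "city": block.select_one(".city").get_text(strip=True),
            "phone": block.select_one(".phone").get_text(strip=True),
            # ...plus address2, zip, state_abbr, fax and url, as in the table below
        })

    # Under the morph_defaults branch, scraperwiki.sqlite.save writes to
    # data.sqlite in a table named "data", which morph.io then publishes.
    scraperwiki.sqlite.save(unique_keys=["state_name"], data=records)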

Data


The data can be downloaded as a CSV table or as an SQLite database (11 KB), or queried through the morph.io API.
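
For programmatic access, morph.io exposes the data as SQL over HTTPS. A minimal sketch, assuming the standard api.morph.io endpoint pattern and your own API key:

    import requests

    # morph.io API endpoint for this scraper; replace the key placeholder
    # with your own morph.io API key.
    API_URL = "https://api.morph.io/nickoneill/us-governors/data.json"
    params = {
        "key": "YOUR_MORPH_API_KEY",
        "query": "select state_name, first_name, last_name, phone from data",
    }

    for row in requests.get(API_URL, params=params).json():
        print(row["state_name"], row["first_name"], row["last_name"], row["phone"])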

Showing 10 of 55 rows

state_name     | first_name    | address2                | address1                   | phone            | url | city        | last_name  | zip        | state_abbr | fax
Alabama        | Kay           | 600 Dexter Avenue       | State Capitol              | 334-242-7100     |     | Montgomery  | Ivey       | 36130-2751 | AL         | 334-353-0004
Alaska         | Mike          | P.O. Box 110001         | State Capitol              | 907-465-3500     |     | Juneau      | Dunleavy   | 99811-0001 | AK         | 907-465-3532
American Samoa | Lolo Matalasi | Third Floor             | Executive Office Building  | 011-684-633-4116 |     | Pago Pago   | Moliga     | 96799      | AS         | 011-684-633-2269
Arizona        | Doug          | 1700 West Washington    | State Capitol              | 602-542-4331     |     | Phoenix     | Ducey      | 85007      | AZ         | 602-542-7601
Arkansas       | Asa           | Room 250                | State Capitol              | 501-682-2345     |     | Little Rock | Hutchinson | 72201      | AR         | 501-682-1382
California     | Gavin         | Suite 1173              | State Capitol              | 916-445-2841     |     | Sacramento  | Newsom     | 95814      | CA         | 916-558-3160
Colorado       | Jared         |                         | 136 State Capitol          | 303-866-2471     |     | Denver      | Polis      | 80203-1792 | CO         | 303-866-2003
Connecticut    | Ned           |                         | 210 Capitol Avenue         | 800-406-1527     |     | Hartford    | Lamont     | 06106      | CT         | 860-524-7395
Delaware       | John          |                         | Legislative Hall           | 302-744-4101     |     | Dover       | Carney     | 19901      | DE         | 302-739-2775
Florida        | Ron           | 400 South Monroe Street | PL 05 The Capitol          | 850-488-7146     |     | Tallahassee | DeSantis   | 32399-0001 | FL         | 850-487-0801
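
The downloaded SQLite database can also be queried locally. A minimal sketch, assuming the default morph.io file name data.sqlite and the default table name data:

    import sqlite3

    # Open the database downloaded from morph.io (default file name data.sqlite)
    conn = sqlite3.connect("data.sqlite")
    conn.row_factory = sqlite3.Row

    # scraperwiki saves rows into a table named "data" by default on morph.io
    query = "select state_name, last_name, phone, fax from data order by state_name"
    for row in conn.execute(query):
        print(row["state_name"], row["last_name"], row["phone"], row["fax"])

    conn.close()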

Statistics

Average successful run time: less than a minute

Total run time: less than a minute

Total cpu time used: less than 5 seconds

Total disk space used: 92.5 KB

History

  • Manually ran revision 9f055cac and completed successfully.
    55 records added in the database
    2 pages scraped
  • Created on morph.io

Scraper code

Python

us-governors / scraper.py