andyswarby / facebook

Facebook


Forked from ScraperWiki

Contributors andyswarby

Last run failed with status code 1.

Console output of last run

Morph internal error: read timeout reached
Stopping current container and requeueing

Traceback (most recent call last):
  File "/repo/scraper.py", line 18, in <module>
    results_json = simplejson.loads(scraperwiki.scrape(profile_url))
  File "/usr/local/lib/python2.7/dist-packages/scraperwiki-0.3.7-py2.7.egg/scraperwiki/utils.py", line 31, in scrape
    f = urllib2.urlopen(req)
  File "/usr/lib/python2.7/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.7/urllib2.py", line 400, in open
    response = self._open(req, data)
  File "/usr/lib/python2.7/urllib2.py", line 418, in _open
    '_open', req)
  File "/usr/lib/python2.7/urllib2.py", line 378, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 1207, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.7/urllib2.py", line 1177, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [Errno 101] Network is unreachable>
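The traceback shows the run dying on a network-level `URLError` raised inside `scraperwiki.scrape` at line 18 of scraper.py. A minimal sketch of how such a call could be guarded with retries — `fetch_with_retries` is a hypothetical helper, not part of the original scraper, and it uses Python 3's `urllib.error.URLError` (the Python 2.7 run above raises the equivalent `urllib2.URLError`):

```python
import time
from urllib.error import URLError  # urllib2.URLError in the original Python 2.7 run


def fetch_with_retries(fetch, url, attempts=3, delay=1.0):
    """Call fetch(url), retrying on URLError with a linear back-off.

    Re-raises the last URLError if every attempt fails, so the caller
    still sees the original network failure.
    """
    last_error = None
    for attempt in range(attempts):
        try:
            return fetch(url)
        except URLError as err:
            last_error = err
            time.sleep(delay * (attempt + 1))  # wait longer after each failure
    raise last_error
```

A transient "Network is unreachable" would then only kill the run after all attempts fail, rather than on the first failed request.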

Data

Downloaded 1 time by MikeRalphson


  • Download table (as CSV)
  • Download SQLite database (243 KB)
  • Use the API

Showing 10 of 2311 rows

username         | first_name | last_name | name           | locale | gender | link | id
kayode.ogunro    | Kayode     | Ogunro    | Kayode Ogunro  | en_US  | male   |      | 362
conniez          | Connie     | Zong      | Connie Zong    | en_US  | female |      | 363
benet.magnuson   | Benet      | Magnuson  | Benet Magnuson | en_US  | male   |      | 364
yan.zhao         | Yan        | Zhao      | Yan Zhao       | en_US  | female |      | 365
miscellena       | Ellen      | Ching     | Ellen Ching    | en_US  | female |      | 366
lolandra         | Lola       | Ajilore   | Lola Ajilore   | en_US  | female |      | 367
felipe.tewes     | Felipe     | Tewes     | Felipe Tewes   | en_US  | male   |      | 368
emily.riehl.9    | Emily      | Riehl     | Emily Riehl    | en_US  | female |      | 369
katherine.gelber | Katie      | Gelber    | Katie Gelber   | en_US  | female |      | 370
willjadams       | Will       | Adams     | Will Adams     | en_US  |        |      | 371
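The downloaded SQLite database can be queried with Python's `sqlite3` module. A sketch assuming the default morph.io/ScraperWiki table name `data` and the columns shown above; it builds an in-memory copy with two illustrative rows from the sample, since the real file isn't available here:

```python
import sqlite3

# Hypothetical in-memory stand-in for the downloaded database.
# Against the real download, use sqlite3.connect("data.sqlite") instead.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE data (username TEXT, first_name TEXT, last_name TEXT, "
    "name TEXT, locale TEXT, gender TEXT, link TEXT, id INTEGER)"
)
rows = [
    ("kayode.ogunro", "Kayode", "Ogunro", "Kayode Ogunro", "en_US", "male", None, 362),
    ("conniez", "Connie", "Zong", "Connie Zong", "en_US", "female", None, 363),
]
conn.executemany("INSERT INTO data VALUES (?, ?, ?, ?, ?, ?, ?, ?)", rows)

# Count rows per gender, the kind of aggregate the full 2311-row table supports.
counts = dict(conn.execute("SELECT gender, COUNT(*) FROM data GROUP BY gender"))
```

The same `SELECT` could also be passed as the `query` parameter of the morph.io API instead of downloading the database.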

Statistics

Total run time: 25 minutes

Total cpu time used: less than 5 seconds

Total disk space used: 261 KB

History

  • Manually ran revision 6dfecfef and failed.
    226 records added, 226 records removed in the database
  • Forked from ScraperWiki

Scraper code

Python

facebook / scraper.py