mhl / usa-legislators

Politicians in the House of Representatives and Senate of the United States of America

Scrapes raw.githubusercontent.com



Contributors: mhl, JoshData

Last run failed with status code 137.
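Exit status 137 follows the usual shell convention of 128 plus the fatal signal number, i.e. signal 9 (SIGKILL): the scraper process was killed from outside, typically because it exceeded morph.io's memory limit. This matches the "Killed" line at the very end of the console output below. A quick way to decode the status (illustrative snippet, not part of the scraper):

    import signal

    status = 137
    sig = status - 128            # exit codes above 128 mean "terminated by signal (status - 128)"
    print(sig == signal.SIGKILL)  # True: the run was killed, most likely for using too much memory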

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Stack changed, re-installing runtime
-----> Installing runtime (python-2.7.9)
-----> Installing dependencies with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r requirements.txt (line 6))
       Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to ./.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r requirements.txt (line 8))
         Downloading lxml-3.4.4.tar.gz (3.5MB)
         Building lxml version 3.4.4.
         Building without Cython.
         Using build configuration of libxslt 1.1.28
         /app/.heroku/python/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
           warnings.warn(msg)
       Collecting cssselect==0.9.1 (from -r requirements.txt (line 9))
         Downloading cssselect-0.9.1.tar.gz
       Collecting requests==2.7.0 (from -r requirements.txt (line 10))
         Downloading requests-2.7.0-py2.py3-none-any.whl (470kB)
       Collecting rtyaml==0.0.2 (from -r requirements.txt (line 11))
         Downloading rtyaml-0.0.2.tar.gz
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 6))
         Downloading dumptruck-0.1.6.tar.gz
       Collecting pyyaml (from rtyaml==0.0.2->-r requirements.txt (line 11))
         Downloading PyYAML-3.11.tar.gz (248kB)
       Installing collected packages: pyyaml, dumptruck, rtyaml, requests, cssselect, lxml, scraperwiki
         Running setup.py install for pyyaml
           checking if libyaml is compilable
           gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/app/.heroku/python/include/python2.7 -c build/temp.linux-x86_64-2.7/check_libyaml.c -o build/temp.linux-x86_64-2.7/check_libyaml.o
           build/temp.linux-x86_64-2.7/check_libyaml.c:2:18: fatal error: yaml.h: No such file or directory
            #include <yaml.h>
                             ^
           compilation terminated.
           libyaml is not found or a compiler error: forcing --without-libyaml
           (if libyaml is installed correctly, you may need to
            specify the option --include-dirs or uncomment and
            modify the parameter include_dirs in setup.cfg)
         Running setup.py install for dumptruck
         Running setup.py install for rtyaml
         Running setup.py install for cssselect
         Running setup.py install for lxml
           Building lxml version 3.4.4.
           Building without Cython.
           Using build configuration of libxslt 1.1.28
           /app/.heroku/python/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
             warnings.warn(msg)
           building 'lxml.etree' extension
           gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-fdcHTo/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
           gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -lxslt -lexslt -lxml2 -lz -lm -o build/lib.linux-x86_64-2.7/lxml/etree.so
           building 'lxml.objectify' extension
           gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-fdcHTo/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.objectify.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -w
           gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -lxslt -lexslt -lxml2 -lz -lm -o build/lib.linux-x86_64-2.7/lxml/objectify.so
         Running setup.py develop for scraperwiki
           Creating /app/.heroku/python/lib/python2.7/site-packages/scraperwiki.egg-link (link to .)
           Adding scraperwiki 0.3.7 to easy-install.pth file
           Installed /app/.heroku/src/scraperwiki
       Successfully installed cssselect-0.9.1 dumptruck-0.1.6 lxml-3.4.4 pyyaml-3.11 requests-2.7.0 rtyaml-0.0.2 scraperwiki
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper and running...
/start: line 24:    12 Killed                  setuidgid scraper $(eval echo ${command})
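The pip output names the line of requirements.txt that each dependency comes from, so the dependency section of that file can be reconstructed from the log. Lines 1–5 and 7 are not shown and are presumably comments or blank lines from morph.io's default Python template:

    # requirements.txt (lines 6 and 8-11, as reported by pip; earlier lines not shown)
    -e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki

    lxml==3.4.4
    cssselect==0.9.1
    requests==2.7.0
    rtyaml==0.0.2

The -e (editable) flag matches the "Obtaining scraperwiki" and "Running setup.py develop for scraperwiki" lines in the log.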

Statistics

Total run time: 2 minutes

Total CPU time used: less than 5 seconds

Total disk space used: 36.1 KB

History

  • Manually ran revision 8fc5517a and failed.
    nothing changed in the database
    1 page scraped
  • Created on morph.io

Scraper code

usa-legislators
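The code listing itself is not reproduced on this page, but the description ("Scrapes raw.githubusercontent.com"), the dependencies (requests, rtyaml, scraperwiki), and JoshData's involvement suggest a scraper along these lines. This is a minimal sketch, not the actual scraper: the URL and the YAML field names are assumptions based on the unitedstates/congress-legislators dataset, which is served from raw.githubusercontent.com.

    import requests
    import rtyaml
    import scraperwiki

    # Assumed source file; the real scraper may fetch a different file or several files.
    URL = ("https://raw.githubusercontent.com/unitedstates/"
           "congress-legislators/master/legislators-current.yaml")

    legislators = rtyaml.load(requests.get(URL).text)

    for person in legislators:
        term = person["terms"][-1]  # most recent term held by this legislator
        scraperwiki.sqlite.save(
            unique_keys=["bioguide"],
            data={
                "bioguide": person["id"]["bioguide"],
                "name": "%s %s" % (person["name"]["first"], person["name"]["last"]),
                "type": term["type"],    # "rep" (House) or "sen" (Senate)
                "state": term["state"],
                "party": term.get("party"),
            },
        )

On morph.io, scraperwiki.sqlite.save() writes rows into the scraper's data.sqlite database, which is what the "nothing changed in the database" note in the history above refers to.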