aytasrele / joblists

job lists


This is a scraper that runs on morph.io. To get started, see the documentation.

Contributors: aytasrele

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Stack changed, re-installing runtime
-----> Installing runtime (python-2.7.9)
-----> Installing dependencies with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to ./.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r requirements.txt (line 8))
         Downloading lxml-3.4.4.tar.gz (3.5MB)
         Building lxml version 3.4.4.
         Building without Cython.
         Using build configuration of libxslt 1.1.28
         /app/.heroku/python/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
           warnings.warn(msg)
       Collecting cssselect==0.9.1 (from -r requirements.txt (line 9))
         Downloading cssselect-0.9.1.tar.gz
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 6))
         Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r requirements.txt (line 6))
         Downloading requests-2.7.0-py2.py3-none-any.whl (470kB)
       Installing collected packages: requests, dumptruck, cssselect, lxml, scraperwiki
         Running setup.py install for dumptruck
         Running setup.py install for cssselect
         Running setup.py install for lxml
           Building lxml version 3.4.4.
           Building without Cython.
           Using build configuration of libxslt 1.1.28
           /app/.heroku/python/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
             warnings.warn(msg)
           building 'lxml.etree' extension
           gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-YLLrfY/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
           gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -lxslt -lexslt -lxml2 -lz -lm -o build/lib.linux-x86_64-2.7/lxml/etree.so
           building 'lxml.objectify' extension
           gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-YLLrfY/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.objectify.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -w
           gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -lxslt -lexslt -lxml2 -lz -lm -o build/lib.linux-x86_64-2.7/lxml/objectify.so
         Running setup.py develop for scraperwiki
           Creating /app/.heroku/python/lib/python2.7/site-packages/scraperwiki.egg-link (link to .)
           Adding scraperwiki 0.3.7 to easy-install.pth file
           Installed /app/.heroku/src/scraperwiki
       Successfully installed cssselect-0.9.1 dumptruck-0.1.6 lxml-3.4.4 requests-2.7.0 scraperwiki
-----> Discovering process types
       Procfile declares types -> scraper
Injecting scraper code and running...
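The pip output above shows which dependencies the project's requirements.txt pins (the log references entries at lines 6, 8 and 9; any comment lines and the exact layout of the real file are not shown on this page). A reconstruction consistent with that log would be:

    -e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
    lxml==3.4.4
    cssselect==0.9.1

The "Procfile declares types -> scraper" line at the end of the log indicates a single scraper process type, conventionally pointing at scraper.py (e.g. "scraper: python scraper.py"), though the actual Procfile contents are not reproduced here.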

Statistics

Average successful run time: 2 minutes

Total run time: 2 minutes

Total CPU time used: less than 5 seconds

Total disk space used: 18.8 KB

History

  • Manually ran revision 556ec806 and completed successfully.
    Nothing changed in the database.
  • Created on morph.io

Scraper code

Python

joblists / scraper.py
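The source of scraper.py is not reproduced on this page. For illustration only, a minimal morph.io Python scraper built on the dependencies installed above (scraperwiki, lxml, cssselect) typically looks something like the sketch below; the URL, CSS selectors and field names are hypothetical placeholders, not the actual joblists code.

    # Sketch of a typical morph.io scraper using scraperwiki + lxml.
    # NOT the real joblists scraper.py; URL, selectors and fields are guesses.
    import scraperwiki
    import lxml.html

    # Fetch the listing page (placeholder URL).
    html = scraperwiki.scrape("http://example.com/jobs")
    root = lxml.html.fromstring(html)

    # cssselect (installed above) lets lxml pick elements with CSS selectors.
    for row in root.cssselect("div.job-listing"):
        link = row.cssselect("a")[0]
        record = {
            "title": link.text_content().strip(),
            "url": link.get("href"),
        }
        # Rows are written to data.sqlite, which morph.io exposes after each run.
        scraperwiki.sqlite.save(unique_keys=["url"], data=record)

Using "url" as the unique key means re-running the scraper updates existing rows rather than duplicating them, which matches the "nothing changed in the database" outcome reported in the run history above.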