This is a scraper that runs on morph.io. To get started, see the documentation.

Contributors: g0rd

Last run completed successfully.

Console output of last run

Injecting configuration and compiling...
-----> Python app detected
-----> Stack changed, re-installing runtime
-----> Installing runtime (python-2.7.9)
-----> Installing dependencies with pip
Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r requirements.txt (line 6))
Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to ./.heroku/src/scraperwiki
Collecting lxml==3.4.4 (from -r requirements.txt (line 8))
Downloading lxml-3.4.4.tar.gz (3.5MB)
Building lxml version 3.4.4.
Building without Cython.
Using build configuration of libxslt 1.1.28
/app/.heroku/python/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
warnings.warn(msg)
Collecting cssselect==0.9.1 (from -r requirements.txt (line 9))
Downloading cssselect-0.9.1.tar.gz
Collecting splinter>=0.7.3 (from -r requirements.txt (line 10))
Downloading splinter-0.7.3.tar.gz
Collecting scrapy==0.24 (from -r requirements.txt (line 11))
Downloading Scrapy-0.24.0-py2-none-any.whl (501kB)
Collecting service-identity==14.0.0 (from -r requirements.txt (line 12))
Downloading service_identity-14.0.0-py2.py3-none-any.whl
Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 6))
Downloading dumptruck-0.1.6.tar.gz
Collecting requests (from scraperwiki->-r requirements.txt (line 6))
Downloading requests-2.7.0-py2.py3-none-any.whl (470kB)
Collecting selenium>=2.47.1 (from splinter>=0.7.3->-r requirements.txt (line 10))
Downloading selenium-2.47.1-py2-none-any.whl (3.0MB)
Collecting queuelib (from scrapy==0.24->-r requirements.txt (line 11))
Downloading queuelib-1.3.0-py2.py3-none-any.whl
Collecting pyOpenSSL (from scrapy==0.24->-r requirements.txt (line 11))
Downloading pyOpenSSL-0.15.1-py2.py3-none-any.whl (102kB)
Collecting Twisted>=10.0.0 (from scrapy==0.24->-r requirements.txt (line 11))
Downloading Twisted-15.4.0.tar.bz2 (3.1MB)
Collecting six>=1.5.2 (from scrapy==0.24->-r requirements.txt (line 11))
Downloading six-1.9.0-py2.py3-none-any.whl
Collecting w3lib>=1.2 (from scrapy==0.24->-r requirements.txt (line 11))
Downloading w3lib-1.12.0-py2.py3-none-any.whl
Collecting characteristic>=14.0.0 (from service-identity==14.0.0->-r requirements.txt (line 12))
Downloading characteristic-14.3.0-py2.py3-none-any.whl
Collecting pyasn1-modules (from service-identity==14.0.0->-r requirements.txt (line 12))
Downloading pyasn1-modules-0.0.7.tar.gz
Collecting pyasn1 (from service-identity==14.0.0->-r requirements.txt (line 12))
Downloading pyasn1-0.1.8.tar.gz (75kB)
Collecting cryptography>=0.7 (from pyOpenSSL->scrapy==0.24->-r requirements.txt (line 11))
Downloading cryptography-1.0.tar.gz (331kB)
Collecting zope.interface>=3.6.0 (from Twisted>=10.0.0->scrapy==0.24->-r requirements.txt (line 11))
Downloading zope.interface-4.1.2.tar.gz (919kB)
Collecting idna>=2.0 (from cryptography>=0.7->pyOpenSSL->scrapy==0.24->-r requirements.txt (line 11))
Downloading idna-2.0-py2.py3-none-any.whl (61kB)
Collecting enum34 (from cryptography>=0.7->pyOpenSSL->scrapy==0.24->-r requirements.txt (line 11))
Downloading enum34-1.0.4.tar.gz
Collecting ipaddress (from cryptography>=0.7->pyOpenSSL->scrapy==0.24->-r requirements.txt (line 11))
Downloading ipaddress-1.0.14-py27-none-any.whl
Collecting cffi>=1.1.0 (from cryptography>=0.7->pyOpenSSL->scrapy==0.24->-r requirements.txt (line 11))
Downloading cffi-1.2.1.tar.gz (335kB)
Collecting pycparser (from cffi>=1.1.0->cryptography>=0.7->pyOpenSSL->scrapy==0.24->-r requirements.txt (line 11))
Downloading pycparser-2.14.tar.gz (223kB)
Installing collected packages: pycparser, cffi, ipaddress, enum34, idna, zope.interface, cryptography, pyasn1, pyasn1-modules, characteristic, w3lib, six, Twisted, pyOpenSSL, queuelib, selenium, requests, dumptruck, service-identity, scrapy, splinter, cssselect, lxml, scraperwiki
Running setup.py install for pycparser
Build the lexing/parsing tables
Running setup.py install for cffi
building '_cffi_backend' extension
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -DUSE__THREAD -I/app/.heroku/python/include/python2.7 -c c/_cffi_backend.c -o build/temp.linux-x86_64-2.7/c/_cffi_backend.o
gcc -pthread -shared build/temp.linux-x86_64-2.7/c/_cffi_backend.o -lffi -o build/lib.linux-x86_64-2.7/_cffi_backend.so
Running setup.py install for enum34
Running setup.py install for zope.interface
building 'zope.interface._zope_interface_coptimizations' extension
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/app/.heroku/python/include/python2.7 -c src/zope/interface/_zope_interface_coptimizations.c -o build/temp.linux-x86_64-2.7/src/zope/interface/_zope_interface_coptimizations.o
gcc -pthread -shared build/temp.linux-x86_64-2.7/src/zope/interface/_zope_interface_coptimizations.o -o build/lib.linux-x86_64-2.7/zope/interface/_zope_interface_coptimizations.so
Skipping installation of /app/.heroku/python/lib/python2.7/site-packages/zope/__init__.py (namespace package)
Installing /app/.heroku/python/lib/python2.7/site-packages/zope.interface-4.1.2-py2.7-nspkg.pth
Running setup.py install for cryptography
generating cffi module 'build/temp.linux-x86_64-2.7/_padding.c'
generating cffi module 'build/temp.linux-x86_64-2.7/_constant_time.c'
generating cffi module 'build/temp.linux-x86_64-2.7/_openssl.c'
building '_openssl' extension
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/app/.heroku/python/include/python2.7 -c build/temp.linux-x86_64-2.7/_openssl.c -o build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_openssl.o
gcc -pthread -shared build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_openssl.o -lssl -lcrypto -o build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/_openssl.so
building '_constant_time' extension
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/app/.heroku/python/include/python2.7 -c build/temp.linux-x86_64-2.7/_constant_time.c -o build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_constant_time.o
gcc -pthread -shared build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_constant_time.o -o build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/_constant_time.so
building '_padding' extension
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/app/.heroku/python/include/python2.7 -c build/temp.linux-x86_64-2.7/_padding.c -o build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_padding.o
gcc -pthread -shared build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_padding.o -o build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/_padding.so
Running setup.py install for pyasn1
Running setup.py install for pyasn1-modules
Running setup.py install for Twisted
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/app/.heroku/python/include/python2.7 -c conftest.c -o conftest.o
building 'twisted.test.raiser' extension
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/app/.heroku/python/include/python2.7 -c twisted/test/raiser.c -o build/temp.linux-x86_64-2.7/twisted/test/raiser.o
gcc -pthread -shared build/temp.linux-x86_64-2.7/twisted/test/raiser.o -o build/lib.linux-x86_64-2.7/twisted/test/raiser.so
building 'twisted.python._sendmsg' extension
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/app/.heroku/python/include/python2.7 -c twisted/python/_sendmsg.c -o build/temp.linux-x86_64-2.7/twisted/python/_sendmsg.o
gcc -pthread -shared build/temp.linux-x86_64-2.7/twisted/python/_sendmsg.o -o build/lib.linux-x86_64-2.7/twisted/python/_sendmsg.so
building 'twisted.runner.portmap' extension
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/app/.heroku/python/include/python2.7 -c twisted/runner/portmap.c -o build/temp.linux-x86_64-2.7/twisted/runner/portmap.o
gcc -pthread -shared build/temp.linux-x86_64-2.7/twisted/runner/portmap.o -o build/lib.linux-x86_64-2.7/twisted/runner/portmap.so
changing mode of build/scripts-2.7/tkconch from 644 to 755
changing mode of build/scripts-2.7/conch from 644 to 755
changing mode of build/scripts-2.7/pyhtmlizer from 644 to 755
changing mode of build/scripts-2.7/tap2rpm from 644 to 755
changing mode of build/scripts-2.7/manhole from 644 to 755
changing mode of build/scripts-2.7/ckeygen from 644 to 755
changing mode of build/scripts-2.7/tap2deb from 644 to 755
changing mode of build/scripts-2.7/mailmail from 644 to 755
changing mode of build/scripts-2.7/trial from 644 to 755
changing mode of build/scripts-2.7/twistd from 644 to 755
changing mode of build/scripts-2.7/cftp from 644 to 755
changing mode of /app/.heroku/python/bin/tkconch to 755
changing mode of /app/.heroku/python/bin/conch to 755
changing mode of /app/.heroku/python/bin/pyhtmlizer to 755
changing mode of /app/.heroku/python/bin/tap2rpm to 755
changing mode of /app/.heroku/python/bin/manhole to 755
changing mode of /app/.heroku/python/bin/ckeygen to 755
changing mode of /app/.heroku/python/bin/tap2deb to 755
changing mode of /app/.heroku/python/bin/mailmail to 755
changing mode of /app/.heroku/python/bin/trial to 755
changing mode of /app/.heroku/python/bin/twistd to 755
changing mode of /app/.heroku/python/bin/cftp to 755
Compiling /tmp/pip-build-cf1_q8/selenium/selenium/test/selenium/webdriver/browser_specific_template.py
Running setup.py install for dumptruck
Running setup.py install for splinter
Running setup.py install for cssselect
Running setup.py install for lxml
Building lxml version 3.4.4.
Building without Cython.
Using build configuration of libxslt 1.1.28
/app/.heroku/python/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
warnings.warn(msg)
building 'lxml.etree' extension
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-cf1_q8/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -lxslt -lexslt -lxml2 -lz -lm -o build/lib.linux-x86_64-2.7/lxml/etree.so
building 'lxml.objectify' extension
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-cf1_q8/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.objectify.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -w
gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -lxslt -lexslt -lxml2 -lz -lm -o build/lib.linux-x86_64-2.7/lxml/objectify.so
Running setup.py develop for scraperwiki
Creating /app/.heroku/python/lib/python2.7/site-packages/scraperwiki.egg-link (link to .)
Adding scraperwiki 0.3.7 to easy-install.pth file
Installed /app/.heroku/src/scraperwiki
Successfully installed Twisted-15.4.0 cffi-1.2.1 characteristic-14.3.0 cryptography-1.0 cssselect-0.9.1 dumptruck-0.1.6 enum34-1.0.4 idna-2.0 ipaddress-1.0.14 lxml-3.4.4 pyOpenSSL-0.15.1 pyasn1-0.1.8 pyasn1-modules-0.0.7 pycparser-2.14 queuelib-1.3.0 requests-2.7.0 scraperwiki scrapy-0.24.0 selenium-2.47.1 service-identity-14.0.0 six-1.9.0 splinter-0.7.3 w3lib-1.12.0 zope.interface-4.1.2
-----> Discovering process types
Procfile declares types -> scraper
Injecting scraper and running...
Scrapy 0.24.0 - no active project
Unknown command: crawl
Use "scrapy" to see available commands
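
Note: the build itself succeeds, but the run ends with Scrapy reporting "no active project" and "Unknown command: crawl". Scrapy only exposes the crawl command when it is started inside a project directory containing a scrapy.cfg, so the command the Procfile runs (not shown above, presumably something like "scrapy crawl <spider>") has no project to work with. One way around this on morph.io is to start the spider programmatically from scraper.py, so the platform's usual "python scraper.py" entry point works. The sketch below uses the Scrapy 0.24-era "run from a script" API; the spider class, its start URL and the log message are illustrative placeholders, not this repository's code.

# Hypothetical sketch (Scrapy 0.24-era "run from a script" API); the spider,
# its start URL and the log message are placeholders, not this repo's code.
from twisted.internet import reactor
from scrapy import log, signals
from scrapy.crawler import Crawler
from scrapy.settings import Settings
from scrapy.spider import Spider


class ExampleSpider(Spider):
    name = "example"
    start_urls = ["http://example.com/"]  # placeholder target

    def parse(self, response):
        self.log("fetched %s" % response.url)


crawler = Crawler(Settings())
crawler.signals.connect(reactor.stop, signal=signals.spider_closed)
crawler.configure()
crawler.crawl(ExampleSpider())
crawler.start()
log.start()
reactor.run()  # blocks until the spider has finished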

Statistics

Average successful run time: 1 minute

Total run time: about 2 hours

Total cpu time used: 5 minutes

Total disk space used: 150 KB

History

  • Manually ran revision e3ae7399 and completed successfully.
    nothing changed in the database
  • Manually ran revision 1be7ae1d and completed successfully.
    nothing changed in the database
    2 pages scraped
  • Manually ran revision 3a86dd76 and failed.
    nothing changed in the database
  • Manually ran revision d161339d and failed.
    nothing changed in the database
    22 pages scraped
  • Manually ran revision d93dcbd0 and completed successfully.
    nothing changed in the database
    2 pages scraped
  • Manually ran revision 6c88c42b and completed successfully.
    nothing changed in the database
    2 pages scraped
  • Manually ran revision 37e79e80 and completed successfully.
    nothing changed in the database
    1 page scraped
  • Manually ran revision cbc4c83a and completed successfully.
    nothing changed in the database
    1 page scraped
  • Manually ran revision 9b69d5d6 and completed successfully.
    nothing changed in the database
    1 page scraped
  • Manually ran revision 9735f27b and failed.
    nothing changed in the database
    1 page scraped
  • Manually ran revision 6d75fb32 and failed.
    nothing changed in the database
    1 page scraped
  • Manually ran revision e1f596e8 and failed.
    nothing changed in the database
    1 page scraped
  • Manually ran revision bcec1199 and failed.
    nothing changed in the database
    1 page scraped
  • Manually ran revision f5820695 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision fa9069fd and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 2d103633 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 95ce2f52 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision d4b7c971 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 3ebd5e05 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 8522bd41 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 6ebe12c1 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision b98ab0d3 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision c020f329 and failed.
    nothing changed in the database
  • Manually ran revision 18bcb329 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision c0db2d07 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision ce974ca9 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 7e473dee and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision e7ffb576 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 428a04f9 and failed.
    nothing changed in the database
  • Manually ran revision 480a53f4 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 1c441d85 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 0ac5eb3b and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision d10057cc and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 5fe72c4b and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 8b653e54 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision c3f2e734 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision e5cc4f86 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 640580aa and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision d960ec48 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 1516ad36 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision ecba9d92 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 4f4cf011 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 4f4cf011 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision aaad0da5 and completed successfully.
    nothing changed in the database
    144 pages scraped
  • Manually ran revision 35a405e2 and completed successfully.
    nothing changed in the database
    89 pages scraped
  • Manually ran revision 0b47894f and completed successfully.
    nothing changed in the database
    89 pages scraped
  • Manually ran revision a4472f1f and failed.
    nothing changed in the database
    37 pages scraped
  • Manually ran revision 562d4425 and failed.
    nothing changed in the database
    26 pages scraped
  • Manually ran revision 9434f2b7 and failed.
    nothing changed in the database
    26 pages scraped
  • Manually ran revision b1adeea2 and failed.
    nothing changed in the database
    88 pages scraped
  • Manually ran revision ab6dd416 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 5dbdb4a1 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 3152ed54 and completed successfully.
    nothing changed in the database
    88 pages scraped
  • Manually ran revision ebc23b6c and completed successfully.
    nothing changed in the database
    74 pages scraped
  • Manually ran revision c4bebd9f and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision d95ff95e and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 1e4e37bc and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision a0653f91 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 16a55786 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 643516d8 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 13207518 and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 5f15348e and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision 682ed2cd and completed successfully.
    nothing changed in the database
    58 pages scraped
  • Manually ran revision f2317dc0 and completed successfully.
    nothing changed in the database
    57 pages scraped
  • Manually ran revision 5a0d876c and failed.
    nothing changed in the database
    35 pages scraped
  • Manually ran revision abf45994 and failed.
    nothing changed in the database
  • Manually ran revision 94cb1aa4 and failed.
    nothing changed in the database
  • Manually ran revision 2048dbfa and completed successfully.
    nothing changed in the database
    26 pages scraped
  • Manually ran revision 6d197048 and failed.
    nothing changed in the database
  • Manually ran revision 8fd0217b and failed.
    nothing changed in the database
  • Manually ran revision 01b532e5 and failed.
    nothing changed in the database
  • Manually ran revision 8d3cf6b8 and failed.
    nothing changed in the database
  • Manually ran revision 8d3cf6b8 and failed.
    nothing changed in the database
  • Manually ran revision 8324e567 and failed.
    nothing changed in the database
  • Created on morph.io

Scraper code

Python

cet_lebanon / scraper.py
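
The contents of scraper.py are not reproduced on this page. For reference, morph.io expects a Python scraper to write whatever it extracts into an SQLite file called data.sqlite, normally through the scraperwiki library pinned in requirements.txt; the repeated "nothing changed in the database" entries in the history above suggest no rows were being saved. A minimal sketch of that convention follows, with the URL, CSS selector and field names as placeholders rather than this scraper's actual logic.

# Hypothetical morph.io-style scraper sketch; URL, selector and fields are placeholders.
import scraperwiki   # fetching helper plus the sqlite save API
import lxml.html     # matches the lxml/cssselect pins in requirements.txt

html = scraperwiki.scrape("http://example.com/listing")  # fetch one page
root = lxml.html.fromstring(html)

for item in root.cssselect("div.record"):  # placeholder selector
    record = {
        "url": item.cssselect("a")[0].get("href"),
        "title": item.text_content().strip(),
    }
    # Rows saved here land in data.sqlite, which morph.io displays; without a
    # save call every run ends with "nothing changed in the database".
    scraperwiki.sqlite.save(unique_keys=["url"], data=record)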