Injecting configuration and compiling...
-----> Python app detected
-----> Stack changed, re-installing runtime
-----> Installing runtime (python-2.7.9)
-----> Installing dependencies with pip
       Obtaining scraperwiki from git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki (from -r requirements.txt (line 6))
         Cloning http://github.com/openaustralia/scraperwiki-python.git (to morph_defaults) to ./.heroku/src/scraperwiki
       Collecting lxml==3.4.4 (from -r requirements.txt (line 8))
         Downloading lxml-3.4.4.tar.gz (3.5MB)
         Building lxml version 3.4.4.
         Building without Cython.
         Using build configuration of libxslt 1.1.28
         /app/.heroku/python/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
           warnings.warn(msg)
       Collecting cssselect==0.9.1 (from -r requirements.txt (line 9))
         Downloading cssselect-0.9.1.tar.gz
       Collecting beautifulsoup4 (from -r requirements.txt (line 10))
         Downloading beautifulsoup4-4.4.0-py2-none-any.whl (81kB)
       Collecting python-dateutil (from -r requirements.txt (line 11))
         Downloading python_dateutil-2.4.2-py2.py3-none-any.whl (188kB)
       Collecting dumptruck>=0.1.2 (from scraperwiki->-r requirements.txt (line 6))
         Downloading dumptruck-0.1.6.tar.gz
       Collecting requests (from scraperwiki->-r requirements.txt (line 6))
         Downloading requests-2.7.0-py2.py3-none-any.whl (470kB)
       Collecting six>=1.5 (from python-dateutil->-r requirements.txt (line 11))
         Downloading six-1.9.0-py2.py3-none-any.whl
       Installing collected packages: six, requests, dumptruck, python-dateutil, beautifulsoup4, cssselect, lxml, scraperwiki
         Running setup.py install for dumptruck
         Running setup.py install for cssselect
         Running setup.py install for lxml
           Building lxml version 3.4.4.
           Building without Cython.
           Using build configuration of libxslt 1.1.28
           /app/.heroku/python/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
             warnings.warn(msg)
           building 'lxml.etree' extension
           gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-OhtDA_/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
           gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -lxslt -lexslt -lxml2 -lz -lm -o build/lib.linux-x86_64-2.7/lxml/etree.so
           building 'lxml.objectify' extension
           gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/include/libxml2 -I/tmp/pip-build-OhtDA_/lxml/src/lxml/includes -I/app/.heroku/python/include/python2.7 -c src/lxml/lxml.objectify.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -w
           gcc -pthread -shared build/temp.linux-x86_64-2.7/src/lxml/lxml.objectify.o -lxslt -lexslt -lxml2 -lz -lm -o build/lib.linux-x86_64-2.7/lxml/objectify.so
         Running setup.py develop for scraperwiki
           Creating /app/.heroku/python/lib/python2.7/site-packages/scraperwiki.egg-link (link to .)
           Adding scraperwiki 0.3.7 to easy-install.pth file
           Installed /app/.heroku/src/scraperwiki
       Successfully installed beautifulsoup4-4.4.0 cssselect-0.9.1 dumptruck-0.1.6 lxml-3.4.4 python-dateutil-2.4.2 requests-2.7.0 scraperwiki six-1.9.0
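Note: the pip output above refers to specific lines of the scraper's requirements.txt. Only lines 6 and 8-11 are visible in this log; a rough reconstruction of just those lines is sketched below (the "Obtaining"/"setup.py develop" messages indicate an editable install for scraperwiki, hence the -e flag; lines 1-5 and 7 are not shown in the log and are omitted).

    # requirements.txt -- partial sketch reconstructed from the log above
    -e git+http://github.com/openaustralia/scraperwiki-python.git@morph_defaults#egg=scraperwiki
    lxml==3.4.4
    cssselect==0.9.1
    beautifulsoup4
    python-dateutil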
-----> Discovering process types
       Procfile declares types -> scraper
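Note: the Procfile itself is not shown in this log; for a Python scraper of this kind it would declare a single process type named scraper, roughly as in the sketch below (the exact command and file name are assumptions, not taken from the log).

    scraper: python scraper.py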
Injecting scraper and running...
/app/.heroku/python/lib/python2.7/site-packages/bs4/__init__.py:166: UserWarning: No parser was explicitly specified, so I'm using the best available HTML parser for this system ("lxml"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently.
To get rid of this warning, change this:
BeautifulSoup([your markup])
to this:
BeautifulSoup([your markup], "lxml")
markup_type=markup_type))
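Note: the UserWarning above is harmless here, but it can be silenced by naming the parser explicitly, exactly as the message suggests. A minimal sketch of the fix, assuming the scraper builds its soup from a page fetched with requests (the URL and variable names are placeholders, not taken from the actual scraper code):

    import requests
    from bs4 import BeautifulSoup

    html = requests.get("http://example.com/").text
    # Passing "lxml" (installed during the build above) pins the parser,
    # so the scraper behaves the same on whatever system it runs on.
    soup = BeautifulSoup(html, "lxml")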
E0101_BNESC_gov_2015_01
E0101_BNESC_gov_2015_02
E0101_BNESC_gov_2015_03
E0101_BNESC_gov_2014_01
E0101_BNESC_gov_2014_02
E0101_BNESC_gov_2014_03
E0101_BNESC_gov_2014_04
E0101_BNESC_gov_2014_05
E0101_BNESC_gov_2014_06
E0101_BNESC_gov_2014_07
E0101_BNESC_gov_2014_08
E0101_BNESC_gov_2014_09
E0101_BNESC_gov_2014_10
E0101_BNESC_gov_2014_11
E0101_BNESC_gov_2014_12
E0101_BNESC_gov_2013_01
E0101_BNESC_gov_2013_02
E0101_BNESC_gov_2013_03
E0101_BNESC_gov_2013_04
E0101_BNESC_gov_2013_05
E0101_BNESC_gov_2013_06
E0101_BNESC_gov_2013_07
E0101_BNESC_gov_2013_08
E0101_BNESC_gov_2013_09
E0101_BNESC_gov_2013_10
E0101_BNESC_gov_2013_11
E0101_BNESC_gov_2013_12
E0101_BNESC_gov_2012_12
E0101_BNESC_gov_2012_11
E0101_BNESC_gov_2012_10
E0101_BNESC_gov_2012_09
E0101_BNESC_gov_2012_08
E0101_BNESC_gov_2012_07
E0101_BNESC_gov_2012_06
E0101_BNESC_gov_2012_05
E0101_BNESC_gov_2012_04
E0101_BNESC_gov_2012_03
E0101_BNESC_gov_2012_02
E0101_BNESC_gov_2012_01
E0101_BNESC_gov_2011_12
E0101_BNESC_gov_2011_11
E0101_BNESC_gov_2011_10
E0101_BNESC_gov_2011_09
E0101_BNESC_gov_2011_08
E0101_BNESC_gov_2011_07
E0101_BNESC_gov_2011_06
E0101_BNESC_gov_2011_05
E0101_BNESC_gov_2011_04
E0101_BNESC_gov_2011_03
E0101_BNESC_gov_2011_02
E0101_BNESC_gov_2011_01
E0101_BNESC_gov_2010_12