- remove official support for Python 2.7, 3.5, 3.6 and 3.7
- add official support for Python 3.11 and 3.12
- use the SHUB_APIURL and SHUB_STORAGE environment variables as default values when available
- fix the warning when msgpack is not installed (it used to say msgpack-python)
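The environment-variable defaults mentioned above follow a common pattern: read the variable if set, otherwise fall back to a built-in value. A minimal sketch of the idea (the helper name and the placeholder URLs are illustrative assumptions, not part of the library's API):

```python
import os

def endpoint_defaults():
    """Illustrative helper: resolve service endpoints from the environment,
    falling back to hard-coded defaults when the variables are unset.
    The fallback URLs here are placeholders, not the library's real values."""
    return {
        "endpoint": os.environ.get("SHUB_APIURL", "https://example.invalid/api"),
        "storage": os.environ.get("SHUB_STORAGE", "https://example.invalid/storage"),
    }
```

With this pattern, an explicit constructor argument can still override the resolved value, so the environment only supplies defaults.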
- add Python 3.10 support
- improve the iter() fallback when getting the meta argument
- update msgpack dependency
- drop official support for Python 3.4
- improve documentation, address a few PEP8 complaints
- add items.list_iter method to iterate by chunks
- fix retrying logic for HTTP errors
- improve documentation
- use jobs.cancel command name to maintain consistency
- provide basic documentation for the new feature
- add a command to cancel multiple jobs per call
- normalize and simplify using VCR.py cassettes in tests
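Iterating by chunks, as the items.list_iter method above does, can be sketched generically. This is an illustrative helper showing the chunked-iteration idea, not the library's actual implementation:

```python
from itertools import islice

def chunked(iterable, chunksize):
    """Yield successive lists of up to `chunksize` items from `iterable`.
    Illustrates the idea behind iterating a job's items chunk by chunk
    instead of materializing the whole result set at once."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, chunksize))
        if not chunk:
            return
        yield chunk
```

Chunked iteration keeps memory usage bounded when a job has produced far more items than fit comfortably in a single list.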
- add Python 3.7 support
- update msgpack dependency
- fix iter logic for items/requests/logs
- add truncate method to collections
- improve documentation
- add an option to schedule jobs with custom environment variables
- fallback to SHUB_JOBAUTH environment variable if SH_APIKEY is not set
- provide a unified connection timeout used by both internal clients
- increase the chunk size used when working with the items stats endpoint
Python 3.3 is considered unmaintained.
- fix iter logic when applying single count param
- add support for TZ-aware datetime objects
- better tests
- add a client parameter to disable msgpack use
- add VCR.py json-serialized tests
- make parent param optional for requests.add
- improve documentation
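Supporting TZ-aware datetime objects, as noted above, typically means normalizing them to UTC epoch milliseconds before they reach the API. A sketch of that conversion (the function name is an assumption, not the library's):

```python
from datetime import datetime, timezone, timedelta

def to_epoch_millis(dt):
    """Convert a TZ-aware datetime to UTC milliseconds since the epoch.
    For aware datetimes, timestamp() already accounts for the UTC offset."""
    return int(dt.timestamp() * 1000)

# Two representations of the same instant in different zones:
utc = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
cet = datetime(2024, 1, 1, 13, 0, tzinfo=timezone(timedelta(hours=1)))
```

Because the offset is folded into the timestamp, both values above map to the same epoch millisecond count.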
Major release with a lot of new features.
- new powerful ScrapinghubClient takes the best from Connection and HubstorageClient and combines it under a single interface
- documentation is available on Read The Docs (2.0.0)
- python-hubstorage merged into python-scrapinghub
- all tests are improved and rewritten with py.test
- hubstorage tests use vcrpy cassettes, work faster and don't require any external services to run
python-hubstorage is going to be deprecated; its next version will contain a deprecation warning and a proposal to use python-scrapinghub >= 1.9.0 instead.
- python 3 support & unittests
- add retries on httplib.HTTPException
- update scrapinghub api endpoint
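Retrying on httplib.HTTPException, as added above, can be sketched with a small helper. This is illustrative only; the library's actual retry logic and parameters differ (httplib is http.client on Python 3):

```python
import time
from http.client import HTTPException  # named httplib on Python 2

def call_with_retries(func, attempts=3, delay=0.1):
    """Call func(), retrying on HTTPException up to `attempts` times
    with a fixed delay between tries; re-raise after the final failure."""
    for attempt in range(1, attempts + 1):
        try:
            return func()
        except HTTPException:
            if attempt == attempts:
                raise
            time.sleep(delay)
```

A fixed delay keeps the sketch short; real clients usually prefer exponential backoff so repeated failures back off progressively.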
- basic py3.3 compatibility while keeping py2.7 compatibility
- update setup.py classifiers
- fix travis workaround deploying on tags
- packaging improvements
- cleaner implementation of project.job()
- support retrieving a fixed number of items
- switch to dash secure endpoint
- log download failure as error only if all attempts exhausted
- update travis config to match travis-ci (pypy updated to 2.2)
- update pypi credentials
- add python 3 to travis-ci matrix
- tox, travis-ci and pypi uploads
- pypi uploads only on Python 2.7 success
- run tests under pypy 2.1 in travis-ci
- add bindings for autoscraping api
- add a way to set starting offset
- support requesting meta fields
- resume item downloads on network errors
- add support for stopping a job
- project.name is deprecated in favour of project.id
- use stricter arguments for Connection constructor
- point to dash.scrapinghub.com api endpoint by default
- enable streaming with requests >= 1.0
- added automatic retries to item downloads when a request fails
- report correct version on user-agent string
- ported to use the Requests library (instead of urllib2)
- added support for gzip transfer encoding to increase API throughput on low bandwidth connections
- deprecated the first url argument of the scrapinghub.Connection object
- added support for loading API key from SH_APIKEY environment variable
First release of python-scrapinghub.