
Method to iterate all jobs/items/whatever #137

Open
andrewbaxter opened this issue Nov 13, 2019 · 5 comments

Comments

@andrewbaxter

The AWS boto client has a Paginator to help iterate within API result limits; while clunky, it's very nice to have, since it makes pagination hard to get wrong.

A method to iterate/list all results, or else a Paginator that hides the pagination parameters (which are easy to get wrong), would be super helpful. The 1000-job limit has been an issue in every project I've needed to use this library in.
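As a minimal sketch of the kind of helper being asked for, here is a generic paginator that hides the start/count bookkeeping behind a single iterator. The `fetch` callable and its `(start, count)` signature are hypothetical stand-ins for any paged API, not part of python-scrapinghub:

```python
def iter_all(fetch, chunk=1000):
    """Yield every result from a paged API, hiding start/count bookkeeping.

    `fetch` is any callable taking (start, count) and returning a list of
    at most `count` results; this helper is illustrative only.
    """
    start = 0
    while True:
        page = fetch(start, chunk)
        yield from page
        if len(page) < chunk:  # a short page means we've reached the end
            return
        start += chunk
```

Because the caller never touches `start` or `count` directly, the usual off-by-one and missed-last-page mistakes can't happen.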

@andrewbaxter
Author

I think this might be covered by #133? I don't see it in the docs yet, so I assume it hasn't been released.

@noviluni
Contributor

Hi @andrewbaxter
I see that a new version was released three days ago, and it includes the method implemented in #133: https://python-scrapinghub.readthedocs.io/en/latest/client/apidocs.html#scrapinghub.client.items.Items.list_iter

Is that what you were asking for?

@andrewbaxter
Author

Yeah, looks like exactly it! Thanks!

@hermit-crab
Contributor

The Jobs iterator is not covered by that change, as far as I recall (apologies if I'm wrong). And the items/logs/requests iterators were already unlimited before it. @andrewbaxter @noviluni
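Until jobs get the same treatment, the gap can be worked around by paging through jobs manually with `start`/`count` parameters. The sketch below is a hypothetical illustration: `StubJobs` fakes a jobs endpoint with a server-side 1000-result cap, and is not the real scrapinghub client:

```python
class StubJobs:
    """Stand-in for a jobs endpoint that caps each response at 1000 results.

    Purely illustrative; the real client's method names may differ.
    """
    def __init__(self, total, cap=1000):
        self._jobs = [{"key": f"1/2/{i}"} for i in range(total)]
        self._cap = cap

    def list(self, start=0, count=1000):
        count = min(count, self._cap)  # simulate the server-side limit
        return self._jobs[start:start + count]

def iter_jobs(jobs, page=1000):
    """Yield all jobs by advancing `start` until a short page comes back."""
    start = 0
    while True:
        batch = jobs.list(start=start, count=page)
        yield from batch
        if len(batch) < page:
            return
        start += len(batch)
```

The loop terminates on the first page shorter than `page`, so it makes exactly one extra-free pass over the data regardless of how many jobs exist beyond the 1000-result cap.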

@andrewbaxter
Author

andrewbaxter commented Dec 20, 2019

Ah, you're right. And that was why I was getting confused by the talk about increasing the chunksize argument. Thank you for setting me back on track here.

I think iterating jobs was what prompted me to open this in the first place.

@andrewbaxter andrewbaxter reopened this Dec 20, 2019