RATE_LIMIT_EXCEEDED with df_to_sheet #94
Comments
Hi! What kind of operations are you doing in these daily updates (what parameters are you using)? There are a lot of requests happening for simple things, just to keep the metadata up to date. For example, any request that changes frozen rows/cols, merged rows/cols, etc. will refresh the metadata. There are definitely ways to improve the current flow by using batch requests, and maybe by having some sort of flag that avoids refreshing metadata automatically unless it absolutely needs to.
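As a rough sketch of the batching idea (not something the library does today): gspread, which gspread-pandas wraps, exposes the raw spreadsheets.batchUpdate endpoint as Spreadsheet.batch_update, so several metadata-changing requests can share a single round trip. The spread.spread attribute and the sheetId of 0 below are assumptions for illustration.

```python
# Sketch: combine several metadata-changing requests into one batchUpdate
# call instead of issuing them (and refreshing metadata) one at a time.
# Assumes `spread` is a gspread_pandas.Spread and that the underlying
# gspread Spreadsheet object is reachable as `spread.spread`.
requests = [
    {  # freeze the header row
        "updateSheetProperties": {
            "properties": {"sheetId": 0, "gridProperties": {"frozenRowCount": 1}},
            "fields": "gridProperties.frozenRowCount",
        }
    },
    {  # resize the sheet in the same round trip
        "updateSheetProperties": {
            "properties": {
                "sheetId": 0,
                "gridProperties": {"rowCount": 1000, "columnCount": 26},
            },
            "fields": "gridProperties.rowCount,gridProperties.columnCount",
        }
    },
]

# One HTTP request for everything above.
spread.spread.batch_update({"requests": requests})
```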
Hi! A typical call would be: spread.df_to_sheet(upl_df, replace=True, sheet=gsheet_wks, index=False, headers=True, freeze_headers=True). The dataframes can range from (100, 100) to (100,000, 10) in size. I purge every sheet on every call (since there is no unique id one can use in gsheets), so I suspect this is the issue.
Sorry for the delayed response, I must not have seen the notification. There are a lot of requests going on when using all those options... I think the number of requests could be lowered by batching, but that's likely a pretty major refactor. Instead of each function making the necessary requests, we'd need to maintain some sort of queue of batch requests and send them all at once. Unfortunately, I don't have much time anymore to make large changes like these, but if you wanted to give it a go, it'd be greatly appreciated!
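A rough sketch of what that queue could look like, assuming each helper only builds raw Sheets API request dicts instead of sending them; RequestQueue and its methods are hypothetical names, not anything in the library:

```python
# Hypothetical sketch of the refactor described above: functions append raw
# Sheets API requests to a queue instead of sending them immediately, and a
# single flush() sends everything in one spreadsheets.batchUpdate call.
class RequestQueue:
    def __init__(self, spreadsheet):
        self._spreadsheet = spreadsheet  # a gspread Spreadsheet object
        self._requests = []

    def add(self, request):
        """Record a raw Sheets API request dict without sending it."""
        self._requests.append(request)

    def flush(self):
        """Send all queued requests in one batchUpdate round trip."""
        if not self._requests:
            return None
        body = {"requests": self._requests}
        self._requests = []
        return self._spreadsheet.batch_update(body)


# Usage sketch: queue the frozen-header request that df_to_sheet would
# otherwise send on its own, then flush once at the end.
queue = RequestQueue(spread.spread)
queue.add({
    "updateSheetProperties": {
        "properties": {"sheetId": 0, "gridProperties": {"frozenRowCount": 1}},
        "fields": "gridProperties.frozenRowCount",
    }
})
queue.flush()
```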
No problem! Thanks for the response. That sounds like a valid approach. I will see what I can do. Thanks again for an awesome library though!
Hi!
We have quite a few pandas DataFrames with a high number of cells that we update daily. These are then uploaded to Google Sheets via df_to_sheet, but we almost always hit Google's RATE_LIMIT_EXCEEDED error. I suspect we are hitting the per-user, per-minute rate limit defined here - https://developers.google.com/sheets/api/limits.
I suspect this is because df_to_sheet hits the Sheets API hard - is there any way to "spread" the requests? I know about the exponential backoff algorithm, but it is not ideal in our situation.
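For reference, a minimal sketch of the exponential backoff approach mentioned above, assuming rate-limit errors surface as gspread's APIError with an HTTP 429 status (gspread-pandas uses gspread underneath):

```python
import random
import time

from gspread.exceptions import APIError


def call_with_backoff(func, *args, max_retries=5, **kwargs):
    """Retry a Sheets call with exponential backoff on 429 responses.

    Sketch only: assumes rate-limit errors are raised as gspread's
    APIError carrying the HTTP response.
    """
    for attempt in range(max_retries):
        try:
            return func(*args, **kwargs)
        except APIError as err:
            if err.response.status_code != 429 or attempt == max_retries - 1:
                raise
            # Wait 2^attempt seconds plus jitter before retrying.
            time.sleep(2 ** attempt + random.random())


# Usage: wrap the df_to_sheet call from the thread above.
call_with_backoff(
    spread.df_to_sheet, upl_df,
    replace=True, sheet=gsheet_wks, index=False,
    headers=True, freeze_headers=True,
)
```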