ChunkedEncodingError #393
Getting the same error daily:

```
[04/23 17:56:21] ERROR | Traceback (most recent call last): During handling of the above exception, another exception occurred: Traceback (most recent call last):
[04/23 17:56:22] INFO | Crash saved as crashes/3.2.12_2024-04-23-17-56-21.zip
[04/23 17:56:23] INFO | List of running apps: com.google.android.as, com.breel.wallpapers, com.qualcomm.qti.telephonyservice, com.google.android.apps.photos, com.google.android.cellbroadcastreceiver, com.google.vr.vrcore, com.google.android.gms, com.android.vending, com.google.android.euicc, com.google.android.apps.wellbeing, com.google.android.providers.media.module, com.instagram.android, com.google.android.apps.nexuslauncher, com.android.nfc, com.google.android.ext.services, com.qualcomm.qcrilmsgtunnel, com.android.systemui, com.android.se, com.android.phone, com.android.externalstorage, com.google.SSRestartDetector, com.github.uiautomator.
```
```
Traceback (most recent call last):
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\requests\models.py", line 816, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\urllib3\response.py", line 1040, in stream
    yield from self.read_chunked(amt, decode_content=decode_content)
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\urllib3\response.py", line 1184, in read_chunked
    self._update_chunk_length()
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\urllib3\response.py", line 1119, in _update_chunk_length
    raise ProtocolError("Response ended prematurely") from None
urllib3.exceptions.ProtocolError: Response ended prematurely

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\umpalumpa\Desktop\bot\bot-master\run.py", line 3, in <module>
    GramAddict.run()
  File "C:\Users\umpalumpa\Desktop\bot\bot-master\GramAddict\__init__.py", line 10, in run
    start_bot(**kwargs)
  File "C:\Users\umpalumpa\Desktop\bot\bot-master\GramAddict\core\bot_flow.py", line 125, in start_bot
    get_device_info(device)
  File "C:\Users\umpalumpa\Desktop\bot\bot-master\GramAddict\core\device_facade.py", line 30, in get_device_info
    f"Phone Name: {device.get_info()['productName']}, SDK Version: {device.get_info()['sdkInt']}"
  File "C:\Users\umpalumpa\Desktop\bot\bot-master\GramAddict\core\device_facade.py", line 317, in get_info
    return self.deviceV2.info
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\uiautomator2\__init__.py", line 448, in info
    return self.jsonrpc.deviceInfo(http_timeout=10)
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\uiautomator2\__init__.py", line 479, in __call__
    return self.server.jsonrpc_retry_call(self.method, params,
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\uiautomator2\__init__.py", line 486, in jsonrpc_retry_call
    return self.jsonrpc_call(*args, **kwargs)
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\uiautomator2\__init__.py", line 513, in jsonrpc_call
    res = self.http.post("/jsonrpc/0",
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\requests\sessions.py", line 637, in post
    return self.request("POST", url, data=data, json=json, **kwargs)
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\uiautomator2\__init__.py", line 203, in request
    return super().request(method, url, **kwargs)
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\uiautomator2\__init__.py", line 119, in request
    resp = super(TimeoutRequestsSession,
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\requests\sessions.py", line 747, in send
    r.content
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\requests\models.py", line 899, in content
    self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
  File "C:\Users\umpalumpa\Desktop\bot\lib\site-packages\requests\models.py", line 818, in generate
    raise ChunkedEncodingError(e)
```
Hello,
I get this error with GramAddict v3.2.12. How can I solve it?
Thank you
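For what it's worth, the `ProtocolError("Response ended prematurely")` at the bottom of the chain means the on-device uiautomator server closed the chunked HTTP response mid-stream, which is often a transient condition. One possible workaround, until the root cause is fixed, is to retry the failing call a few times. A minimal sketch of such a retry helper (the function name and parameters are my own invention, not part of GramAddict or uiautomator2; for this bot you would pass `exceptions=(requests.exceptions.ChunkedEncodingError, ConnectionError)` and wrap the `device.get_info()` call):

```python
import time

def call_with_retry(fn, retries=3, backoff=1.0, exceptions=(ConnectionError,)):
    """Call fn(); if it raises one of `exceptions`, wait and retry.

    Hypothetical helper for illustration only. Waits 0s before the
    first retry, then backoff, then 2*backoff, and re-raises the last
    exception once all attempts are exhausted.
    """
    last_exc = None
    for attempt in range(retries):
        try:
            return fn()
        except exceptions as exc:
            last_exc = exc
            time.sleep(backoff * attempt)
    raise last_exc
```

This does not fix the underlying connection drop, but it may let a daily run survive an occasional premature response instead of crashing.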