JSON Decode Error #267
Comments
When will it be fixed?
Same issue here.
I have the same problem; when will this be fixed?
Same issue here. Any update?
Also experiencing this.
Hi all, I solved the JSON parsing issue in get_remote_llms (in hugchat.py) by implementing chunk-by-chunk parsing.
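A minimal sketch of that kind of chunk-by-chunk parsing, assuming the endpoint returns one JSON object per line; the URL and the "nodes" key used below are assumptions for illustration, not the library's confirmed internals:

```python
import json

import requests


def parse_json_chunks(text: str) -> list:
    """Decode a response body that may contain several JSON objects,
    one per line, skipping any line that is not valid standalone JSON."""
    chunks = []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        try:
            chunks.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # ignore chunks that do not parse on their own
    return chunks


# Example: fetch the models data and keep only the chunk carrying the
# "nodes" key mentioned later in this thread. The URL is a placeholder.
r = requests.get("https://huggingface.co/chat/models/__data.json")
model_chunk = next(
    (c for c in parse_json_chunks(r.text) if isinstance(c, dict) and "nodes" in c),
    None,
)
```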
How did you do it?
@Aravind4525 AI-based solution: Tested this and it works 🎉 |
Facing the same error. Says "Extra data: line 2 column 1 (char 11877)". |
Hello, the bug has been fixed by @manuthecoder; please update to v0.4.12 for the solution. Thanks @manuthecoder ❤
Upgrading to v0.4.12 didn't work for me. There are three lines in the r.text response. When get_remote_llms loops through "lines", the first line works, but the 2nd line throws an exception because there is no "nodes" element in that chunk. Any ideas on how to resolve this?
It seems that there have been some updates where some calls now return multiple non-uniform JSON objects in a single call, across multiple lines/chunks. This is causing the JSON decode errors. So far, I've seen that both get_remote_llms() and get_remote_conversations() are affected. For get_remote_conversations(), the conversation information is now passed in the second line/chunk. If anyone has the time to fork, validate, and update, here is the updated function:
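The function originally attached to that comment is not reproduced here; the sketch below only illustrates the described approach of reading the conversation list from the second JSON chunk. The URL, the use of a requests session, and the chunk index are assumptions based on the comment above, not the library's confirmed internals:

```python
import json

import requests


def get_remote_conversations_sketch(session: requests.Session) -> dict:
    """Illustrative only: fetch the chat index data and return the second
    JSON object in the response, where the conversation info was reported
    to live at this point in the thread."""
    r = session.get("https://huggingface.co/chat/__data.json")
    chunks = [json.loads(line) for line in r.text.splitlines() if line.strip()]
    if len(chunks) < 2:
        raise ValueError("expected at least two JSON objects in the response")
    # How the conversation ids/titles are laid out inside this chunk depends
    # on the (SvelteKit-style) __data.json schema and is not shown here.
    return chunks[1]
```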
I tried it, and now "/switch all" gives me "# Error: 'id:"; running just "/switch" works, though.
Is there any ETA for the solution? I've tried several solutions published here, but none of them work for me. |
Found the problem, it will be fixed in 1 day. |
the problem is UPDATE: |
v0.4.15 published and truly fixed the problem. |
IT DOESN'T WORK
@Soulter thank you for your work!
@Soulter Thanks for your work on this. In the case of get_remote_conversations(), the conversation data has been moved to the 2nd JSON object, not the 1st. I provided a quick workaround above. Hope this helps.
Thanks |
v0.4.16 published and fixed the problem. |
Still "/switch all" gives me "# Error: 'id:". |
The id problem persists, folks: "id=data['data'][conversation_data["id"]]" raises "KeyError: 'id'".
The issue is caused by another server-side change where HF have re-integrated the conversation data into the main chunk; the 2nd chunk, where they had moved it temporarily, is now empty. @Soulter it looks like the conversation data has been put back in its original location in the main JSON object. There are still multiple JSON objects in __data.json, but the conversation data is now in the 1st instead of the 2nd, so your previous code for extracting the conversations works. Here is an updated get_remote_conversations():
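Again, the commenter's actual function is not preserved here. One defensive sketch, given that the payload has now moved between the 1st and 2nd chunks more than once, is to decode every chunk and pick whichever one actually contains the conversation data; the "conversations" marker below is an assumption for illustration:

```python
import json


def find_conversation_chunk(response_text: str):
    """Return the first JSON chunk whose serialized form mentions the
    conversation payload, regardless of which line it arrives on."""
    for line in response_text.splitlines():
        line = line.strip()
        if not line:
            continue
        try:
            chunk = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip lines that are not standalone JSON
        if isinstance(chunk, dict) and "conversations" in json.dumps(chunk):
            return chunk
    return None
```

Scanning every chunk rather than hard-coding an index means the same lookup keeps working if the server moves the payload between lines again.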
Thanks @helqasem, it looks like it is working now!
```
Traceback (most recent call last):
  File "C:\Users\Admin\PycharmProjects\New_PiCourseSearch\venv\Lib\site-packages\requests\models.py", line 971, in json
    return complexjson.loads(self.text, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python311\Lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python311\Lib\json\decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 11956)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Admin\PycharmProjects\New_PiCourseSearch\venv\Lib\site-packages\requests\models.py", line 975, in json
    raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Extra data: line 2 column 1 (char 11956)
```
On my local machine, I am getting this error at this command: chatbot = hugchat.ChatBot(cookies=cookies.get_dict())
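For context, this is roughly where that constructor sits in a typical hugchat setup (a sketch based on the library's documented usage; the credentials are placeholders and exact signatures may differ between versions). The JSONDecodeError in the traceback above surfaces inside the constructor while it fetches and parses the remote model/conversation data:

```python
from hugchat import hugchat
from hugchat.login import Login

# Placeholders; use real Hugging Face credentials.
sign = Login("your-email@example.com", "your-password")
cookies = sign.login()

# The JSONDecodeError is raised here, because the constructor fetches
# the remote data that triggered this issue.
chatbot = hugchat.ChatBot(cookies=cookies.get_dict())
```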