
UnicodeEncodeError: 'ascii' codec can't encode character '\u201d' in position 7: ordinal not in range(128) #159

Open
OfficerChul opened this issue Dec 19, 2024 · 4 comments

Comments

@OfficerChul
I keep getting the error shown in the title.

I am using Python 3.13 on an ARM64 Mac with Anaconda.

@JUSTSUJAY
Can you add some context to your issue?

@OfficerChul
Author

Sure.

I am trying to use gpt-4o-mini through aisuite. I am sending an image to GPT to generate a text explanation of it, and the call fails with the error in the title.

I have included the full error details below. Please let me know if there is anything else I can add.

Error in function 'get_ai_generated_alt_text': 'ascii' codec can't encode character '\u201d' in position 7: ordinal not in range(128)
Full traceback:
Traceback (most recent call last):
  File "/Users/kyochul_jang/Desktop/Project/AltAuthor/backend/app/llm/client.py", line 182, in get_ai_generated_alt_text
    image_type, ai_generated_alt_text = await loop.run_in_executor(
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/Desktop/Project/AltAuthor/backend/app/llm/client.py", line 183, in <lambda>
    None, lambda: make_request(image_url, alt_text, context)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/Desktop/Project/AltAuthor/backend/app/llm/client.py", line 136, in make_request
    response = client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/site-packages/aisuite/client.py", line 117, in create
    return provider.chat_completions_create(model_name, messages, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/site-packages/aisuite/providers/openai_provider.py", line 29, in chat_completions_create
    return self.client.chat.completions.create(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 829, in create
    return self._post(
           ^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/site-packages/openai/_base_client.py", line 1280, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/site-packages/openai/_base_client.py", line 957, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/site-packages/openai/_base_client.py", line 983, in _request
    request = self._build_request(options, retries_taken=retries_taken)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/site-packages/openai/_base_client.py", line 466, in _build_request
    headers = self._build_headers(options, retries_taken=retries_taken)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/site-packages/openai/_base_client.py", line 417, in _build_headers
    headers = httpx.Headers(headers_dict)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/site-packages/httpx/_models.py", line 156, in __init__
    bytes_value = _normalize_header_value(v, encoding)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/kyochul_jang/anaconda3/envs/hcclab/lib/python3.11/site-packages/httpx/_models.py", line 82, in _normalize_header_value
    return value.encode(encoding or "ascii")
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
UnicodeEncodeError: 'ascii' codec can't encode character '\u201d' in position 7: ordinal not in range(128)
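The traceback ends inside `httpx`, which encodes header values as ASCII by default, so any non-ASCII character that ends up in the key (such as a curly quote pasted from the shell) breaks `_normalize_header_value`. A quick way to check for this — a generic diagnostic sketch, not part of aisuite — is to scan the environment variable:

```python
import os

# Hypothetical diagnostic: list any non-ASCII characters in the key
# before the client tries to build the request headers from it.
key = os.environ.get("OPENAI_API_KEY", "")
non_ascii = [(i, ch, f"U+{ord(ch):04X}") for i, ch in enumerate(key) if ord(ch) > 127]

if non_ascii:
    print("Non-ASCII characters in OPENAI_API_KEY:", non_ascii)
else:
    print("OPENAI_API_KEY contains only ASCII characters.")
```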

@OfficerChul
Author

I figured it out. When I exported the API key in the terminal, I wrapped it in curly (typographic) double quotation marks, and those characters caused the error. Removing them fixed it.
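For reference, this also explains why the error reports `position 7`. The traceback shows `httpx` encoding a header value as ASCII; assuming the failing value is the `Authorization` header, the prefix `"Bearer "` is seven characters long, so a curly quote at the start of the key lands exactly at index 7. A minimal sketch of that encoding step:

```python
# Sketch of what httpx does internally when normalizing header values
# (httpx/_models.py encodes each value with .encode("ascii")).
bad_key = "\u201dsk-test\u201d"     # key exported with curly quotes in the shell
header_value = f"Bearer {bad_key}"  # "Bearer " occupies indices 0-6

try:
    header_value.encode("ascii")    # the step that fails in the traceback
except UnicodeEncodeError as exc:
    # The curly quote sits at index 7, matching the reported position.
    print(exc)
```

The fix is to re-export the key without typographic quotes, e.g. `export OPENAI_API_KEY=sk-...` or with straight ASCII quotes `export OPENAI_API_KEY="sk-..."`.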

@reproduce-bot

The following script was generated by an AI Agent to help reproduce the issue:

# aisuite/reproduce.py
import os
from unittest.mock import patch, MagicMock
from aisuite.providers.huggingface_provider import HuggingfaceProvider

def test_unicode_handling():
    try:
        # Set the environment variable for the token
        os.environ["HF_TOKEN"] = "test-token"

        # Initialize the provider
        provider = HuggingfaceProvider()

        # Create a message with a Unicode character
        user_greeting = "Hello, how are you doing today? \u201d Unicode test: \u2603 \u2764 \u20AC"
        message_history = [{"role": "user", "content": user_greeting}]
        selected_model = "model-name"

        # Mock the httpx.post request to avoid actual network call
        mock_response = MagicMock()
        mock_response.json.return_value = {
            "choices": [{"message": {"content": "response-text"}}]
        }

        with patch("httpx.post", return_value=mock_response):
            # Force the encoding process to catch UnicodeEncodeError
            for message in message_history:
                message["content"].encode('ascii')

            response = provider.chat_completions_create(model=selected_model, messages=message_history)

            # Check if the response is handled correctly
            assert response.choices[0].message.content is not None
            assert False, "Expected UnicodeEncodeError but did not get it."
    except UnicodeEncodeError as e:
        raise AssertionError(e)
    except Exception as e:
        raise AssertionError(f"An unexpected error occurred: {e}")

if __name__ == "__main__":
    test_unicode_handling()

How to run:

python3 aisuite/reproduce.py

Expected Result:

Traceback (most recent call last):
  File "aisuite/reproduce.py", line 27, in test_unicode_handling
    message["content"].encode('ascii')
UnicodeEncodeError: 'ascii' codec can't encode character '\u201d' in position 32: ordinal not in range(128)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "aisuite/reproduce.py", line 40, in <module>
    test_unicode_handling()
  File "aisuite/reproduce.py", line 35, in test_unicode_handling
    raise AssertionError(e)
AssertionError: 'ascii' codec can't encode character '\u201d' in position 32: ordinal not in range(128)

Thank you for your valuable contribution to this project and we appreciate your feedback! Please respond with an emoji if you find this script helpful. Feel free to comment below if any improvements are needed.

Best regards from an AI Agent!
