
[Bug]: "The model produces invalid content" #154

Open
kevin-support-bot bot opened this issue Dec 28, 2024 · 35 comments

@kevin-support-bot

All-Hands-AI#5876 Issue


@avi12 Could you isolate these params and replicate the issue?

@avi12

avi12 commented Dec 28, 2024

I'm sorry, I didn't understand. What Docker command should I use?

@SmartManoj
Owner

Using make run.

@avi12

avi12 commented Dec 28, 2024

So do I just run

git pull
make run

@SmartManoj
Owner

  1. git pull

  2. make build (first time only)

  3. make run

@avi12

avi12 commented Dec 28, 2024

the make build resulted in

Building project...
Checking dependencies...
Checking system...
Linux detected.
Checking Python installation...
Python 3.12.8 is already installed.
Checking npm installation...
npm 10.7.0 is already installed.
Checking Node.js installation...
Node.js 20.14.0 is already installed.
Checking Docker installation...
Docker version 27.4.0, build bde2b89 is already installed.
Checking Poetry installation...
Poetry (version 1.8.5) is already installed.
Dependencies checked successfully.
Installing Python dependencies...
Defaulting TZ (timezone) to UTC
Using virtualenv: /home/avi12/.cache/pypoetry/virtualenvs/openhands-ai-d2fGaaAX-py3.12
Installing dependencies from lock file

Package operations: 1 install, 0 updates, 0 removals

  - Installing pylcs (0.1.1)

  ChefBuildError

  Backend subprocess exited when trying to invoke build_wheel
  
  running bdist_wheel
  running build
  running build_py
  creating build/lib.linux-x86_64-cpython-312/pylcs
  copying pylcs/__init__.py -> build/lib.linux-x86_64-cpython-312/pylcs
  running build_ext
  creating tmp
  x86_64-linux-gnu-g++ -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -I/tmp/tmpuz1kiayy/.venv/include -I/usr/include/python3.12 -c /tmp/tmp56fhgtq5.cpp -o tmp/tmp56fhgtq5.o -std=c++14
  x86_64-linux-gnu-g++ -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -I/tmp/tmpuz1kiayy/.venv/include -I/usr/include/python3.12 -c /tmp/tmp1lbnsoaa.cpp -o tmp/tmp1lbnsoaa.o -std=c++11
  Traceback (most recent call last):
    File "/home/avi12/.local/share/pypoetry/venv/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 389, in <module>
      main()
    File "/home/avi12/.local/share/pypoetry/venv/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 373, in main
      json_out["return_val"] = hook(**hook_input["kwargs"])
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/home/avi12/.local/share/pypoetry/venv/lib/python3.10/site-packages/pyproject_hooks/_in_process/_in_process.py", line 280, in build_wheel
      return _build_backend().build_wheel(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 435, in build_wheel
      return _build(['bdist_wheel'])
             ^^^^^^^^^^^^^^^^^^^^^^^
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 426, in _build
      return self._build_with_temp_dir(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 407, in _build_with_temp_dir
      self.run_setup()
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 320, in run_setup
      exec(code, locals())
    File "<string>", line 91, in <module>
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/__init__.py", line 117, in setup
      return distutils.core.setup(**attrs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/_distutils/core.py", line 183, in setup
      return run_commands(dist)
             ^^^^^^^^^^^^^^^^^^
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/_distutils/core.py", line 199, in run_commands
      dist.run_commands()
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 954, in run_commands
      self.run_command(cmd)
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/dist.py", line 995, in run_command
      super().run_command(command)
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 973, in run_command
      cmd_obj.run()
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/command/bdist_wheel.py", line 381, in run
      self.run_command("build")
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 316, in run_command
      self.distribution.run_command(command)
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/dist.py", line 995, in run_command
      super().run_command(command)
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 973, in run_command
      cmd_obj.run()
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/_distutils/command/build.py", line 135, in run
      self.run_command(cmd_name)
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/_distutils/cmd.py", line 316, in run_command
      self.distribution.run_command(command)
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/dist.py", line 995, in run_command
      super().run_command(command)
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/_distutils/dist.py", line 973, in run_command
      cmd_obj.run()
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/command/build_ext.py", line 99, in run
      _build_ext.run(self)
    File "/tmp/tmpuz1kiayy/.venv/lib/python3.12/site-packages/setuptools/_distutils/command/build_ext.py", line 359, in run
      self.build_extensions()
    File "<string>", line 79, in build_extensions
    File "<string>", line 60, in cpp_flag
  RuntimeError: Unsupported compiler -- at least C++11 support is needed!
  

  at ~/.local/share/pypoetry/venv/lib/python3.10/site-packages/poetry/installation/chef.py:164 in _prepare
      160│ 
      161│                 error = ChefBuildError("\n\n".join(message_parts))
      162│ 
      163│             if error is not None:
    → 164│                 raise error from None
      165│ 
      166│             return path
      167│ 
      168│     def _prepare_sdist(self, archive: Path, destination: Path | None = None) -> Path:

Note: This error originates from the build backend, and is likely not a problem with poetry but with pylcs (0.1.1) not supporting PEP 517 builds. You can verify this by running 'pip wheel --no-cache-dir --use-pep517 "pylcs (==0.1.1)"'.
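The RuntimeError above means the build backend could not find a C++ compiler with at least C++11 support, which is what the `build-essential` fix suggested later in the thread addresses on Debian/Ubuntu. As a minimal pre-check (hypothetical helper, standard library only; it checks PATH, not the compiler's flag support):

```python
import shutil


def has_cxx_compiler() -> bool:
    """Return True if a common C++ compiler is found on PATH.

    The pylcs build needs a g++/clang++ able to compile C++11;
    this sketch only verifies a compiler binary exists.
    """
    return any(shutil.which(c) for c in ("g++", "clang++", "c++"))


if __name__ == "__main__":
    if has_cxx_compiler():
        print("C++ compiler found")
    else:
        print("No C++ compiler found; on Debian/Ubuntu try: sudo apt install build-essential")
```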

@SmartManoj
Owner

SmartManoj commented Dec 28, 2024

sudo apt install build-essential -y - Source

@avi12

avi12 commented Dec 28, 2024

Alright, make build and make run worked, thanks!

@avi12

avi12 commented Dec 28, 2024

I have no idea what just happened

2024-12-28.12-57-25.msedge.mp4

@SmartManoj
Owner

Is there any error in the terminal?

@avi12

avi12 commented Dec 28, 2024

image

@SmartManoj
Owner

SmartManoj commented Dec 28, 2024

Run sudo kill -9 $(sudo lsof -t -i:3000) to kill the process running on port 3000.

And could you check in localhost:3000?
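Before killing anything, you can check whether something is actually listening on the port. A small sketch (hypothetical `port_in_use` helper, standard library only; the `lsof`/`kill` command above remains the actual fix):

```python
import socket


def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something accepts TCP connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 on success, an errno otherwise
        return s.connect_ex((host, port)) == 0


if __name__ == "__main__":
    print("port 3000 in use:", port_in_use(3000))
```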

@avi12

avi12 commented Dec 28, 2024

The only instance using port 3000 is OH running via the Docker command
Plus, the OH instance built from source is running at 127.0.0.1:3001 without issues

@SmartManoj
Owner

SmartManoj commented Dec 28, 2024

Could you change the BACKEND_PORT to 3002 here and check http://localhost:3002?
3001 is the frontend server for development, which may be slower than the built version.

@avi12

avi12 commented Dec 28, 2024

I terminated the Docker container and instead started the instance from source on port 3000, though now I'm running into a different issue: the LLM's context is getting lost
image

Note how in the prompt I specifically pointed at /firebase/firebase.json, but the Python command tried to access /workspace/firebase.json

@SmartManoj
Owner

SmartManoj commented Dec 28, 2024

All files will be in the /workspace folder. Isn't the correct path /workspace/firebase/firebase.json?

@avi12

avi12 commented Dec 28, 2024

The path is /mnt/c/repositories/extensions/youtube-time-manager/firebase/firebase.json

@avi12

avi12 commented Dec 28, 2024

The config is

[core]
workspace_base="./workspace"
workspace_mount_path="/mnt/c/repositories/extensions/youtube-time-manager"
debug=true

[llm]
model="gpt-4o"
api_key="KEY"
base_url="https://api.openai.com/v1"
embedding_model="openai"

what did I configure incorrectly?

@SmartManoj
Owner

workspace_base="/mnt/c/repositories/extensions/youtube-time-manager"

workspace_mount_path will be automatically set to workspace_base.

@SmartManoj
Owner

SmartManoj commented Dec 28, 2024

The path is /mnt/c/repositories/extensions/youtube-time-manager/firebase/firebase.json

this is the local path.

In the frontend terminal, could you run ls /workspace/firebase/firebase.json?

@avi12

avi12 commented Dec 28, 2024

I see, thanks for pointing that out
At first I assumed the build from source operates similarly to Docker's virtualization
To my understanding, the top part of the config needs to be changed to

[core]
workspace_base="/mnt/c/repositories/extensions/youtube-time-manager"
debug=true

@SmartManoj
Owner

the top part of the config needs to be changed to

Yes.

@avi12

avi12 commented Dec 28, 2024

Now, despite having the API key configured, I get this issue

2024-12-28.13-50-50.msedge.mp4

Then, when the agent is initialized, it tries to send a request to the OpenAI server, but because the LLM key isn't defined in the local storage, I get an error

@SmartManoj
Owner

Could you set it like this in the config.toml?

[llm]
model = 'groq/gemma2-9b-it'
api_key = 'key'

@avi12

avi12 commented Dec 28, 2024

Do I also need to set base_url?

@SmartManoj
Owner

SmartManoj commented Dec 28, 2024

Nope. It will use the default value for the provider.
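Model strings like `groq/gemma2-9b-it` or `openai/gpt-4o` encode the provider as a prefix, which is why base_url can be left unset. A toy illustration of that parsing (hypothetical helper; litellm's real routing is more involved):

```python
def split_model_string(model: str, default_provider: str = "openai"):
    """Split a 'provider/model' string into (provider, model).

    Sketch of how a router could infer the provider from the
    model string when no base_url is given.
    """
    if "/" in model:
        provider, _, name = model.partition("/")
        return provider, name
    # No prefix: fall back to a default provider
    return default_provider, model


print(split_model_string("groq/gemma2-9b-it"))
```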

@avi12

avi12 commented Dec 28, 2024

Is this the right way to configure it?

[core]
workspace_base="/mnt/c/repositories/extensions/youtube-time-manager"
debug=true

[llm]
model="openai/gpt-4o"
api_key="KEY"

@SmartManoj
Owner

Yes.

@avi12

avi12 commented Dec 28, 2024

I tried emptying out the local storage but it seems like it's not taking the values from the config
image

@SmartManoj
Owner

SmartManoj commented Dec 28, 2024

You can simply click Save and run; the key will be taken from the config file. The UI won't be updated.

@avi12

avi12 commented Dec 28, 2024

When I clicked Save this modal went away, but when I sent the prompt, after a brief moment this was logged to the console:

14:12:33 - openhands:ERROR: agent_controller.py:217 - [Agent Controller 0f20d289ee864f29ae2bc5273c330c1b] Error while running the agent: litellm.AuthenticationError: AnthropicException - {"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"}}

@SmartManoj
Owner

Could you comment out these 2 lines and check?

@avi12

avi12 commented Dec 28, 2024

Seems to have worked

@avi12

avi12 commented Dec 28, 2024

Also, apparently in the config file I need to set the path on the local file system, but if I want to reference any file in the prompt, it has to be via the "virtual" path under /workspace
Then, when the agent executes a Python or Bash command that references files under /workspace, the corresponding files on the local file system are affected
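That mapping (prompt paths under /workspace, actual files under the mounted directory) can be sketched as follows (hypothetical `to_local_path` helper, using the mount path from this thread as an example):

```python
from pathlib import PurePosixPath


def to_local_path(workspace_path: str, mount_base: str) -> str:
    """Map an agent-visible /workspace path to the mounted local path."""
    rel = PurePosixPath(workspace_path).relative_to("/workspace")
    return str(PurePosixPath(mount_base) / rel)


print(to_local_path(
    "/workspace/firebase/firebase.json",
    "/mnt/c/repositories/extensions/youtube-time-manager",
))
```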

@SmartManoj
Owner

It's for ease of use.
You could mount the directory as read-only.

@avi12

avi12 commented Dec 28, 2024

Having the agent modify files directly is what I want, so that it's easier to keep track of what it's doing, compare diffs, and roll back if I need to
