Support "percentage" attribute (e.g. set_percentage for fan speed) in Home-LLM API #201

Open · tarocco opened this issue Aug 12, 2024 · 0 comments
Labels: enhancement (New feature or request)

Comments

tarocco commented Aug 12, 2024

I am having difficulty controlling the speed of my ceiling fan via the conversation agent. I make a request to change the fan speed and it fails. Checking the logs, I find

...
<|user|>
 Set the ceiling fan speed to 100%.<|endoftext|>
<|assistant|>

2024-08-12 12:34:30.550 DEBUG (MainThread) [custom_components.llama_conversation.agent] {'model': 'fixt/home-3b-v3:latest', 'created_at': '2024-08-12T19:34:30.460098Z', 'response': 'setting the fan to 100 mph.\n```homeassistant\n{"service": "fan.set_speed", "target_device": "fan.ceiling_fan", "speed": 100}\n```', 'done': True, 'done_reason': 'stop', 'total_duration': 39339809411, 'load_duration': 11184966, 'prompt_eval_count': 1508, 'prompt_eval_duration': 35013075000, 'eval_count': 44, 'eval_duration': 4312856000}
2024-08-12 12:34:30.550 DEBUG (MainThread) [custom_components.llama_conversation.agent] setting the fan to 100 mph.
```homeassistant
{"service": "fan.set_speed", "target_device": "fan.ceiling_fan", "speed": 100}
```

100 mph seems a bit fast, but so far so good.
But then, this error:

2024-08-12 12:34:30.552 INFO (MainThread) [custom_components.llama_conversation.agent] LLM produced an improperly formatted response: MultipleInvalid([Invalid('extra keys not allowed')])

Looking up this error message, I found that it occurs because the Home-LLM API rejects the "speed" key in the service call. The following code is responsible for the validation:

```python
if llm_api.api.id == HOME_LLM_API_ID:
    schema_to_validate = vol.Schema({
        vol.Required('service'): str,
        vol.Required('target_device'): str,
        vol.Optional('rgb_color'): str,
        vol.Optional('brightness'): float,
        vol.Optional('temperature'): float,
        vol.Optional('humidity'): float,
        vol.Optional('fan_mode'): str,
        vol.Optional('hvac_mode'): str,
        vol.Optional('preset_mode'): str,
        vol.Optional('duration'): str,
        vol.Optional('item'): str,
    })
```

I am not sure whether a "speed" parameter is supported by HA's LLM ToolInput, but it would be great to have a fan speed attribute, or perhaps an option to allow non-validated parsed tool arguments. I don't know how practical either option is or whether it would even work; I have not tried it.
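
Something along these lines might be enough on the integration side (just a sketch, untested; the exact coercion/range validation is my assumption):

```python
import voluptuous as vol

# Hypothetical addition (not the integration's current code): whitelist a
# "percentage" key for the fan speed, alongside the keys already allowed above.
# The Coerce/Range validation here is an assumption on my part.
schema_to_validate = schema_to_validate.extend({
    vol.Optional('percentage'): vol.All(vol.Coerce(float), vol.Range(min=0, max=100)),
})
```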

The way to set the fan speed is via the set_percentage service call/action. Is this a situation where a prompt should be used to guide the LLM to choose the appropriate action for the tool call?
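
For reference, I would expect "Set the ceiling fan speed to 100%." to end up as a tool call along these lines (untested, since the "percentage" key is currently rejected by the schema above):

```homeassistant
{"service": "fan.set_percentage", "target_device": "fan.ceiling_fan", "percentage": 100}
```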

I didn't have debug logging enabled for the times I did successfully change the fan speed via LLM, but I will try to reproduce it and compare the tool call.

Requests do work when they are phrased to match the HA tool-call attributes and the integration's action parameters. For instance, this works:

<|user|>
 Increase the percentage of the ceiling fan.<|endoftext|>
<|assistant|>

2024-08-12 14:09:31.798 DEBUG (MainThread) [custom_components.llama_conversation.agent] {'model': 'fixt/home-3b-v3:latest', 'created_at': '2024-08-12T21:09:31.644875Z', 'response': 'increasing the speed of ceiling fan for you.\n```homeassistant\n{"service": "fan.increase_speed", "target_device": "fan.ceiling_fan"}\n```', 'done': True, 'done_reason': 'stop', 'total_duration': 36468464136, 'load_duration': 8485781, 'prompt_eval_count': 1412, 'prompt_eval_duration': 32360429000, 'eval_count': 42, 'eval_duration': 4097062000}
2024-08-12 14:09:31.799 DEBUG (MainThread) [custom_components.llama_conversation.agent] increasing the speed of ceiling fan for you.
```homeassistant
{"service": "fan.increase_speed", "target_device": "fan.ceiling_fan"}
```
2024-08-12 14:09:31.800 INFO (MainThread) [custom_components.llama_conversation.agent] calling tool: {"service": "fan.increase_speed", "target_device": "fan.ceiling_fan"}

2024-08-12 14:09:31.803 DEBUG (MainThread) [custom_components.llama_conversation.agent] Tool response: {'result': 'success'}

However, most of my requests to set the percentage either fail or are aliased to increase/decrease "percentage" calls. For example, I cannot set the ceiling fan from off to medium speed without making two separate requests, and any mention of "speed" seems to end up as a failed tool call.

Is this something that could be improved in the code, the prompt, the model, or elsewhere?

Stack and settings:
- Raspberry Pi 4 Model B (8GB)
- Core 2024.7.0
- Supervisor 2024.08.0
- Operating System 12.4
- Frontend 20240703.0
- home-llm 0.3.3
- Ollama API (remote API: processing offloaded to another device on the network)
- fixt/home-3b-v3 q4_k_m (b682128e2534)
- LLM API: Home-LLM
- All other service settings use default values.

@tarocco tarocco added the enhancement New feature or request label Aug 12, 2024
@tarocco tarocco changed the title Support "speed" attribute (e.g. fan speed) in Home-LLM API Support "percentage" attribute (e.g. set_percentage for fan speed) in Home-LLM API Aug 12, 2024