
Improve In context learning examples #118

Open
tannisroot opened this issue Apr 19, 2024 · 4 comments
Labels
enhancement New feature or request

Comments

@tannisroot

Please do not request features for the model as an issue. You can refer to the pinned Discussion thread to make feature requests for the model/dataset.

Please describe what you are trying to do with the component
I'm trying to use a non-finetuned model (llama3) with this integration, and while it is shockingly smart, it fails on more complex calls it was not given examples for, like setting the temperature (an attribute) on a climate entity.
To make it understand such calls, I had to manually edit the system prompt to include a custom example:
{"to_say": "Setting the temperature.", "service": "climate.set_temperature", "target_device": "climate.example", "temperature": 22 }
This is not very convenient, and a new user won't know how to create a custom example like that, since they are not familiar with the example format used under the hood.
Another issue is that the integration will parse examples from the .csv that don't reflect the types of entities actually exposed. It makes no sense to provide examples for lock entities if a user has no lock entities.

Describe the solution you'd like
I think there are several improvements that could be made to the way ICL examples are handled:

  1. Provide examples of calls with attributes and add those to the .csv.
  2. Only use examples for entity types that are actually exposed to the assistant.
  3. Instead of just having an option to use a custom .csv file, maybe present some sort of table with all the examples parsed from the .csv, with modifiable values and an option to add custom ones in whatever response style the user prefers.
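Item 2 could be sketched roughly like this (a minimal sketch with hypothetical names, not the integration's actual API): filter the parsed .csv rows down to service domains that match at least one exposed entity.

```python
import csv
import io

def filter_examples(csv_text: str, exposed_entities: list[str]) -> list[dict]:
    """Keep only ICL examples whose service domain matches an exposed entity.

    csv_text is the raw examples CSV; exposed_entities are entity IDs like
    "climate.living_room". (Hypothetical helper for illustration only.)
    """
    exposed_domains = {eid.split(".", 1)[0] for eid in exposed_entities}
    rows = csv.DictReader(io.StringIO(csv_text))
    # Each row is expected to have a "service" column like "lock.lock".
    return [row for row in rows
            if row["service"].split(".", 1)[0] in exposed_domains]

examples_csv = """service,response
fan.turn_off,Switching off the fan.
lock.lock,Locking the door.
"""

# With no lock entities exposed, the lock example is dropped.
kept = filter_examples(examples_csv, ["fan.ceiling_fan", "light.kitchen"])
```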

Additional context
As a workaround I just removed {{ response_examples }} from the system prompt and give it my own examples, which unfortunately limits me to just a few static ones.

@tannisroot tannisroot added the enhancement New feature or request label Apr 19, 2024
@acon96
Owner

acon96 commented Apr 19, 2024

2 should already be happening in the develop branch; 1 isn't done because I was being lazy when writing the examples; and 3 would need some frontend work to make it show up in the UI properly.

Honestly a lot of settings haven't been implemented properly because dynamic settings forms just aren't a thing without going and writing a React component myself.

@Subcode

Subcode commented Apr 23, 2024

I have found that models perform much better when you don't explicitly tell them to follow a certain structure, e.g. "Respond to the following user instruction by responding in the same format as the following examples:".
That usually gets results that are not great.

The solution to this is prompting it as if it has already given answers in that format. An example of this is:

User instruction: Turn off the fan.
{"to_say": "Switching off the fan as requested.", "service": "fan.turn_off", "target_device": "fan.ceiling_fan"}

User instruction: Play some music.
{"to_say": "Starting media playback.", "service": "media_player.media_play", "target_device": "media_player.bedroom"}

User instruction: {Actual instruction here}.

Since LLMs really just complete text, the structure gets followed automatically and the results end up great.
So replacing the CSV with a text file that has this structure might improve quality on untrained models a lot.
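A completion-style few-shot prompt like the one above could be assembled like this (a small sketch; the example pairs, entity IDs, and final instruction are illustrative):

```python
import json

def build_prompt(examples: list[tuple[str, dict]], instruction: str) -> str:
    """Render few-shot pairs as if the model already answered them,
    then append the real instruction for the model to complete."""
    parts = []
    for user_text, call in examples:
        parts.append(f"User instruction: {user_text}\n{json.dumps(call)}")
    parts.append(f"User instruction: {instruction}\n")
    return "\n\n".join(parts)

examples = [
    ("Turn off the fan.",
     {"to_say": "Switching off the fan as requested.",
      "service": "fan.turn_off", "target_device": "fan.ceiling_fan"}),
    ("Play some music.",
     {"to_say": "Starting media playback.",
      "service": "media_player.media_play",
      "target_device": "media_player.bedroom"}),
]

prompt = build_prompt(examples, "Dim the bedroom lights.")
```

The model then "continues" the pattern by emitting the next JSON line itself.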

@bennizone

Hey, I tried to generate a custom ICL file and ran into a problem. The answers contain commas; how can I escape them?

@bwest2397

@bennizone The ICL sample file is a .CSV file, so you can wrap sample responses in quotation marks if they contain a comma. That is,

service,response
fan.turn_off,"Switching off the fan, as requested."
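If you generate the file programmatically, Python's csv module applies that quoting automatically (a small sketch; the row contents are just the example from above):

```python
import csv
import io

rows = [("fan.turn_off", "Switching off the fan, as requested.")]

buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL)
writer.writerow(["service", "response"])
writer.writerows(rows)

# The response containing a comma comes out quoted:
# fan.turn_off,"Switching off the fan, as requested."
output = buf.getvalue()
```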
