Anthropic support #222

Merged
merged 1 commit into master on Aug 9, 2024

Conversation

@dmugtasimov (Contributor) commented Aug 8, 2024

Implementation of #221

Both Anthropic and OpenAI are supported at the LLMClient level. Other prompts can be switched to Anthropic just by changing the prompt parameters in PromptLayer (not tested, but it should work).
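For context, below is a rough sketch of how a single client can dispatch to either provider based on the model name. This is an illustration of the approach, not the PR's actual LLMClient code; the `chat` helper is made up, while the two SDK calls are the official `openai` and `anthropic` Python APIs.

```python
# Illustrative sketch only, not the LLMClient implemented in this PR.
import anthropic
import openai


def chat(model: str, system: str, messages: list[dict]) -> str:
    """Route a chat completion to Anthropic or OpenAI based on the model name."""
    if model.startswith('claude'):
        response = anthropic.Anthropic().messages.create(
            model=model,
            system=system,        # Anthropic takes the system prompt separately
            max_tokens=1024,
            messages=messages,    # roles must alternate between user and assistant
        )
        return response.content[0].text
    response = openai.OpenAI().chat.completions.create(
        model=model,
        messages=[{'role': 'system', 'content': system}, *messages],
    )
    return response.choices[0].message.content
```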

@dmugtasimov mentioned this pull request Aug 8, 2024
README.md Outdated
@@ -7,7 +7,10 @@

# Initial Project Setup

1. Install Poetry
1. Install Python version 3.10.13 and make sure it is used in the next step and later during development
dmugtasimov (Contributor Author):

Other 3.10.x versions are also fine. The exact version is added for consistency

input_variables={'description': self.description},
tracked_user=self.user,
tags=['manual_contribution_assessment'],
**make_prompt_kwargs(settings.GITHUB_MANUAL_CONTRIBUTION_ASSESSMENT_PROMPT_NAME),
dmugtasimov (Contributor Author):

This helps with development and debugging: a developer can use a prompt with a particular tag by overriding the corresponding value in settings.dev.py.
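For reference, here is a minimal sketch of what a helper like make_prompt_kwargs could look like; the real implementation may differ, and the colon-separated label convention is an assumption used only for illustration.

```python
def make_prompt_kwargs(prompt_name_setting: str) -> dict:
    # Assumed convention: the setting may carry an optional PromptLayer label
    # after a colon, e.g. 'github-assessment:dev'. Overriding the setting in
    # settings.dev.py then pins a specific labeled prompt version while debugging.
    name, _, label = prompt_name_setting.partition(':')
    kwargs = {'prompt_name': name}
    if label:
        kwargs['label'] = label
    return kwargs
```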



def is_ia(author):
return author.id == settings.IA_DISCORD_USER_ID
dmugtasimov (Contributor Author):

This fixes the issue where anyone could impersonate IA just by having _ai in their username. It also means developers are no longer forced to run a bot with the _ai string in its name.
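In spirit, the change looks roughly like the sketch below; the "before" check is an assumption reconstructed from this comment, not the exact removed code.

```python
# Before (assumed): any user could match simply by picking a name containing '_ai'.
def is_ia_old(author):
    return '_ai' in author.name

# After: the Discord user id is unique, so it cannot be impersonated.
def is_ia(author):
    return author.id == settings.IA_DISCORD_USER_ID
```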

content = message.content

if (role := map_author_structured(message.author)) == prev_role:
# We need to merge messages to prevent the following error from Anthropic
dmugtasimov (Contributor Author):

Anthropic has its limitations. We may need similar handling in the other place where IA interacts if we decide to use Anthropic there as well.
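The Anthropic Messages API rejects histories in which two consecutive messages share the same role, so consecutive same-role messages have to be merged. Below is a minimal sketch of that merging, assuming a list-of-dicts message format; only map_author_structured comes from the diff, the rest is illustrative.

```python
def merge_consecutive_roles(raw_messages):
    merged = []
    for message in raw_messages:
        role = map_author_structured(message.author)
        if merged and merged[-1]['role'] == role:
            # Same role as the previous entry: append to its content instead of
            # emitting a new message, otherwise Anthropic returns an error.
            merged[-1]['content'] += '\n' + message.content
        else:
            merged.append({'role': role, 'content': message.content})
    return merged
```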

@@ -32,28 +75,25 @@ async def on_message_implementation(message):
await message.reply('Please, register at https://thenewboston.com')
return

# TODO(dmu) MEDIUM: Request message history just once and convert it to necessary format before LLM call
plain_text_message_history = await get_plain_text_message_history(message.channel)
messages = (await get_historical_messages(message.channel))[::-1]
dmugtasimov (Contributor Author):

Now we query historical messages just once
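The idea, sketched here with a hypothetical to_plain_text helper (not a name from the diff), is to hit the Discord API once and derive every needed representation from that single result:

```python
# Single query; [::-1] puts messages in chronological (oldest-first) order.
messages = (await get_historical_messages(message.channel))[::-1]

# Hypothetical conversion helper: build the plain-text view from the same
# result instead of querying the channel history a second time.
plain_text_message_history = to_plain_text(messages)
```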

@@ -20,6 +20,49 @@

bot = commands.Bot('/', intents=intents)

# TODO(dmu) HIGH: Cover bot logic with unittests: it is already complex enough
dmugtasimov (Contributor Author) commented Aug 8, 2024:

The time saved by not writing unittests is overrated.

from thenewboston.general.commands import CustomCommand
from thenewboston.general.utils.json import ResponseEncoder


class Command(CustomCommand):
dmugtasimov (Contributor Author):

This was very handy for developing and debugging the Anthropic integration.
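A rough sketch of such a debug management command is shown below; this is not the PR's code. The LLMClient import path, its interface, and the command arguments are assumptions; only CustomCommand and ResponseEncoder appear in the diff, and add_arguments/handle follow the standard Django command API.

```python
import json

from thenewboston.general.commands import CustomCommand
from thenewboston.general.utils.json import ResponseEncoder


class Command(CustomCommand):
    help = 'Run a prompt against the configured LLM and print the raw response (debug aid).'

    def add_arguments(self, parser):
        parser.add_argument('prompt_name')
        parser.add_argument('--var', action='append', default=[],
                            help='input variable as key=value; may be repeated')

    def handle(self, *args, **options):
        # Hypothetical import path and client interface, shown for illustration only.
        from thenewboston.general.clients.llm import LLMClient

        input_variables = dict(item.split('=', 1) for item in options['var'])
        response = LLMClient.get_instance().get_chat_completion(
            prompt_name=options['prompt_name'],
            input_variables=input_variables,
        )
        self.stdout.write(json.dumps(response, cls=ResponseEncoder, indent=2))
```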

@dmugtasimov merged commit 17a9873 into master Aug 9, 2024
1 check passed

sentry-io bot commented Aug 9, 2024

Suspect Issues

This pull request was deployed and Sentry observed the following issues:

  • ‼️ Exception: PromptLayer had the following error while getting your prompt template: tasks.sync_contributions View Issue

Did you find this useful? React with a 👍 or 👎
