
Added New LLM Cohere #81

Merged · 4 commits into aiplanethub:main · Oct 29, 2024

Conversation

@madhavi-peddireddy (Contributor)

Screenshot (133)
Screenshot (134)

src/beyondllm/llms/cohere.py (review thread resolved)
src/beyondllm/llms/cohere.py (outdated; review thread resolved)
@madhavi-peddireddy (Contributor Author)

Removed

@tarun-aiplanet (Member) commented Oct 28, 2024

Please add the model name inside the llms __init__.py file.

from typing import Any, Dict
from dataclasses import dataclass, field
import os
import cohere
Member review comment:

Import the cohere module inside load_llm (refer to the other LLMs for the pattern); the import needs to be under an exception block.
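The deferred-import pattern the reviewer is asking for can be sketched as follows. This is a minimal illustration, not the merged code: the class and field names are assumptions, and only the try/except import inside load_llm reflects the review comment.

```python
from dataclasses import dataclass, field
from typing import Any


@dataclass
class CohereModel:
    """Hypothetical sketch of the lazy-import pattern; not the merged code."""

    api_key: str = ""
    model_name: str = "command-r"
    client: Any = field(default=None, init=False)

    def load_llm(self) -> Any:
        # Import cohere inside load_llm, under an exception block, so the
        # package is only required when the model is actually loaded.
        try:
            import cohere
        except ImportError:
            raise ImportError(
                "cohere package is not installed. Install it with: pip install cohere"
            )
        self.client = cohere.Client(api_key=self.api_key)
        return self.client
```

Keeping the import inside the method means `from beyondllm.llms import ...` works even for users who never install the cohere extra.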

from typing import Any, Dict
from dataclasses import dataclass, field
import os
from together import Together
Member review comment:

Import the together module inside load_llm (refer to the other LLMs for the pattern); the import needs to be under an exception block.

def predict(self, prompt: Any) -> str:
    """Generate a response from the model based on the provided prompt."""
    try:
        stream = self.client.chat.completions.create(
Member review comment:

By default, it is ideal to keep stream as False; please change it as per the Cohere implementation you have done.
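A predict method with streaming off by default might look like the sketch below. The chat / chat_stream call names follow the Cohere Python SDK's style but are assumptions here, as is the injected-client constructor; this is an illustration of the review suggestion, not the merged implementation.

```python
from typing import Any


class CohereModel:
    """Hypothetical sketch of predict with streaming disabled by default."""

    def __init__(self, client: Any, model_name: str = "command-r"):
        self.client = client
        self.model_name = model_name

    def predict(self, prompt: Any, stream: bool = False) -> str:
        """Generate a response; stream defaults to False so callers get one string."""
        try:
            if stream:
                # Streaming path: concatenate text chunks as they arrive.
                chunks = self.client.chat_stream(
                    model=self.model_name, message=str(prompt)
                )
                return "".join(getattr(chunk, "text", "") for chunk in chunks)
            response = self.client.chat(model=self.model_name, message=str(prompt))
            return response.text
        except Exception as exc:
            # Re-raise with context instead of a bare generic exception.
            raise RuntimeError(f"Cohere prediction failed: {exc}") from exc
```

With stream=False as the default, callers who just want a string never deal with chunk iteration, while streaming stays available as an opt-in.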

Contributor Author reply:

Done

Contributor comment:

@madhavi-peddireddy
Congrats on being approved for the Hacktoberfest swags! Please share your details with me so we can send it over to you.

You can either drop the details in our Discord channel: bit.ly/aiplanet-discord or email me at [email protected].

@tarun-aiplanet added the ready-to-pull LGTM, approved (Changes accepted.), and hacktoberfest-accepted labels on Oct 29, 2024
@tarun-aiplanet merged commit ff04557 into aiplanethub:main on Oct 29, 2024
1 check failed
@aryan-aiplanet

@CodiumAI-Agent /review

@CodiumAI-Agent commented Dec 12, 2024

PR Reviewer Guide 🔍

(Review updated until commit 89c8062)

Here are some key observations to aid the review process:

⏱️ Estimated effort to review: 4 🔵🔵🔵🔵⚪
🧪 No relevant tests
🔒 Security concerns

Sensitive information exposure:
The api_key is stored as a class attribute and may be exposed if the object is serialized or logged. Consider implementing secure storage or obfuscation for sensitive data.

⚡ Recommended focus areas for review

Missing Dependency Handling
The load_llm method in CohereModel does not handle the case where the cohere module is not installed properly. Although a message is printed, the absence of a proper fallback or exception handling could lead to runtime issues.

Error Handling in Prediction
The predict method in CohereModel raises a generic exception when prediction fails. Consider using a more specific exception type or providing additional context to aid debugging.

Missing Dependency Handling
The load_llm method in TogetherModel does not handle the case where the together module is not installed properly. Although a message is printed, the absence of a proper fallback or exception handling could lead to runtime issues.

Error Handling in Prediction
The predict method in TogetherModel raises a generic exception when prediction fails. Consider using a more specific exception type or providing additional context to aid debugging.

Hardcoded System Message
The predict method in TogetherModel includes a hardcoded system message. Consider making this configurable to allow flexibility in the model's behavior.
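Making the system message configurable could look like the sketch below. The field names and default model string are assumptions for illustration; the message-list shape is the OpenAI-style format used by chat completion APIs such as Together's.

```python
from dataclasses import dataclass


@dataclass
class TogetherModel:
    """Hypothetical sketch: the system message is a configurable field
    instead of a string hardcoded inside predict."""

    model_name: str = "meta-llama/Llama-3-8b-chat-hf"
    system_prompt: str = "You are a helpful assistant."

    def build_messages(self, prompt: str) -> list:
        # OpenAI-style message list accepted by chat completions endpoints.
        return [
            {"role": "system", "content": self.system_prompt},
            {"role": "user", "content": prompt},
        ]
```

Callers can then override the behavior per instance, e.g. `TogetherModel(system_prompt="Answer in one word.")`, without touching predict.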

@aryan-aiplanet

@CodiumAI-Agent /review

@CodiumAI-Agent

Persistent review updated to latest commit 89c8062
