
support local LLM #16

Open · wants to merge 2 commits into master

Conversation

@akiba93 commented Dec 17, 2024

Adds support for calling a local open-source model to generate the dataset, given either a local model file path or a HuggingFace model id.
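The feature accepts either a local model file path or a HuggingFace model id; a minimal sketch of how that dispatch might be decided (the helper name and logic below are assumptions for illustration, not the PR's actual code):

```python
import os

def resolve_model_source(model_source: str) -> str:
    """Decide whether `model_source` is a local path or a HuggingFace id.
    Hypothetical helper illustrating the dispatch this PR describes."""
    if os.path.exists(model_source):
        return "local"        # load weights from the given file path
    return "huggingface"      # treat the string as a HuggingFace model id
```

In practice both cases can be passed straight to `transformers.AutoModelForCausalLM.from_pretrained`, which itself accepts either a directory path or a hub id.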

@SonglinLyu (Collaborator) left a comment

Please add the documentation for calling the local model, so it can be tested later.

@@ -87,15 +90,16 @@ def generalization(
         {"role": "user", "content": content},
     ]
     # 3. get response
-    responses = call_with_messages(massages)
+    responses = call_with_messages(massages, tokenizer, model, current_device)
     print(responses)
Collaborator

Are prints like this left over from testing? They can be removed.
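The hunk above threads `tokenizer`, `model`, and `current_device` through `call_with_messages`; a rough sketch of how such an extended signature could stay backward compatible (the defaults and fallback branch are assumptions, not the PR's implementation):

```python
def call_with_messages(messages, tokenizer=None, model=None, device=None):
    """Generate a response locally when a model is supplied, otherwise
    fall back to the original remote-API path (elided here)."""
    if model is not None:
        # Flatten the chat messages into a single prompt; a real
        # implementation would use the tokenizer's chat template.
        prompt = "\n".join(m["content"] for m in messages)
        return model(prompt)
    raise NotImplementedError("remote API path not shown in this sketch")
```

A stub model works for exercising the local branch, e.g. `call_with_messages(msgs, model=lambda p: p.upper())`.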

     # 3. get response
-    responses = call_with_messages(massages)
+    responses = call_with_messages(messages, tokenizer, model, current_device)
     print(responses)
Collaborator

Are prints like this left over from testing? They can be removed.

@@ -146,11 +153,11 @@ def call_with_messages(messages):
     )
     if response.status_code == HTTPStatus.OK:
         content = response.output.choices[0].message.content
-        # print(content)
+        #print(content)
Collaborator

This commented-out line can be deleted.

#for index, question in enumerate(questions):
# file.write(cyphers[index] + "\n")
# file.write(question + "\n")
for index,cypher in enumerate(cyphers):
Collaborator

Why did the logic change here?

Author

Because with codellama some of the returned cyphers are empty, indexing by question causes cyphers[index] to go out of bounds, so I changed the loop to iterate over cypher instead.
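The out-of-bounds issue described above can be reproduced with plain lists; iterating over `cyphers` (the possibly shorter list) instead of `questions` avoids the IndexError (the helper name and sample data are hypothetical):

```python
def pair_by_cypher(cyphers, questions):
    """Pair each generated cypher with its question. Iterating over
    `cyphers` means a short cypher list can never be over-indexed,
    which is the failure mode when some cyphers come back empty."""
    return [(cypher, questions[index]) for index, cypher in enumerate(cyphers)]
```

With three questions but only two generated cyphers, the old question-indexed loop would raise IndexError on `cyphers[2]`, while this version simply emits two pairs.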
