
Add magpie support llama cpp ollama #1084

Closed



@davidberenstein1957 (Member) commented Dec 19, 2024

I added extended magpie support for Ollama and Llamacpp.

  • Ollama support
  • Llamacpp support
  • minor refactors w.r.t. the import of the distilabel.models module

I looked into adding the OpenAI API format for other providers, but this does not work because tokenization is handled server-side and cannot be disabled out of the box.

Perhaps we can refactor the HF InferenceClient a bit to make this work.
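To make the mechanism concrete (my illustration, not code from this PR): a magpie pre-query template is just the chat-template prefix up to the start of a user turn, which the model then completes with a synthetic user query. The values below match the llama3 and qwen2 templates distilabel ships as far as I know, but treat them as an assumption:

# Illustration only: pre-query templates end exactly where a user message
# would begin, so a raw completion endpoint makes the model invent the
# user query itself. Chat-style endpoints re-apply the template server-side
# around your input, which destroys this prefix.
MAGPIE_PRE_QUERY_TEMPLATES = {
    "llama3": "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n",
    "qwen2": "<|im_start|>user\n",
}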

Llamacpp

from distilabel.models import LlamaCppLLM
from distilabel.steps.tasks import Magpie

llm = LlamaCppLLM(
    model_path="smollm2-360m-instruct-q8_0.gguf",  # local GGUF model file
    tokenizer_id="HuggingFaceTB/SmolLM2-360M-Instruct",  # tokenizer used to apply the chat template client-side
    magpie_pre_query_template="qwen2",  # SmolLM2 uses a ChatML-style template, matching the qwen2 prefix
)
magpie = Magpie(
    llm=llm,
)
magpie.load()

print(next(magpie.process(inputs=[{"system_prompt": "You are a helpful assistant."}])))
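As a usage note (my sketch, not from the diff): the task should also accept the existing Magpie options with this backend, e.g. n_turns to build whole conversations; the n_turns parameter and the conversation output column are assumptions carried over from the existing Magpie task:

# Sketch: with n_turns > 1 Magpie alternates generated user queries and
# assistant replies, emitting a "conversation" column (a list of chat
# messages) instead of separate "instruction"/"response" columns.
magpie_multi_turn = Magpie(llm=llm, n_turns=2)
magpie_multi_turn.load()
print(next(magpie_multi_turn.process(inputs=[{"system_prompt": "You are a helpful assistant."}])))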

Ollama

from distilabel.models import OllamaLLM
from distilabel.steps.tasks import Magpie

llm = OllamaLLM(
    model="llama3.1",  # model name as served by the local Ollama instance
    tokenizer_id="meta-llama/Meta-Llama-3-8B-Instruct",  # tokenizer used to apply the chat template client-side
    magpie_pre_query_template="llama3",
)
magpie = Magpie(llm=llm)
magpie.load()

print(next(magpie.process(inputs=[{"system_prompt": "You are a helpful assistant."}])))
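In both snippets, tokenizer_id is what makes the trick possible: the Hugging Face tokenizer's chat template is applied client-side and the magpie pre-query template is appended before the raw prompt reaches the backend. That is my reading of the change; the exact mechanics are in the diff.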


Documentation for this PR has been built. You can view it at: https://distilabel.argilla.io/pr-1084/


codspeed-hq bot commented Dec 19, 2024

CodSpeed Performance Report

Merging #1084 will not alter performance

Comparing feat/add/magpie-support-llama-cpp-ollama- (4e291e7) with develop (f1f7d77)

Summary

✅ 1 untouched benchmark

@davidberenstein1957 davidberenstein1957 marked this pull request as ready for review December 23, 2024 09:30
@davidberenstein1957 davidberenstein1957 requested review from gabrielmbmb and plaguss and removed request for gabrielmbmb December 23, 2024 09:31
@davidberenstein1957 davidberenstein1957 deleted the feat/add/magpie-support-llama-cpp-ollama- branch December 23, 2024 12:41