
Bump llama-cpp-python from 0.2.50 to 0.2.72 in /llm #144

Merged
1 commit merged into main on Jun 18, 2024

Conversation

dependabot[bot]
Contributor

@dependabot dependabot bot commented on behalf of github Jun 18, 2024

Bumps llama-cpp-python from 0.2.50 to 0.2.72.

Changelog

Sourced from llama-cpp-python's changelog.

[0.2.72]

  • fix(security): Remote Code Execution by Server-Side Template Injection in Model Metadata by @​retr0reg in b454f40a9a1787b2b5659cd2cb00819d983185df
  • fix(security): Update remaining jinja chat templates to use immutable sandbox by @​CISC in #1441

[0.2.71]

  • feat: Update llama.cpp to ggerganov/llama.cpp@911b390
  • fix: Make leading bos_token optional for image chat formats, fix nanollava system message by @​abetlen in 77122638b4153e31d9f277b3d905c2900b536632
  • fix: free last image embed in llava chat handler by @​abetlen in 3757328b703b2cd32dcbd5853271e3a8c8599fe7

[0.2.70]

  • feat: Update llama.cpp to ggerganov/llama.cpp@c0e6fbf
  • feat: fill-in-middle support by @​CISC in #1386
  • fix: adding missing args in create_completion for functionary chat handler by @​skalade in #1430
  • docs: update README.md by @eltociear in #1432
  • fix: chat_format log where auto-detected format prints None by @​balvisio in #1434
  • feat(server): Add support for setting root_path by @​abetlen in 0318702cdc860999ee70f277425edbbfe0e60419
  • feat(ci): Add docker checks and check deps more frequently by @​Smartappli in #1426
  • fix: detokenization case where first token does not start with a leading space by @​noamgat in #1375
  • feat: Implement streaming for Functionary v2 + Bug fixes by @​jeffrey-fong in #1419
  • fix: Use memmove to copy str_value kv_override by @​abetlen in 9f7a85571ae80d3b6ddbd3e1bae407b9f1e3448a
  • feat(server): Remove temperature bounds checks for server by @​abetlen in 0a454bebe67d12a446981eb16028c168ca5faa81
  • fix(server): Propagate flash_attn to model load by @​dthuerck in #1424

[0.2.69]

  • feat: Update llama.cpp to ggerganov/llama.cpp@6ecf318
  • feat: Add llama-3-vision-alpha chat format by @​abetlen in 31b1d95a6c19f5b615a3286069f181a415f872e8
  • fix: Change default verbose value of verbose in image chat format handlers to True to match Llama by @​abetlen in 4f01c452b6c738dc56eacac3758119b12c57ea94
  • fix: Suppress all logs when verbose=False, use hardcoded fileno's to work in colab notebooks by @​abetlen in f116175a5a7c84569c88cad231855c1e6e59ff6e
  • fix: UTF-8 handling with grammars by @​jsoma in #1415

[0.2.68]

[0.2.67]

  • fix: Ensure image renders before text in chat formats regardless of message content order by @​abetlen in 3489ef09d3775f4a87fb7114f619e8ba9cb6b656
  • fix(ci): Fix bug in use of upload-artifact failing to merge multiple artifacts into a single release by @​abetlen in d03f15bb73a1d520970357b702a9e7d4cc2a7a62

[0.2.66]

  • feat: Update llama.cpp to ggerganov/llama.cpp@8843a98
  • feat: Generic Chat Formats, Tool Calling, and Huggingface Pull Support for Multimodal Models (Obsidian, LLaVA1.6, Moondream) by @​abetlen in #1147
  • ci(fix): Workflow actions updates and fix arm64 wheels not included in release by @​Smartappli in #1392

... (truncated)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    You can disable automated security fix PRs for this repo from the Security Alerts page.

Bumps [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) from 0.2.50 to 0.2.72.
- [Release notes](https://github.com/abetlen/llama-cpp-python/releases)
- [Changelog](https://github.com/abetlen/llama-cpp-python/blob/main/CHANGELOG.md)
- [Commits](abetlen/llama-cpp-python@v0.2.50...v0.2.72)

---
updated-dependencies:
- dependency-name: llama-cpp-python
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
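The main motivation for this bump is the security fix landed in 0.2.72 (remote code execution via server-side template injection in model metadata). A minimal sketch of a version-floor guard an application could run at startup; the `parse` and `is_patched` helpers are illustrative and not part of this PR or of llama-cpp-python:

```python
# Floor check against the first release carrying the SSTI RCE fix (0.2.72).
# Handles plain dotted versions only (no pre-release or local tags).
SECURE_FLOOR = (0, 2, 72)

def parse(v: str) -> tuple:
    """Turn a dotted version string like '0.2.72' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def is_patched(installed: str) -> bool:
    """True when the given llama-cpp-python version includes the security fixes."""
    return parse(installed) >= SECURE_FLOOR

print(is_patched("0.2.50"))  # False: the version this PR replaces
print(is_patched("0.2.72"))  # True: the version this PR pins
```

In a real deployment you would obtain the installed version at runtime, e.g. via `importlib.metadata.version("llama_cpp_python")` (distribution name assumed), and refuse to serve requests when `is_patched` returns False.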
@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Jun 18, 2024
@pgronkievitz pgronkievitz requested a review from Kleczyk June 18, 2024 08:40
@Kleczyk Kleczyk (Contributor) left a comment

I have hope will be working xd

@Kleczyk Kleczyk merged commit 5e2f353 into main Jun 18, 2024
2 checks passed
@Kleczyk Kleczyk deleted the dependabot/pip/llm/llama-cpp-python-0.2.72 branch June 18, 2024 20:25
@pgronkievitz (Member)

> I have hope will be working xd

@Kleczyk you were supposed to check it :v

Labels
dependencies Pull requests that update a dependency file