Issues: BoundaryML/baml
#730: Baml Playground shows tests as still running when they have actually completed (opened Jun 28, 2024 by etbyrd)
#706: Offer Vertex AI as a provider [enhancement] (opened Jun 21, 2024 by hellovai)
#702: Show parsing state in the playground, e.g. when dropping fields [enhancement] (opened Jun 20, 2024 by hellovai)
#405: [Streaming] OpenAI token usage counts are missing for streamed responses (opened Feb 8, 2024 by aaronvg)
#383: [Playground] Canceling the test run shouldn't clear the tests that did run (opened Jan 29, 2024 by aaronvg)
#380: [Playground] Re-run failed tests, or run only a subset of selected tests (opened Jan 28, 2024 by aaronvg)
#379: Playground: be able to compare different prompts with different potential output types (opened Jan 28, 2024 by aaronvg)
#377: [Docs] Guide: how to hide generated baml_client code in GitHub diffs (opened Jan 27, 2024 by aaronvg)
#375: Add an easy, auto-completed way to build an example output in the prompt (opened Jan 27, 2024 by aaronvg)
#373: Having numbers in named args causes a compilation error if you inject them into the prompt (opened Jan 27, 2024 by aaronvg)
#372: [BAML] Suggest prompt strategies or prompt-engineering tips via the linter (opened Jan 26, 2024 by aaronvg)
#369: Add a "2 warnings, 1 error" summary at the end of a prompt as well, using a codelens (opened Jan 26, 2024 by aaronvg)