- Change keyword `USE` to `MODEL VARIANT`
- Allow to specify the exact model, e.g. `MODEL NAME gpt-4-1106-preview`
- Add postprocessing function `trimEndOfCodeBlock`
- `replaceParameters` works with inlined JSONs
- You are able to send markdown code blocks in prompts (without traces of escaping)
- Postprocessing function `trimEndOfCodeBlock` works only with markdown code blocks, not with escaped code blocks
- Rename `extractBlocksFromMarkdown` to `extractAllBlocksFromMarkdown`
- Add postprocessing function `trimCodeBlock`
- Add `EXPECT` command to promptbooks
- Add `ExecutionReport`
- Add `parseNumber` utility function
- `PtbkExecutor` returns a richer result and does not throw; it just returns `isSuccessful=false`. You can use the `assertsExecutionSuccessful` utility function to check whether the execution was successful
- Add `assertsExecutionSuccessful` utility function
- `CreatePtbkExecutorSettings` are not mandatory anymore
- Add `EXPECT JSON` command to promptbooks
- Split internal representation of `EXPECT` into `EXPECT_AMOUNT` and `EXPECT_FORMAT`
- Move logic from `promptbookStringToJson` to `createPtbkExecutor`
- Allow postprocessing and expectations in all execution types
- Postprocessing happens before checking expectations
- In `PromptbookJson`, postprocessing is represented internally in each `PromptTemplateJson`, not as a separate `PromptTemplateJson`
- Introduce `ExpectError`
- Rename `maxNaturalExecutionAttempts` to `maxExecutionAttempts` (because now it is not just for natural execution)
- If the title in a promptbook contains emojis, pass it into the report
- Fix `description` in report
- Ask the user for input repeatedly until the input matches the expectations
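The non-throwing executor contract described above can be sketched as follows. The `ExecutorResult` shape and the error message here are illustrative assumptions, not the library's actual types:

```typescript
// Hypothetical sketch of the non-throwing executor contract and the
// `assertsExecutionSuccessful` helper; the exact shape in Promptbook may differ.
type ExecutorResult = {
    isSuccessful: boolean;
    errors: Array<Error>; // collected instead of thrown
    outputParameters: Record<string, string>;
};

function assertsExecutionSuccessful(result: ExecutorResult): void {
    if (!result.isSuccessful) {
        // Surface the first collected error, or a generic one
        throw result.errors[0] ?? new Error('Promptbook execution failed');
    }
}

// Usage: the executor itself never throws; the caller opts into throwing
const result: ExecutorResult = {
    isSuccessful: false,
    errors: [new Error('EXPECT failed: too few words')],
    outputParameters: {},
};

let message = '';
try {
    assertsExecutionSuccessful(result);
} catch (error) {
    message = (error as Error).message;
}
```

This keeps partial results and all collected errors inspectable even when a template fails.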
Better execution report in markdown format
- Add `JOKER {foo}` as a way to skip part of the promptbook
- Split `UserInterfaceToolsPromptDialogOptions.prompt` into `promptTitle` and `promptMessage`
- Add `UserInterfaceToolsPromptDialogOptions.priority`
- Add timing information to report
- Maximum must be higher than minimum in `EXPECT` statement
- Maximum of 0 is not valid; it should be at least 1 in `EXPECT` statement
- Allow to use custom postprocessing functions
- Allow async postprocessing functions
- Remove `Promptbook` (just using the JSON `PromptbookJson` format)
- `CreatePtbkExecutorOptions` has `PromptbookJson`
- Promptbooks are executed in parallel
- `PromptTemplateJson` contains `dependentParameterNames`
- `validatePromptbookJson` checks for circular dependencies
- Test that joker is one of the dependent parameters
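A circular-dependency check over `dependentParameterNames`, like the one `validatePromptbookJson` performs, can be sketched with a standard depth-first search. The template shape below is a simplified assumption:

```typescript
// Simplified sketch: each template produces `resultingParameterName` and
// depends on `dependentParameterNames`; a cycle means the pipeline is invalid.
type TemplateSketch = {
    resultingParameterName: string;
    dependentParameterNames: Array<string>;
};

function hasCircularDependencies(templates: Array<TemplateSketch>): boolean {
    const producers = new Map(templates.map((t) => [t.resultingParameterName, t]));
    const visiting = new Set<string>(); // parameters on the current DFS path
    const done = new Set<string>(); // parameters fully explored

    const visit = (name: string): boolean => {
        if (done.has(name)) return false;
        if (visiting.has(name)) return true; // back-edge -> cycle
        visiting.add(name);
        const template = producers.get(name); // undefined = external input parameter
        const cyclic = (template?.dependentParameterNames ?? []).some(visit);
        visiting.delete(name);
        done.add(name);
        return cyclic;
    };

    return templates.some((t) => visit(t.resultingParameterName));
}
```

Parameters with no producing template are treated as external inputs and terminate the search.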
Better execution reports
- Filter out voids in `executionReportJsonToString`
- Add timing information to `ExecutionReportJson` (in both text and chart format)
- Add money cost information to `ExecutionReportJson` (in both text and chart format)
- Escape code blocks in markdown
- Do not export `replaceParameters` utility function
Export fewer functions from @promptbook/utils
- Iterating over parameters
- Parameters can be both `string` and `Array<string>`; an `Array<string>` will iterate over all values
- You can use postprocessing functions or `EXECUTE SCRIPT` to split a string into an array and vice versa
- Do not remove emojis or formatting from task title in progress
- You can use `prettifyMarkdown` for postprocessing
- Add Mermaid graph to sample promptbooks
- Fix spelling errors in OpenAI error messages
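The string-or-array parameter behavior described above can be sketched like this; the helper is hypothetical, not the library's internal implementation:

```typescript
// Sketch: a parameter value may be a single string or an array of strings;
// an array value means the pipeline runs once per item.
type ParameterValue = string | Array<string>;

function* iterateParameterValues(value: ParameterValue): Generator<string> {
    if (typeof value === 'string') {
        yield value; // single run
    } else {
        yield* value; // one run per array item
    }
}

// Usage: two runs, one per fruit
const runs = [...iterateParameterValues(['apple', 'banana'])];
```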
Cleanup and renaming
- Cleanup the project
- Do not export unused types from `@promptbook/types`
- Rename "Prompt template pipelines" to more meaningful "Promptbooks"
- Remove `DEFAULT_MODEL_REQUIREMENTS` - you need to explicitly specify the requirements
- Rename `PromptTemplatePipelineLibrary` -> `PromptbookLibrary`
- Rename `RemoteServerOptions.ptbkLibrary` -> `library`
- Add `RemoteServerOptions.ptbkNames`
- Rename `RemoteServerOptions.getPtp` -> `getPtbkByName`
- Do not use shortcut "Ptbk" but full "Promptbook" name in the code, classes, methods, etc.
- Change command `PTBK_URL` to `URL` (but keep backward compatibility and preserve alias `PTBK`)
- Change command `PTBK_NAME` to `PROMPTBOOK_NAME` (but keep backward compatibility and preserve alias `PTBK`)
- Rename `runRemoteServer` -> `startRemoteServer` and return a `Destroyable` object
Explicit output parameters
- Every promptbook has to have an `OUTPUT PARAMETER` property in its header
Remove "I" prefix from interfaces and change interfaces to types.
- Rename `IAutomaticTranslator` -> `AutomaticTranslator`
- Rename `ITranslatorOptions` -> `TranslatorOptions`
- Rename `IGoogleAutomaticTranslatorOptions` -> `GoogleAutomaticTranslatorOptions`
- Rename `ILindatAutomaticTranslatorOptions` -> `LindatAutomaticTranslatorOptions`
- Remove unused `IPersonProfile`
- Remove unused `ILicense`
- Remove unused `IRepository`

Note: Keeping the "I" prefix in internal tooling like `IEntity` and `IExecCommandOptions`
Note: Also keeping things imported from external libraries like `IDestroyable`
Working on Promptbook Library. Identify promptbooks by URL.
- Change `PromptbookLibrary` class to interface
- Add `SimplePromptbookLibrary` class which implements `PromptbookLibrary`
- Rename `PromptbookLibrary.promptbookNames` to `PromptbookLibrary.pipelineUrls`
- Remove `PromptbookLibrary.createExecutor` to separate responsibility
- Make more renamings and reorganizations in `PromptbookLibrary`
- Make `PromptbookLibrary.listPipelines` an async method
- Make `PromptbookLibrary.getPipelineByUrl` an async method
Multiple factories for `PromptbookLibrary`, custom errors, enhanced templating
- Throwing `NotFoundError`
- Throwing `PromptbookSyntaxError`
- Throwing `PromptbookLogicError`
- Throwing `PromptbookExecutionError`
- Throwing `PromptbookReferenceError`
- Throwing `UnexepctedError`
- Preserve col-chars in multi-line templates; see more in the `replaceParameters` unit test
- Change static methods of `PromptbookLibrary` to standalone functions
- Static method `createPromptbookLibraryFromSources` receives spread arguments `Array` instead of `Record`
- Add factory function `createPromptbookLibraryFromPromise`
More options to create PromptbookLibrary
- Utility `createPromptbookLibraryFromDirectory`
- Utility `createPromptbookLibraryFromUrl`
- Add `extractBlock` to built-in functions
- Remove problematic usage of `chalk` and use `colors` instead
- Export `replaceParameters` from `@promptbook/utils`
Better logo and branding of Promptbook.
CLI utils exported from @promptbook/cli
After install, you can use the `promptbook` command in terminal:

```bash
npm i @promptbook/utils
npx ptbk prettify 'promptbook/**/*.ptbk.md'
```
- Lower bundle size
- Normalization library `n12` is not used anymore; all its functions are brought into `@promptbook/utils`
- Better error names
- Better error usage
- Make `ExpectError` private
- `@promptbook/core` is not a peer dependency of `@promptbook/utils`
- Rename `expectAmount` in JSON to `expectations`
- Expectations are passed into the prompt object and used in natural tools
- Add `MockedFackedLlmExecutionTools`
- Add utils `checkExpectations` and `isPassingExpectations`
- Better error messages from `JavascriptEvalExecutionTools`
- Each exported NPM package has a full README
- `spaceTrim` is re-exported from `@promptbook/utils`
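The two expectation utilities can be sketched as follows; the `Expectations` shape (min/max bounds per counted unit) is an assumption based on the `EXPECT` command semantics described above:

```typescript
// Sketch: expectations as min/max bounds per counted unit (words, characters, ...)
type ExpectationAmount = { min?: number; max?: number };
type Expectations = Record<string, ExpectationAmount>; // e.g. { words: { min: 1, max: 5 } }

const counters: Record<string, (value: string) => number> = {
    characters: (value) => value.length,
    words: (value) => value.split(/\s+/).filter(Boolean).length,
};

// checkExpectations throws on the first violated bound...
function checkExpectations(expectations: Expectations, value: string): void {
    for (const [unit, { min, max }] of Object.entries(expectations)) {
        const count = counters[unit]!(value);
        if (min !== undefined && count < min) throw new Error(`Expected at least ${min} ${unit}`);
        if (max !== undefined && count > max) throw new Error(`Expected at most ${max} ${unit}`);
    }
}

// ...while isPassingExpectations wraps it and returns a boolean
function isPassingExpectations(expectations: Expectations, value: string): boolean {
    try {
        checkExpectations(expectations, value);
        return true;
    } catch {
        return false;
    }
}
```

Having both a throwing and a boolean variant lets the executor fail loudly while retry logic stays a cheap boolean check.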
More direct usage of OpenAI API, refactoring
- Pass OpenAI options directly to `OpenAiExecutionTools`
- Change `openAiApiKey` -> `apiKey` when creating new `OpenAiExecutionTools`
- Change all import statements to `import type` when importing just types
Reorganize packages
💡 Now you can just install `promptbook` or `ptbk` as an alias for everything
- New package `promptbook` as a link to all other packages
- New package `ptbk` as an alias to `promptbook`
- New package `@promptbook/fake-llm`
    - Move there `MockedEchoLlmExecutionTools` and `MockedFackedLlmExecutionTools` from `@promptbook/core`
- New package `@promptbook/langtail` to prepare for Langtail integration
Tools refactoring
- Rename "natural" -> "llm"
- Allow to pass multiple `llm` into the `ExecutionTools` container
- Export `renderPromptbookMermaid` through `@promptbook/utils`
Better utilities (for Promptbase app)
- Add reverse utility `promptbookJsonToString`
- Allow to put a link callback into `renderPromptbookMermaid`
- Better prompt template identification
- Add function `titleToName` exported from `@promptbook/utils`
- Add function `renameParameter` exported from `@promptbook/utils`
- Rename "Script Template" to just "Script"

Was accidentally released earlier; re-released fully completed as 0.51.0
Add new OpenAI models `gpt-4o` and `gpt-4o-2024-05-13`
- Add model `gpt-4o`
- Add model `gpt-4o-2024-05-13`
- Classes that implement `LlmExecutionTools` must expose compatible models
- List OpenAI models dynamically
- All GPT models have pricing information
- Export `OPENAI_MODELS` from `@promptbook/openai`
- Export types `LlmTemplateJson`, `SimpleTemplateJson`, `ScriptJson`, `PromptDialogJson`, `Expectations` from `@promptbook/types`
- `ModelRequirements.modelName` is not required anymore
- `PromptbookExecutor` does not require `onProgress` anymore
- `ExecutionTools` does not require `userInterface` anymore; when not set, the user interface is disabled and a promptbook which requires user interaction will fail
- Export `extractParameters`, `extractVariables` and `extractParametersFromPromptTemplate` from `@promptbook/utils`
- Add and export set operations `difference`, `intersection` and `union` from `@promptbook/utils`
- Export `POSTPROCESSING_FUNCTIONS` from `@promptbook/execute-javascript`
- No need to specify MODEL VARIANT and MODEL NAME in `.ptbk.md` explicitly; CHAT VARIANT will be used as default
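The set operations mentioned above behave like the usual mathematical ones; a minimal sketch (the signatures are assumptions, the real exports may differ):

```typescript
// Minimal sketches of the set operations exported from @promptbook/utils
function difference<T>(a: Set<T>, b: Set<T>): Set<T> {
    return new Set([...a].filter((item) => !b.has(item))); // items in a but not in b
}

function intersection<T>(a: Set<T>, b: Set<T>): Set<T> {
    return new Set([...a].filter((item) => b.has(item))); // items in both
}

function union<T>(a: Set<T>, b: Set<T>): Set<T> {
    return new Set([...a, ...b]); // items in either, deduplicated
}
```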
Add support for Claude / Anthropic models via package `@promptbook/anthropic-claude` and add Azure OpenAI models via package `@promptbook/azure-openai`
- Export `MultipleLlmExecutionTools` from `@promptbook/core`
- Always use "modelName", not just "model"
- Standardization of model providers
- Delete `@promptbook/wizzard`
- Move `assertsExecutionSuccessful`, `checkExpectations`, `executionReportJsonToString`, `ExecutionReportStringOptions`, `ExecutionReportStringOptionsDefaults`, `isPassingExpectations`, `prettifyPromptbookString` from `@promptbook/utils` to `@promptbook/core`
- Make and use `JavascriptExecutionTools` as a placeholder for a better implementation with proper sandboxing
- Implement `createPromptbookLibraryFromDirectory` exported from `@promptbook/core`
- Make `PromptbookLibraryError`
- Check Promptbook URL uniqueness in `SimplePromptbookLibrary` (see [🦄])
- Util `createPromptbookLibraryFromPromise` is not public anymore
- Util `forEachAsync` exported from `@promptbook/utils`
Repair and organize imports
- Custom errors `ExpectError`, `NotFoundError`, `PromptbookExecutionError`, `PromptbookLogicError`, `PromptbookLibraryError`, `PromptbookSyntaxError` exported from `@promptbook/core`
Better usage computation and shape
- Change shape of `PromptResult.usage`
- Remove types `number_positive_or_zero` and `number_negative_or_zero`
- Export types `PromptResultUsage`, `PromptResultUsageCounts` and `UncertainNumber` from `@promptbook/types`
- Export util `addUsage` from `@promptbook/core`
- Put usage directly in the result of each execution
- Export function `usageToWorktime` from `@promptbook/core`
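Merging usage records from several prompt results, as `addUsage` does, can be sketched like this. The real `PromptResultUsage` is richer (uncertain numbers, per-unit counts); this illustrative shape tracks only tokens and price:

```typescript
// Sketch: fold usage records from several prompt results into one total.
// The field names here are illustrative assumptions.
type UsageSketch = { inputTokens: number; outputTokens: number; price: number };

function addUsage(...usages: Array<UsageSketch>): UsageSketch {
    return usages.reduce(
        (total, usage) => ({
            inputTokens: total.inputTokens + usage.inputTokens,
            outputTokens: total.outputTokens + usage.outputTokens,
            price: total.price + usage.price,
        }),
        { inputTokens: 0, outputTokens: 0, price: 0 },
    );
}
```

Because each execution result carries its own usage, the total for a whole pipeline run is just a fold over all template results.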
Rename and reorganize libraries
- Take `createPromptbookLibraryFromDirectory` from `@promptbook/core` -> `@promptbook/node` (to avoid dependency risk errors)
- Rename `@promptbook/fake-llmed` -> `@promptbook/fake-llm`
- Export `PROMPTBOOK_ENGINE_VERSION` from each package
- Use `export type` in `@promptbook/types`
Better JSON Mode
- `OpenAiExecutionTools` will use JSON mode natively
- `OpenAiExecutionTools` does not fail on empty (but valid string) responses
- Internal reorganization of folders and files
- Export types as type export
Preparation for the system for management of external knowledge (RAG), vector embeddings and proper building of pipeline collection.
- Add `MaterialKnowledgePieceJson`
- Add `KnowledgeJson`
- Add `prepareKnowledgeFromMarkdown` exported from `@promptbook/core`
- Change `promptbookStringToJson` to an async function (and add `promptbookStringToJsonSync` for promptbooks without external knowledge)
- Change `createPromptbookLibraryFromSources` to `createPromptbookLibraryFromJson` and allow only compiled JSONs as input; it is not `async` anymore
- Allow only JSONs as input in `createLibraryFromPromise`
- Class `SimplePromptbookLibrary` is not exposed at all, only type `PromptbookLibrary` and constructors
- Rename all `createPromptbookLibraryFromXyz` to `createLibraryFromXyz`
- Misc Tool classes do not require options anymore (like `CallbackInterfaceTools`, `OpenAiExecutionTools`, `AnthropicClaudeExecutionTools`, etc.)
- Add util `libraryToJson` exported from `@promptbook/core`
- CLI util `ptbk make ...` can convert promptbooks to JSON
- `promptbookStringToJson` automatically looks for `promptbook-collection.json` in the root of a given directory
- Rename `validatePromptbookJson` to `validatePromptbook`
- Create `embed` method on LLM tools; add `PromptEmbeddingResult`, `EmbeddingVector` and `embeddingVectorToString`
- `createLibraryFromDirectory` still does NOT use the prebuilt library (it just detects it)
Renaming and making names more consistent and less ambiguous
- Rename the word "promptbook"
    - Keep the name "Promptbook" as the name for this project.
    - Rename a promptbook as a pipeline of templates defined in `.ptbk.md` to "pipeline"
- Rename the word "library"
    - For a library used as a collection of templates, use the name "collection"
    - For the library used as this project and package, use the word "package"
- Rename methods in `LlmExecutionTools`: `gptChat` -> `callChatModel` and `gptComplete` -> `callCompletionModel`
- Rename custom errors
- Rename folder `promptbook-collection` -> `promptbook-collection`
- In CLI you can use both `promptbook` and `ptbk`
Big syntax additions. Working external knowledge, personas, preparation for instruments and actions
- Add reserved parameter names
- Add `SAMPLE` command with notation for parameter samples to `.ptbk.md` files
- Add `KNOWLEDGE` command to `.ptbk.md` files
- Change `EXECUTE` command to `BLOCK` command
- Change `executionType` -> `templateType`
- Rename `SynraxError` to `ParsingError`
- Rename `extractParameters` to `extractParameterNames`
- Rename `ExecutionError` to `PipelineExecutionError`
- Remove `TemplateError` and replace with `ExecutionError`
- Allow deep structure (h3, h4, ...) in `.ptbk.md` files
- Add `callEmbeddingModel` to `LlmExecutionTools`
- `callChatModel` and `callCompletionModel` are not required to be implemented in `LlmExecutionTools` anymore
- Remove `MultipleLlmExecutionTools` and make the `joinLlmExecutionTools` function
- You can pass a simple array of `LlmExecutionTools` into `ExecutionTools` and it will be joined automatically via `joinLlmExecutionTools`
- Remove the `MarkdownStructure` and replace it by a simpler solution `flattenMarkdown` + `splitMarkdownIntoSections` + `parseMarkdownSection` which works just with markdown strings, exported from `@promptbook/utils` <- [🕞]
- Markdown utils are exported through `@promptbook/markdown-utils` (and removed from `@promptbook/utils`)
- String normalizers go alongside types; for example `normalizeTo_SCREAMING_CASE` -> `string_SCREAMING_CASE`
- Export `isValidUrl`, `isValidPipelineUrl`, `isValidFilePath`, `isValidJavascriptName`, `isValidSemanticVersion`, `isHostnameOnPrivateNetwork`, `isUrlOnPrivateNetwork` and `isValidUuid` from `@promptbook/utils`
- Add `systemMessage`, `temperature` and `seed` to `ModelRequirements`
- Code blocks can be notated both by ``` and by >
- Add caching and storage
- Export utility `stringifyPipelineJson` with pretty formatting of long knowledge indexes from `@promptbook/core`
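Joining several `LlmExecutionTools` into one, as `joinLlmExecutionTools` does, is essentially a try-each-provider-in-order strategy. A minimal sketch with an assumed, heavily simplified interface (the real one has more methods and metadata):

```typescript
// Simplified sketch: try each LLM provider in order and fall back on failure.
type ChatTools = { callChatModel?: (prompt: string) => Promise<string> };

function joinLlmExecutionTools(...tools: Array<ChatTools>): ChatTools {
    return {
        async callChatModel(prompt: string): Promise<string> {
            const errors: Array<Error> = [];
            for (const tool of tools) {
                if (!tool.callChatModel) continue; // this provider does not support chat
                try {
                    return await tool.callChatModel(prompt);
                } catch (error) {
                    errors.push(error as Error); // remember and try the next provider
                }
            }
            throw new Error(`All providers failed: ${errors.map((e) => e.message).join('; ')}`);
        },
    };
}

// Usage: a failing provider falls through to a working one
const flaky: ChatTools = { callChatModel: async () => { throw new Error('quota exceeded'); } };
const echo: ChatTools = { callChatModel: async (prompt) => `echo: ${prompt}` };
const joined = joinLlmExecutionTools(flaky, echo);
```

This is why a plain array of tools can be passed into `ExecutionTools`: the join produces one object with the same interface.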
[🎐] Better work with usage
- Add usage to preparations and reports
- Export function `usageToHuman` from `@promptbook/core`
- Rename `TotalCost` to `TotalUsage`
- Allow to reload cache
- Fix error in `uncertainNumber` which always returned "uncertain 0"
- [🐞] Fix usage counting in `OpenAiExecutionTools`
Better system for imports, exports and dependencies
- Manage package exports automatically
- Automatically export all types from `@promptbook/types`
- Protect runtime-specific code - for example, protect browser-specific code from ever reaching `@promptbook/node`
- Concise README - move things to discussions
- Make Partial and optional

Release was accidentally skipped
[🍜] Anonymous server
- Anonymous server
- `LlmConfiguration` and `createLlmToolsFromConfiguration`
- Better names for knowledge sources
- Rename keys inside prepared knowledge
- Use `MultipleLlmExecutionTools` more
- LLM tools providers have constructor functions, for example `OpenAiExecutionTools` -> `createOpenAiExecutionTools`
- `remoteUrl` is `string_base_url`
[🎰] Model updates and registers
- Prefix all non-pure entities with `$`
- Add model `claude-3-5-sonnet-20240620` to `AnthropicClaudeExecutionTools`
- [🐞] Fix usage counting in `AnthropicClaudeExecutionTools`
- Update `@anthropic-ai/sdk` from `0.21.1` to `0.26.1`
- Update `@azure/openai` from `1.0.0-beta.12` to `2.0.0-beta.1`
- Update `openai` from `4.46.1` to `4.55.9`
- Add `LlmExecutionToolsConstructor`
- Add `$llmToolsConfigurationBoilerplatesRegister`
- Add `$llmToolsRegister`
- Rename `Openai` -> `OpenAi`
[🚉] Types and interfaces, JSON serialization
- Enhance 🤍 The Promptbook Whitepaper
- Enhance the `README.md`
- `ExecutionReportJson` is fully serializable as JSON
- [🛫] `Prompt` is fully serializable as JSON
- Add type `string_postprocessing_function_name`
- Add `isSerializableAsJson` utility function, use it to protect inputs and check outputs, and export it from `@promptbook/utils`
- Add `serializeError` and `deserializeError` utility functions and export them from `@promptbook/utils`
- Rename `ReferenceError` to `PipelineUrlError`
- Make an index of all errors and export it from `@promptbook/core`
- Mark all entities that are fully serializable as JSON with `[🚉]`
- When running in a browser, auto-add `dangerouslyAllowBrowser` in `createOpenAiExecutionTools`
- `RemoteLlmExecutionTools` automatically retries on error
- Rename `client_id` -> `string_user_id` and `clientId` -> `userId`
[🍧] Commands and command parser
- There are 2 different commands, `EXPECT` and `FORMAT`
- Rename `BLOCK` command -> `TEMPLATE`
- `EXPECT JSON` changed to `FORMAT JSON`
- Change `usagePlaces` -> `isUsedInPipelineHead` + `isUsedInPipelineTemplate`
- All parsers have functions `$applyToPipelineJson`, `$applyToTemplateJson`, `stringify`, `takeFromPipelineJson` and `takeFromTemplateJson`
- `PipelineJson` has `defaultModelRequirements`
- `PipelineJson` has Chat model variant as default without need to specify it explicitly
- [🥜] Rename "Prompt template" -> "Template"
- Rename `PromptTemplateJson` -> `TemplateJson`
- Rename `extractParameterNamesFromPromptTemplate` -> `extractParameterNamesFromTemplate`
- Rename `PromptTemplateJsonCommon` -> `TemplateJsonCommon`
- Rename `PromptTemplateParameterJson` -> `ParameterJson`
- Rename `PipelineJson.promptTemplates` -> `PipelineJson.templates`
- Rename `PromptDialogJson` -> `DialogTemplateJson`
- Rename `PROMPT_DIALOG` -> `DIALOG_TEMPLATE`
- Rename `ScriptJson` -> `ScriptTemplateJson`
- Rename `SCRIPT` -> `SCRIPT_TEMPLATE`
- Rename `LlmTemplateJson` -> `PromptTemplateJson`
- Rename `ParsingError` -> `ParseError`
Command FOREACH
- Allow iterations with the `FOREACH` command
- Parameter names are case insensitive and normalized
- Big refactoring of `createPipelineExecutor`
- Enhance and implement formats `FormatDefinition`
- Allow to parse CSVs via `CsvFormatDefinition`
- Change `ListFormatDefinition` -> `TextFormatDefinition`
Support for local models - integrate Ollama
- Make new package `@promptbook/ollama`
- Add `OllamaExecutionTools` exported from `@promptbook/ollama`
Knowledge scrapers [🐝]
- Make new package `@promptbook/pdf`
- Make new package `@promptbook/documents`
- Make new package `@promptbook/legacy-documents`
- Make new package `@promptbook/website-crawler`
- Remove llm tools from `PrepareAndScrapeOptions` and add a second argument to misc preparation functions
- Allow to import markdown files with knowledge
- Allow to import `.docx` files with knowledge: `.docx` -(Pandoc)-> `.md`
- Allow to import `.doc` files with knowledge: `.doc` -(LibreOffice)-> `.docx` -(Pandoc)-> `.md`
- Allow to import `.rtf` files with knowledge: `.rtf` -(LibreOffice)-> `.docx` -(Pandoc)-> `.md`
- Allow to import websites with knowledge
- Add new error `KnowledgeScrapeError`
- Filesystem is passed as a dependency
- External programs are passed as a dependency
- Remove `PipelineStringToJsonOptions` in favour of `PrepareAndScrapeOptions`
- Add `MissingToolsError`
- Change `FileStorage` -> `FileCacheStorage`
- Changed behavior of `titleToName` when passing URLs or file paths
- Fix normalize functions when normalizing a string containing the slash char "/"
- Pass `fs` through `ExecutionTools`
- Pass `executables` through `ExecutionTools`
- Pass `scrapers` through `ExecutionTools`
- Add utilities `$provideExecutionToolsForBrowser` and `$provideExecutionToolsForNode` and use them in samples
- Add utilities `$provideScrapersForBrowser` and `$provideScrapersForNode`
- Rename `createLlmToolsFromConfigurationFromEnv` -> `$provideLlmToolsConfigurationFromEnv` and `createLlmToolsFromEnv` -> `$provideLlmToolsFromEnv`
- Rename `getLlmToolsForTestingAndScriptsAndPlayground` -> `$provideLlmToolsForTestingAndScriptsAndPlayground`
- Rename `getLlmToolsForCli` -> `$provideLlmToolsForCli`
- Change most `Array` -> `ReadonlyArray`
- Unite `CreatePipelineExecutorOptions` and `CreatePipelineExecutorSettings`
- Change `--reload-cache` to `--reload` in CLI
- Prefix default values with `DEFAULT_`
Support for Assistants API (GPTs) from OpenAI
- Add `OpenAiAssistantExecutionTools`
- Add `OpenAiExecutionTools.createAssistantSubtools`
- Add `UNCERTAIN_USAGE`
- LLM Tools `getClient` methods are public
- LLM Tools `options` are not `private` anymore but `protected`
- In remote server, allow to pass not only `userId` but also `appId` and `customOptions`
- In remote server, `userId` can not be `undefined` anymore but `null`
- `OpenAiExecutionTools` receives `userId` (not `user`)
- Change Collection mode -> Application mode
- Split Promptbook framework and Book language
- Rename "sample" -> "example"
- Proposal for version `1.0.0` both in Promptbook and Book language
- Allow to run books directly in CLI via `ptbk run ./path/to/book.ptbk.md`
- Fix security warnings in dependencies
- Enhance `countLines` and `countPages` utility functions
- No need to explicitly define the input and output parameters
- Allow empty pipelines
- Add `BlackholeStorage`
- Rename `.ptbk.*` -> `.book.*`
- Split `PROMPTBOOK_VERSION` -> `BOOK_LANGUAGE_VERSION` + `PROMPTBOOK_ENGINE_VERSION`
- Finish split between Promptbook framework and Book language
Formfactors, Rebranding
- Add `FormfactorCommand`
- Add Pipeline interfaces
- Split `ParameterJson` into `InputParameterJson`, `OutputParameterJson` and `IntermediateParameterJson`
- Reorganize the `/src` folder
- Rename `Template` -> `Task`
- Rename `TemplateCommand` -> `SectionCommand`
- Make alongside `SectionType` the `TaskType`
- 🤍 Change Whitepaper to Abstract
- Rename default folder for your books from `promptbook-collection` -> `books`
- Change claim of the project to "It's time for a paradigm shift! The future of software is in plain English, French or Latin."

Skipped because of a mistake in the versioning. (It should have been a pre-release.)
Support for more models; add `@promptbook/vercel` and `@promptbook/google` packages.
- `@promptbook/vercel` - Adapter for Vercel functionalities
- `@promptbook/google` - Integration with Google's Gemini API
- Option `userId` can be passed into all tools and instead of `null`, it can be `undefined`
- Rename `$currentDate` -> `$getCurrentDate`
Utility functions
- Add `removePipelineCommand`
- Rename util `renameParameter` -> `renamePipelineParameter`
- Rename util `extractVariables` -> `extractVariablesFromScript`
- [👖] Utilities `extractParameterNamesFromTask` and `renamePipelineParameter` are not exported from `@promptbook/utils` but from `@promptbook/core` because they are tightly interconnected with the Promptbook and cannot be used as universal utilities
Implicit formfactors
- You don't need to specify the formfactor or input+output params explicitly. Implementing the formfactor interface is sufficient.
- Fix in deep cloning of arrays
Simple chat notation
- High-level chat notation
- High-level abstractions
- Introduction of `compilePipeline`
- Add utility `orderJson` exported from `@promptbook/utils`
- Add utility `exportJson` exported from `@promptbook/utils` (in previous versions this util was private and known as `$asDeeplyFrozenSerializableJson`)
- Circular objects with same family references are considered NOT serializable
- Interactive mode for `FORMFACTOR CHATBOT` in CLI
- Deprecate `pipelineJsonToString`
- Deprecate `unpreparePipeline`
- Rename `pipelineStringToJson` -> `compilePipeline`
- Rename `pipelineStringToJsonSync` -> `precompilePipeline`
Editing, templates and flat pipelines
- Backup original book as `sources` in `PipelineJson`
- `fetch` is passed through `ExecutionTools` to allow proxying in browser
- Make new package `@promptbook/editable` and move misc editing tools there
- Make new package `@promptbook/templates` and add function `getBookTemplate`
- Rename `replaceParameters` -> `templateParameters`
- Add `valueToString` and `numberToString` utility functions
- Allow `boolean`, `number`, `null`, `undefined` and full `json` parameters in `templateParameters` (alongside `string`)
- Change `--output` to `--output` in CLI `ptbk make`
- Re-introduction of package `@promptbook/wizzard`
- Allow flat pipelines
- Root URL for flat pipelines
- Change `$provideLlmToolsForCli` -> `$provideLlmToolsForWizzardOrCli`
- Do not require `.book.md` in pipeline URL
- More file paths are considered valid
- Walk to the root of the project and find the nearest `.env` file
- `$provideLlmToolsConfigurationFromEnv`, `$provideLlmToolsFromEnv`, `$provideLlmToolsForWizzardOrCli` and `$provideLlmToolsForTestingAndScriptsAndPlayground` are async
- `GENERATOR` and `IMAGE_GENERATOR` formfactors
- Rename `removeContentComments` -> `removeMarkdownComments`
- Rename `DEFAULT_TITLE` -> `DEFAULT_BOOK_TITLE`
- Rename `precompilePipeline` -> `parsePipeline`
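Parameter templating with non-string values, as described for `templateParameters`, can be sketched like this; the `valueToString` conversion rules here are assumptions, not the library's exact behavior:

```typescript
// Sketch: interpolate {name} placeholders, accepting non-string parameter values
// (the real templateParameters / valueToString in Promptbook may differ).
type ParameterValueSketch = string | number | boolean | null | undefined | object;

function valueToString(value: ParameterValueSketch): string {
    if (value === undefined) return '';
    if (typeof value === 'string') return value;
    return JSON.stringify(value); // numbers, booleans, null and JSON objects
}

function templateParameters(
    template: string,
    parameters: Record<string, ParameterValueSketch>,
): string {
    return template.replace(/{(\w+)}/g, (match, name) =>
        name in parameters ? valueToString(parameters[name]) : match, // keep unknown placeholders
    );
}
```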
Compile via remote server
- Add `compilePipelineOnRemoteServer` to package `@promptbook/remote-client`
- Add `preparePipelineOnRemoteServer` to package `@promptbook/remote-client`
- Changes in remote server that are not backward compatible
- Add `DEFAULT_TASK_TITLE`
- Enforce LF (`\n`) line endings
`@promptbook/editable` and integration of markitdown
- Integrate `markitdown` and export it through `@promptbook/markitdown`
- Export parsing internals to `@promptbook/editable`
- Rename `sourceContent` -> `knowledgeSourceContent`
- Multiple functions to manipulate `PipelineString`
- `book` notation supports value interpolation
- Make equivalent of `book` notation the `prompt` exported through `@promptbook/utils`
- Flat books do not expect a return parameter
- Wizzard always returns a simple `result: string` key in output
- Use `BUSL-1.1` license (only `@promptbook/utils` keeps using `CC-BY-4.0`)
- Support of DeepSeek models
- Support of `o3-mini` model by OpenAI
- Change admin email to `[email protected]`
[🐚] Server queue and tasks
- Publishing Promptbook into Docker Hub
- Remote server runs in both `REST` and `Socket.io` modes
- Remote server can run entire books, not just single prompt tasks (for now just in REST mode)
- In future, remote server will support callbacks / pingbacks
- Remote server has an internal task queue
- Remote server can be started via `ptbk start-server`
- Hide `$randomSeed`
- Remove `TaskProgress`
- Remove `assertsExecutionSuccessful`
- `PipelineExecutor`: Change `onProgress` -> `ExecutionTask`
- Remote server allows to set `rootPath`
- Remote server can run in `Docker`
- In future, remote server will persist its queue in `SQLite` / `.promptbook` / `Neo4j`
- Do not generate stats for pre-releases to speed up the build process
- Allow pipeline URLs on private and unsecured networks
Use `.book` as default extension for books
- Rename `.book.md` -> `.book.md`
- Rename `.book.json` -> `.bookc`
- Establish the `.bookc` format - compiled book JSON in a ZIP archive
- Use VSCode extension `Promptbook .book language support`
- Fix: Version published to Docker Hub does not lag behind the NPM version
- `createLibraryFromDirectory` uses the prebuilt library
Better expectation format in `PromptbookJson`
- Allow to split parameters into multiple values and iterate over them
- Allow to specify model creativity, e.g. `MODEL CREATIVITY EXTREME`

Better script execution
- Getting rid of `JavascriptEvalExecutionTools` and implementing proper isolated script execution in `JavascriptExecutionTools`
- List all default postprocessing functions in the `@promptbook/utils` README
- Implement `PythonExecutionTools` for executing Python scripts

More options to create PromptbookLibrary

Integration with Langtail
- TODO: Add splitInto functions to `@promptbook/utils` besides all the `count` functions
    - Add `countCharacters` -> `splitIntoCharacters`
    - Add `countLines` -> `splitIntoLines`
    - Add `countPages` -> `splitIntoPages`
    - Add `countParagraphs` -> `splitIntoParagraphs`
    - Add `countSentences` -> `splitIntoSentences`
    - Add `CountUtils` -> `splitIntoUtils`
    - Add `countWords` -> `splitIntoWords`
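The relationship between each `count…` function and its planned `splitInto…` counterpart can be sketched as follows; the splitting rules below are simplified assumptions:

```typescript
// Sketch: every countXyz is just the length of the corresponding splitIntoXyz.
function splitIntoWords(text: string): Array<string> {
    return text.split(/\s+/).filter(Boolean); // split on runs of whitespace
}

function countWords(text: string): number {
    return splitIntoWords(text).length;
}

function splitIntoLines(text: string): Array<string> {
    return text.split('\n');
}

function countLines(text: string): number {
    return splitIntoLines(text).length;
}
```

Exposing the `splitInto…` functions would let callers postprocess the pieces instead of only counting them.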
More expect variations
- Add command `EXPECT "..."` <- [🥤]
- Add command `EXPECT /.../i` <- [🥤]
- Add command `EXPECT "...{foo}..."` <- [🥤]
- Add command `EXPECT /...{foo}.../i` <- [🥤]
- Add commands `EXPECT JSON ARRAY` and `EXPECT JSON OBJECT` (in future this will be syntactic sugar for `EXPECT JSON SCHEMA`) <- [🥤]
- When postprocessing fails, retry in the same way as failed expectations
- When making the next attempt for `DIALOG BLOCK`, preserve the previous user input <- [🌹]

Across the repository there are marked [🍓] places that are required to be done before the `1.0.0` release