What is the bug?
In the current Claude V3 blueprint, the connector does not include the system field in the request_body, so the system prompt cannot be passed to the model; the model falls back to its default prompt and cannot be modified for a specific use case.
The change needs to be similar to the following:
POST /_plugins/_ml/connectors/_create
{
  "name": "Claude V3",
  "description": "Connector for Claude V3",
  "version": 1,
  "protocol": "aws_sigv4",
  "parameters": {
    "region": "us-west-2",
    "service_name": "bedrock",
    "auth": "Sig_V4",
    "response_filter": "$.content[0].text",
    "max_tokens_to_sample": "8000",
    "anthropic_version": "bedrock-2023-05-31",
    "model": "anthropic.claude-3-sonnet-20240229-v1:0",
    "users": "user"
  },
  "credential": {
    "access_key": "<access_key>",
    "secret_key": "<secret_key>",
    "session_token": "<session_token>"
  },
  "actions": [
    {
      "action_type": "PREDICT",
      "method": "POST",
"url": "https://bedrock-runtime.us-west-2.amazonaws.com/model/anthropic.claude-instant-v1/invoke",
"headers": {
"x-amz-content-sha256": "required",
"content-type": "application/json"
},
"request_body": "{\"messages\":[{\"role\":\"${parameters.users}\",\"content\":[{\"type\":\"text\",\"text\":\"${parameters.inputs}\"}]}],\"anthropic_version\":\"${parameters.anthropic_version}\",\"max_tokens\":${parameters.max_tokens_to_sample},\"system\":\"${parameters.system:-null}\"}"
}
]
}
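The key addition is the system entry in request_body, with a `:-null` default so the field stays populated even when no system prompt is supplied. As an illustration only (this is a simplified stand-in, not ML Commons' actual template engine), the placeholder substitution behaves roughly like:

```python
import re

def substitute(template: str, params: dict) -> str:
    """Resolve ${parameters.name} and ${parameters.name:-default} placeholders.

    Illustrative sketch of the connector's substitution behavior, not the
    real ML Commons implementation.
    """
    def repl(match):
        name, default = match.group(1), match.group(2)
        value = params.get(name)
        # Fall back to the :-default (or empty string) when the parameter is absent.
        return str(value) if value is not None else (default or "")
    return re.sub(r"\$\{parameters\.(\w+)(?::-([^}]*))?\}", repl, template)

# With a system prompt supplied, the placeholder resolves to it:
print(substitute('"system":"${parameters.system:-null}"', {"system": "Be brief."}))
# Without one, the :-null fallback keeps the request body well-formed:
print(substitute('"system":"${parameters.system:-null}"', {}))
```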
POST /_plugins/_ml/models/_register
{
  "name": "Claude V3 model",
  "version": "1.0.1",
  "function_name": "remote",
  "description": "Claude V3",
  "connector_id": "bakn-ZIBAs32TwoKulpD"
}
POST /_plugins/_ml/models/cqkn-ZIBAs32TwoK11ql/_deploy
POST /_plugins/_ml/models/cqkn-ZIBAs32TwoK11ql/_predict
{
  "parameters": {
    "inputs": "How many moons does Jupiter have?",
    "system": "You are an ${parameters.role}, tell me about ${parameters.inputs}. Ensure that you generate a short answer of less than 90 words.",
    "role": "assistant"
  }
}
## returns
{
  "inference_results": [
    {
      "output": [
        {
          "name": "response",
          "dataAsMap": {
            "response": "Jupiter has 79 known moons. The four largest moons of Jupiter that were discovered by Galileo Galilei in 1610 are Io, Europa, Ganymede, and Callisto. Io is the innermost of the four and volcanically active due to tidal heating from gravitational tug-of-war with Jupiter and the other large moons. Europa's icy surface likely hides an ocean of liquid water beneath. Ganymede is the largest moon in the Solar System. Callisto is also thought to harbor a subsurface ocean. Many of Jupiter's other moons are much smaller and more irregularly shaped. Several were discovered during the past few decades using ground- and space-based telescopes."
          }
        }
      ],
      "status_code": 200
    }
  ]
}
POST /_plugins/_ml/models/cqkn-ZIBAs32TwoK11ql/_predict
{
  "parameters": {
    "inputs": "How many moons does Jupiter have?",
    "system": "You are an ${parameters.role}, tell me about ${parameters.inputs}. Ensure that you generate a short answer of less than 10 words.",
    "role": "assistant"
  }
}
## returns
{
  "inference_results": [
    {
      "output": [
        {
          "name": "response",
          "dataAsMap": {
            "response": "79 moons."
          }
        }
      ],
      "status_code": 200
    }
  ]
}
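Note that the system value in the predict calls above itself contains ${parameters.role} and ${parameters.inputs}, which are expanded before the prompt reaches the model. A self-contained sketch of that nested expansion (illustrative only, not the connector's real implementation):

```python
import re

PATTERN = re.compile(r"\$\{parameters\.(\w+)\}")

def resolve_system(params: dict) -> str:
    """Expand ${parameters.*} references inside the system prompt.

    Illustrative sketch: repeats substitution for a few passes so that
    parameters referenced inside other parameters also get expanded.
    """
    text = params["system"]
    for _ in range(5):
        expanded = PATTERN.sub(lambda m: str(params.get(m.group(1), m.group(0))), text)
        if expanded == text:  # nothing left to expand
            break
        text = expanded
    return text

params = {
    "inputs": "How many moons does Jupiter have?",
    "role": "assistant",
    "system": "You are an ${parameters.role}, tell me about ${parameters.inputs}.",
}
print(resolve_system(params))
# -> You are an assistant, tell me about How many moons does Jupiter have?.
```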