Update model names in these examples to make them eligible for multimodality #5777

Merged
```diff
@@ -4,6 +4,7 @@ import { ChatVertexAI } from "@langchain/google-vertexai";

 const model = new ChatVertexAI({
   temperature: 0.7,
+  model: "gemini-1.5-pro-001",
 });
 const stream = await model.stream([
   ["system", "You are a funny assistant that answers in pirate language."],
```

Contributor: While 1.5 Pro is fine, 1.5 Flash might be a better general-purpose suggestion for the examples, since it is significantly cheaper.

Reply: You realize that if you start adding models to this, you're going to have to update it every time one gets deprecated.

Collaborator: Fair point - however, we can't update the class default without changing behavior, so we generally specify the model explicitly. New models are infrequent enough that bumping all the strings isn't a huge issue.

Reply: I think we have a different definition of infrequent. I would be willing to bet you're looking at every few months.
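For context, here is a minimal sketch of how this streaming example might read if it adopted the reviewer's suggestion of 1.5 Flash instead of 1.5 Pro. The human message and the stream-consumption loop are illustrative additions that are not part of the diff.

```ts
import { ChatVertexAI } from "@langchain/google-vertexai";

// Sketch only: uses the reviewer-suggested model name instead of the one in the diff.
const model = new ChatVertexAI({
  temperature: 0.7,
  model: "gemini-1.5-flash-001",
});

const stream = await model.stream([
  ["system", "You are a funny assistant that answers in pirate language."],
  ["human", "What is your favorite food?"], // illustrative prompt, not from the diff
]);

// Assumed consumption loop (not shown in the diff): each chunk is an AIMessageChunk.
for await (const chunk of stream) {
  console.log(chunk.content);
}
```

Passing the model name explicitly, as the thread above notes, avoids silently depending on the class default, but it does mean each example's string has to be bumped when a model is retired.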
```diff
@@ -25,7 +25,7 @@ const geminiCalculatorTool: GeminiTool = {

 const model = new ChatVertexAI({
   temperature: 0.7,
-  model: "gemini-1.0-pro",
+  model: "gemini-pro-vision",
 }).bind({
   tools: [geminiCalculatorTool],
 });
```

Contributor: Suggest "gemini-1.5-flash" or "gemini-1.5-flash-001" instead. (Pro Vision has been deprecated on the AI Studio side and will be supported for less time than 1.5 Flash on the Vertex AI side.)
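Since this hunk only shows the model option and the `.bind()` call, the following is a hedged sketch of what the full tool-calling example could look like with the reviewer-suggested model. The `geminiCalculatorTool` declaration body (function name, parameter schema, and its exact format) is an assumption for illustration, and the `GeminiTool` type annotation is omitted because its import is not visible in this hunk.

```ts
import { ChatVertexAI } from "@langchain/google-vertexai";

// Illustrative function declaration: the real schema in the example file is not shown in this diff.
// The original annotates this object as GeminiTool; the import for that type sits outside the hunk.
const geminiCalculatorTool = {
  functionDeclarations: [
    {
      name: "calculator",
      description: "A simple calculator tool",
      parameters: {
        type: "object",
        properties: {
          operation: { type: "string", description: "The arithmetic operation to perform" },
          number1: { type: "number", description: "The first operand" },
          number2: { type: "number", description: "The second operand" },
        },
        required: ["operation", "number1", "number2"],
      },
    },
  ],
};

// Reviewer-suggested model swapped in for "gemini-pro-vision".
const model = new ChatVertexAI({
  temperature: 0.7,
  model: "gemini-1.5-flash-001",
}).bind({
  tools: [geminiCalculatorTool],
});

const response = await model.invoke("What is 1628253239 times 81623836?");
// Any requested tool invocations surface on the returned AIMessage.
console.log(response.tool_calls);
```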
examples/src/models/chat/integration_googlevertexai-wsa.ts (2 changes: 1 addition & 1 deletion)

```diff
@@ -13,7 +13,7 @@ const calculatorSchema = z.object({

 const model = new ChatVertexAI({
   temperature: 0.7,
-  model: "gemini-1.0-pro",
+  model: "gemini-pro-vision",
 }).withStructuredOutput(calculatorSchema);

 const response = await model.invoke("What is 1628253239 times 81623836?");
```

Contributor: Suggest "gemini-1.5-flash" or "gemini-1.5-flash-001" instead. (Pro Vision has been deprecated on the AI Studio side and will be supported for less time than 1.5 Flash on the Vertex AI side.)
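Only `const calculatorSchema = z.object({` is visible in this hunk, so the following is a hedged sketch of the structured-output example with the reviewer-suggested model; the schema fields (operation, number1, number2) are assumptions added for illustration.

```ts
import { z } from "zod";
import { ChatVertexAI } from "@langchain/google-vertexai";

// Illustrative schema: the field names are assumptions, since the diff cuts the object off.
const calculatorSchema = z.object({
  operation: z.enum(["add", "subtract", "multiply", "divide"]),
  number1: z.number(),
  number2: z.number(),
});

// Reviewer-suggested model swapped in for "gemini-pro-vision".
const model = new ChatVertexAI({
  temperature: 0.7,
  model: "gemini-1.5-flash-001",
}).withStructuredOutput(calculatorSchema);

// withStructuredOutput returns the response already parsed into an object matching the schema.
const response = await model.invoke("What is 1628253239 times 81623836?");
console.log(response); // e.g. { operation: "multiply", number1: 1628253239, number2: 81623836 }
```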
examples/src/models/chat/integration_googlevertexai.ts (2 changes: 1 addition & 1 deletion)

```diff
@@ -4,7 +4,7 @@ import { ChatVertexAI } from "@langchain/google-vertexai";

 const model = new ChatVertexAI({
   temperature: 0.7,
-  model: "gemini-1.0-pro",
+  model: "gemini-pro-vision",
 });

 const response = await model.invoke("Why is the ocean blue?");
```

Contributor: Suggest "gemini-1.5-flash" or "gemini-1.5-flash-001" instead. (Pro Vision has been deprecated on the AI Studio side and will be supported for less time than 1.5 Flash on the Vertex AI side.)

Contributor Author: Yes, Allen, I think you are right. Thank you, I will update it.
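Since the point of this PR is to make the examples eligible for multimodality, here is a hedged sketch of how a multimodal (image plus text) call could look once a multimodal-capable model name is in place, using the reviewer-suggested model. The message layout, image path, and prompt are illustrative assumptions and are not taken from the PR.

```ts
import { readFileSync } from "node:fs";
import { HumanMessage } from "@langchain/core/messages";
import { ChatVertexAI } from "@langchain/google-vertexai";

// Reviewer-suggested multimodal-capable model; the "gemini-pro-vision" name from the diff also accepts images.
const model = new ChatVertexAI({
  temperature: 0.7,
  model: "gemini-1.5-flash-001",
});

// Illustrative image path; any base64-encoded image works here.
const imageBase64 = readFileSync("./hotdog.jpg").toString("base64");

const response = await model.invoke([
  new HumanMessage({
    content: [
      { type: "text", text: "What is in this image?" },
      { type: "image_url", image_url: `data:image/jpeg;base64,${imageBase64}` },
    ],
  }),
]);

console.log(response.content);
```

A text-only model such as gemini-1.0-pro would reject this kind of request, which is what the model-name changes in this PR are meant to address.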