Commit 78f4b67: Update some Go snippets (#467)

kevinthecheung authored Jun 24, 2024
1 parent a50d4f1 commit 78f4b67

Showing 2 changed files with 20 additions and 21 deletions.
20 changes: 13 additions & 7 deletions docs-go/flows.md

````diff
@@ -156,16 +156,22 @@ first.
 		// ...
 	},
 )
-	genkit.StartFlowServer(":1234")
+	err := genkit.StartFlowServer(":1234", []string{})
+
+	// startProdServer always returns a non-nil error: the one returned by
+	// http.ListenAndServe.
 }
 ```
 
-`StartFlowsServer` starts a `net/http` server that exposes each of the flows
-you defined as HTTP endpoints
-(for example, `http://localhost:3400/menuSuggestionFlow`).
-You can optionally specify the address and port to listen on. If you don't,
-the server listens on any address and the port specified by the PORT
-environment variable; if that is empty, it uses the default of port 3400.
+`StartFlowsServer` starts a `net/http` server that exposes your flows as HTTP
+endpoints (for example, `http://localhost:3400/menuSuggestionFlow`). Both
+parameters are optional:
+
+- You can specify the address and port to listen on. If you don't,
+  the server listens on any address and the port specified by the PORT
+  environment variable; if that is empty, it uses the default of port 3400.
+- You can specify which flows to serve. If you don't, `StartFlowsServer`
+  serves all of your defined flows.
 
 If you want to serve flows on the same host and port as other endpoints, you
 can call `NewFlowServeMux()` to get a handler for your Genkit flows, which you
````
21 changes: 7 additions & 14 deletions docs-go/models.md

````diff
@@ -24,13 +24,7 @@ models.
 
 ```go
 projectID := os.Getenv("GCLOUD_PROJECT")
-err := vertexai.Init(context.Background(), vertexai.Config{
-	ProjectID: projectID,
-	Models: []string{
-		"gemini-1.5-pro",
-		"gemini-1.5-flash",
-	},
-})
+err := vertexai.Init(context.Background(), projectID, "us-central1")
 ```
 
 Note: Different plugins and models use different methods of
@@ -78,7 +72,7 @@ To just call the model:
 request := ai.GenerateRequest{Messages: []*ai.Message{
 	{Content: []*ai.Part{ai.NewTextPart("Tell me a joke.")}},
 }}
-response, err := ai.Generate(context.Background(), gemini15pro, &request, nil)
+response, err := gemini15pro.Generate(context.Background(), &request, nil)
 
 responseText, err := response.Text()
 fmt.Println(responseText)
@@ -114,9 +108,8 @@ Genkit supports chunked streaming of model responses:
 request := ai.GenerateRequest{Messages: []*ai.Message{
 	{Content: []*ai.Part{ai.NewTextPart("Tell a long story about robots and ninjas.")}},
 }}
-response, err := ai.Generate(
+response, err := gemini15pro.Generate(
 	context.Background(),
-	gemini15pro,
 	&request,
 	func(ctx context.Context, grc *ai.GenerateResponseChunk) error {
 		text, err := grc.Text()
@@ -147,7 +140,7 @@ If the model supports multimodal input, you can pass image prompts:
 		ai.NewMediaPart("", "data:image/jpeg;base64,"+encodedImage),
 	}},
 }}
-response, err := ai.Generate(context.Background(), gemini15pro, &request, nil)
+response, err := gemini15pro.Generate(context.Background(), &request, nil)
 ```
 
 <!-- TODO: gs:// wasn't working for me. HTTP? -->
@@ -186,7 +179,7 @@ it.
 	},
 	Tools: []*ai.ToolDefinition{myJoke},
 }
-response, err := ai.Generate(context.Background(), gemini15pro, &request, nil)
+response, err := gemini15pro.Generate(context.Background(), &request, nil)
 ```
 
 This will automatically call the tools in order to fulfill the user prompt.
@@ -232,7 +225,7 @@ chatbots.
 }
 
 request := ai.GenerateRequest{Messages: history}
-response, err := ai.Generate(context.Background(), gemini15pro, &request, nil)
+response, err := gemini15pro.Generate(context.Background(), &request, nil)
 ```
 
 When you get a response, add it to the history:
@@ -252,7 +245,7 @@ chatbots.
 })
 
 request := ai.GenerateRequest{Messages: history}
-response, err := ai.Generate(context.Background(), gemini15pro, &request, nil)
+response, err := gemini15pro.Generate(context.Background(), &request, nil)
 ```
 
 If the model you're using supports the system role, you can use the initial
````
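The chat-history pattern the last two models.md hunks revise (accumulate messages, then send the whole history with each request) can be sketched with stdlib-only stand-ins. `Message`, `Part`, and `buildHistory` below are simplified hypothetical types for illustration, not the real `ai` package API:

```go
package main

import "fmt"

// Part and Message are simplified stand-ins for ai.Part and ai.Message;
// the real Genkit types carry more fields (media parts, tool calls, etc.).
type Part struct{ Text string }

type Message struct {
	Role    string // "user", "model", or "system"
	Content []Part
}

// buildHistory shows the accumulation pattern: start with the user's
// first prompt, then append each model response and the next user turn.
func buildHistory() []*Message {
	history := []*Message{
		{Role: "user", Content: []Part{{Text: "Hi, my name is Pat."}}},
	}
	history = append(history,
		&Message{Role: "model", Content: []Part{{Text: "Hi, Pat! How can I help?"}}},
		&Message{Role: "user", Content: []Part{{Text: "What's my name?"}}},
	)
	return history
}

func main() {
	// The full history would be sent as the Messages field of each request.
	for _, m := range buildHistory() {
		fmt.Println(m.Role+":", m.Content[0].Text)
	}
}
```

Because the whole slice is resent on every turn, the model sees all prior context without any server-side session state.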