This repository showcases example UI components to be used with the Apps SDK, as well as example MCP servers that expose a collection of components as tools.
It is meant to be used as a starting point and source of inspiration to build your own apps for ChatGPT.
## MCP + Apps SDK Overview
The Model Context Protocol (MCP) is an open specification for connecting large language model clients to external tools, data, and user interfaces. An MCP server exposes tools that a model can call during a conversation and returns results according to the tool contracts. Those results can include extra metadata—such as inline HTML—that the Apps SDK uses to render rich UI components (widgets) alongside assistant messages.
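Concretely, a tool result that carries widget metadata might be shaped like the sketch below. This is a hand-written illustration of the contract described above, not code from this repository; the template URI and payload values are made up.

```typescript
// Sketch of a tool result whose metadata points the Apps SDK at a widget.
// The template URI and payload values are illustrative, not from this repo.
const toolResult = {
  // Plain text the assistant can surface in the conversation.
  content: [{ type: "text", text: "Here is your pizza order." }],
  // Structured data the widget hydrates from.
  structuredContent: { topping: "pepperoni" },
  // Binds this response to an HTML widget template registered by the server.
  _meta: { "openai/outputTemplate": "ui://widget/pizza-order.html" },
};
```

The Apps SDK reads the `_meta` entry to decide which widget shell to render next to the assistant message, while `structuredContent` feeds the widget its data.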
The MCP servers in this demo highlight how each tool can light up widgets by combining structured payloads with `_meta.openai/outputTemplate` metadata in their responses.
## Repository Structure
- `src/` – Source for each widget example.
- `assets/` – Generated HTML, JS, and CSS bundles after running the build step.
```bash
pnpm run build
```
This command runs `build-all.mts`, producing versioned `.html`, `.js`, and `.css` files inside `assets/`. Each widget is wrapped with the CSS it needs so you can host the bundles directly or ship them with your own server. If the local assets are missing at runtime, the Pizzaz MCP server automatically falls back to the CDN bundles (version `0038`).
To iterate locally, you can also launch the Vite dev server:
```bash
pnpm run dev
```
The Vite config binds to `http://127.0.0.1:4444` by default. Need another host or port? Pass CLI overrides (for example, to expose on all interfaces on port `4000`):
```bash
pnpm run dev --host 0.0.0.0 --port 4000
```
If you change the origin, update the MCP server `.env` (`DOMAIN=<new-origin>`) so widgets resolve correctly.
## Serve the static assets
If you want to preview the generated bundles without the MCP servers, start the static file server after running a build:
```bash
pnpm run serve
```
This static server also defaults to port `4444`. Override it when needed:
```bash
pnpm run serve -p 4000
```
Make sure the MCP server `DOMAIN` matches the port you choose.
The assets are exposed at [`http://localhost:4444`](http://localhost:4444) with CORS enabled so that local tooling (including MCP inspectors) can fetch them.
## Run the MCP servers
Every tool response includes plain text content, structured JSON, and `_meta.openai/outputTemplate` metadata so the Apps SDK can hydrate the matching widget.
Each MCP server reads `ENVIRONMENT`, `DOMAIN`, and `PORT` from a `.env` file located in its own directory (`pizzaz_server_node/.env`, `pizzaz_server_python/.env`, `solar-system_server_python/.env`). Instead of exporting shell variables, create or update the `.env` file beside the server you're running. For example, inside `pizzaz_server_node/.env`:
```env
# Development: consume Vite dev assets on http://127.0.0.1:4444
ENVIRONMENT=local
# Production-style: point to the static asset server started with `pnpm run serve`
# ENVIRONMENT=production
# DOMAIN=http://localhost:4444
# Port override (defaults to 8000 when omitted)
# PORT=8123
```
- Use `ENVIRONMENT=local` while `pnpm run dev` is serving assets so widgets load without hash suffixes.
- Switch to `ENVIRONMENT=production` and set `DOMAIN` after running `pnpm run build` and `pnpm run serve` to reference the static bundles.
- Adjust `PORT` if you need the MCP endpoint on something other than `http://localhost:8000/mcp`.
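Put together, a server could turn these three variables into its asset origin and MCP URL roughly as sketched below. This is a hypothetical helper for illustration only; the demo servers implement their own resolution logic, and the CDN origin here is a placeholder.

```typescript
// Hypothetical sketch of resolving ENVIRONMENT / DOMAIN / PORT from a .env
// file. Defaults mirror the README: Vite dev on 127.0.0.1:4444, MCP on 8000.
function resolveConfig(env: Record<string, string | undefined>) {
  const port = Number(env.PORT ?? "8000");
  let assetOrigin: string;
  if (env.ENVIRONMENT === "local") {
    // Vite dev server assets, no hash suffixes.
    assetOrigin = "http://127.0.0.1:4444";
  } else if (env.ENVIRONMENT === "production" && env.DOMAIN) {
    // Static bundles from `pnpm run serve`.
    assetOrigin = env.DOMAIN;
  } else {
    // Placeholder standing in for the CDN fallback.
    assetOrigin = "https://example-cdn.invalid";
  }
  return { assetOrigin, mcpUrl: `http://localhost:${port}/mcp` };
}
```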
Prefer invoking uvicorn directly? From the repository root you can run `uvicorn pizzaz_server_python.main:app --port 8000` once dependencies are installed.
> Prefer pnpm scripts? After activating the virtual environment, return to the repository root (for example `cd ..`) and run `pnpm start:pizzaz-python`.
Prefer invoking uvicorn directly? From the repository root you can run `uvicorn solar-system_server_python.main:app --port 8000` once dependencies are installed.
> Similarly, once the virtual environment is active, head back to the repository root and run `pnpm start:solar-python` to use the wrapper script.
You can reuse the same virtual environment for all Python servers—install the dependencies once and run whichever entry point you need.
## Testing in ChatGPT
To add your local server without deploying it, you can use a tool like [ngrok](https://ngrok.com/) to expose it to the internet.
For example, once your MCP servers are running, you can run:
```bash
ngrok http 8000
```
Use the generated URL (for example `https://<custom_endpoint>.ngrok-free.app/mcp`) when configuring ChatGPT. All of the demo servers listen on `http://localhost:8000/mcp` by default; adjust the port in the command above if you override it.
### Hot-swap modes without reconnecting
You can swap between CDN, static builds, and the Vite dev server without reconfiguring ChatGPT:
1. Change the environment you care about (edit the relevant `.env`, run `pnpm run dev`, or rebuild assets and rerun the MCP server).
2. In ChatGPT, open **Settings → Apps & Connectors**, select your connected app, then choose **Actions → Refresh app**.
3. Continue the conversation; no reconnects or page reloads are needed.
When switching modes, avoid disconnecting the connector, deleting it, launching a brand-new tunnel, or refreshing the ChatGPT conversation tab. After you hit **Refresh app**, ChatGPT keeps the existing MCP base URL and simply pulls the latest widget HTML/CSS/JS from your server.
| Mode | What you change | Typical `.env` |
| --- | --- | --- |
| CDN (easiest) | Nothing beyond the MCP server | (leave `PORT`, `ENVIRONMENT` & `DOMAIN` unset) |
| Static serve (inline bundles) | `pnpm run build` (optionally `pnpm run serve` to inspect) | `ENVIRONMENT=production` / `PORT=8000` |
| Dev (Vite hot reload) | Run `pnpm run dev` and point your MCP server at it | `ENVIRONMENT=local` / `DOMAIN=http://127.0.0.1:4444` / `PORT=8000` |
#### Working inside virtual machines
For the smoothest loop, keep everything inside the same VM: run Vite or the static server, the MCP server, ngrok, and your ChatGPT browser session together so localhost resolves correctly. If your browser lives on the host machine while servers stay in the VM, either tunnel the frontend as well (for example, a second `ngrok http 4444` plus `DOMAIN=<that URL>`), or expose the VM via an HTTPS-accessible IP and point `DOMAIN` there.
# Pizzaz server (Node)
This directory contains a minimal Model Context Protocol (MCP) server implemented with the official TypeScript SDK. The service exposes the five Pizzaz demo widgets and shares configuration with the rest of the workspace: it reads environment flags from a local `.env` file and automatically falls back to the published CDN bundles when local assets are unavailable.
## Prerequisites
```bash
pnpm install
```
Adjust the command if you prefer npm or yarn.
## Run the server
```bash
pnpm start
```
This launches an HTTP MCP server on `http://localhost:8000/mcp` with two endpoints:
- `GET /mcp` provides the SSE stream.
- `POST /mcp/messages?sessionId=...` accepts follow-up messages for active sessions.
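A client pairs the two endpoints by opening the SSE stream first, then posting follow-up messages with the session id it receives. As a small sketch of the second step (the helper and session id are illustrative, not part of this repo):

```typescript
// Sketch: given the MCP base URL and a session id obtained from the SSE
// handshake, build the URL a client would POST follow-up messages to.
function messagesUrl(base: string, sessionId: string): string {
  const url = new URL("/mcp/messages", base);
  url.searchParams.set("sessionId", sessionId);
  return url.toString();
}
```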
Configuration lives in `.env` within this directory (loaded automatically via `dotenv`). Update it before starting the server to control asset origins and ports. A typical file looks like:
```env
# Use the Vite dev server started with `pnpm run dev`
ENVIRONMENT=local
# After `pnpm run build && pnpm run serve`, point to the static bundles
# ENVIRONMENT=production
# DOMAIN=http://localhost:4444
# Change the default port (defaults to 8000)
# PORT=8123
```
Key behaviors:
- When `ENVIRONMENT=local`, widgets load from the Vite dev server (`pnpm run dev` from the repo root) without hashed filenames.
- When `ENVIRONMENT=production` and `DOMAIN` is set, widgets are served from your local static server (typically `pnpm run serve`).
- When `ENVIRONMENT` is omitted entirely—or neither local option provides assets—the server falls back to the CDN bundles (version `0038`).
The script boots the server with an SSE transport, which makes it compatible with the MCP Inspector as well as ChatGPT connectors. Once running, you can list the tools and invoke any of the pizza experiences.
Each tool emits:
- `content`: confirmation text matching the requested action.
- `structuredContent`: JSON reflecting the requested topping.
- `_meta.openai/outputTemplate`: metadata binding the response to the Skybridge widget.
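In handler form, a tool that follows this contract could look like the sketch below. It is a plain function for illustration; the real handlers are registered through the MCP TypeScript SDK, and the names and template URI here are assumptions, not taken from `src/`.

```typescript
// Hypothetical pizza-topping handler showing the three-part tool result.
// Function name, argument shape, and template URI are illustrative.
function handlePizzaTool(args: { topping: string }) {
  return {
    // Confirmation text matching the requested action.
    content: [{ type: "text", text: `Added ${args.topping} to your pizza.` }],
    // JSON reflecting the requested topping.
    structuredContent: { topping: args.topping },
    // Metadata binding the response to the Skybridge widget.
    _meta: { "openai/outputTemplate": "ui://widget/pizzaz.html" },
  };
}
```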
### Hot-swap reminder
After changing `.env`, rebuilding assets, or toggling between dev/static/CDN, open your ChatGPT connector (**Settings → Apps & Connectors → [your app] → Actions → Refresh app**). That keeps the same MCP URL, avoids new ngrok tunnels, and prompts ChatGPT to fetch the latest widget templates. See the root [README](../README.md#hot-swap-modes-without-reconnecting) for the mode cheat sheet and VM tips.
## Next Steps
Extend these handlers with real data sources, authentication, or localization, and customize the widget configuration under `src/` to align with your application.