Reference

API and SDK reference for Apps SDK.

window.openai component bridge

See build a ChatGPT UI.

File APIs

| API | Purpose | Notes |
| --- | --- | --- |
| window.openai.uploadFile(file) | Upload a user-selected file and receive a fileId. | Supports image/png, image/jpeg, image/webp. |
| window.openai.getFileDownloadUrl({ fileId }) | Request a temporary download URL for a file. | Only works for files uploaded by the widget or passed via file params. |

When persisting widget state, use the structured shape (modelContent, privateContent, imageIds) if you want the model to see image IDs during follow-up turns.
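As a minimal sketch of that structured shape: the helper below packages state so image IDs stay visible to the model on follow-up turns. The field contents and the setWidgetState call are illustrative assumptions; only the modelContent/privateContent/imageIds shape comes from the table above.

```typescript
// Structured widget-state shape: modelContent is surfaced to the model,
// privateContent stays widget-only, imageIds lists uploaded file IDs.
interface WidgetState {
  modelContent: Record<string, unknown>;
  privateContent: Record<string, unknown>;
  imageIds: string[];
}

function buildWidgetState(
  summary: string,
  notes: string,
  imageIds: string[]
): WidgetState {
  return {
    modelContent: { summary },
    privateContent: { notes },
    imageIds,
  };
}

// Inside the widget you would then persist it through the bridge
// (not runnable outside ChatGPT), e.g.:
//   const { fileId } = await window.openai.uploadFile(file);
//   await window.openai.setWidgetState(
//     buildWidgetState("One image attached", draftNotes, [fileId])
//   );
```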

Tool descriptor parameters

Need more background on these fields? Check the Advanced section of the MCP server guide.

By default, a tool descriptor should include the fields listed here.

_meta fields on tool descriptor

We also require the following _meta fields on the tool descriptor:

| Key | Placement | Type | Limits | Purpose |
| --- | --- | --- | --- | --- |
| _meta["securitySchemes"] | Tool descriptor | array | | Back-compat mirror for clients that only read _meta. |
| _meta["openai/outputTemplate"] | Tool descriptor | string (URI) | | Resource URI for component HTML template (text/html+skybridge). |
| _meta["openai/widgetAccessible"] | Tool descriptor | boolean | default false | Allow component→tool calls through the client bridge. |
| _meta["openai/visibility"] | Tool descriptor | string | public (default) or private | Hide a tool from the model while keeping it callable from the widget. |
| _meta["openai/toolInvocation/invoking"] | Tool descriptor | string | ≤ 64 chars | Short status text while the tool runs. |
| _meta["openai/toolInvocation/invoked"] | Tool descriptor | string | ≤ 64 chars | Short status text after the tool completes. |
| _meta["openai/fileParams"] | Tool descriptor | string[] | | List of top-level input fields that represent files (object shape { download_url, file_id }). |

Example:

server.registerTool(
  "search",
  {
    title: "Public Search",
    description: "Search public documents.",
    inputSchema: {
      type: "object",
      properties: { q: { type: "string" } },
      required: ["q"],
    },
    securitySchemes: [
      { type: "noauth" },
      { type: "oauth2", scopes: ["search.read"] },
    ],
    _meta: {
      securitySchemes: [
        { type: "noauth" },
        { type: "oauth2", scopes: ["search.read"] },
      ],
      "openai/outputTemplate": "ui://widget/story.html",
      "openai/toolInvocation/invoking": "Searching…",
      "openai/toolInvocation/invoked": "Results ready",
    },
  },
  async ({ q }) => performSearch(q)
);
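The example above does not exercise _meta["openai/fileParams"]. A minimal descriptor fragment declaring a file input might look like the sketch below; the field name "document" is hypothetical, and the { download_url, file_id } object shape follows the table above.

```typescript
// Hedged sketch: a tool descriptor fragment that declares one top-level
// input field ("document") as a file parameter. Clients supply the field
// as a { download_url, file_id } object.
const fileToolDescriptor = {
  title: "Summarize document",
  description: "Summarize an uploaded document.",
  inputSchema: {
    type: "object",
    properties: {
      document: {
        type: "object",
        properties: {
          download_url: { type: "string" },
          file_id: { type: "string" },
        },
      },
    },
    required: ["document"],
  },
  _meta: {
    "openai/fileParams": ["document"],
  },
};
```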

Annotations

Use the following annotations on the tool descriptor to describe a tool's behavior (for example, to label it as read-only):

| Key | Type | Required | Notes |
| --- | --- | --- | --- |
| readOnlyHint | boolean | Required | Signal that the tool is read-only. ChatGPT can skip “Are you sure?” prompts when this is true. |
| destructiveHint | boolean | Required | Declare that the tool may delete or overwrite user data so ChatGPT knows to elicit explicit approval first. |
| openWorldHint | boolean | Required | Declare that the tool publishes content or reaches outside the current user’s account, prompting the client to summarize the impact before asking for approval. |
| idempotentHint | boolean | Optional | Declare that calling the tool repeatedly with the same arguments will have no additional effect on its environment. |

These hints only influence how ChatGPT frames the tool call to the user; servers must still enforce their own authorization logic.

Example:

server.registerTool(
  "list_saved_recipes",
  {
    title: "List saved recipes",
    description: "Returns the user’s saved recipes without modifying them.",
    inputSchema: {
      type: "object",
      properties: {},
      additionalProperties: false,
    },
    annotations: { readOnlyHint: true },
  },
  async () => fetchSavedRecipes()
);

Need more background on these fields? Check the Advanced section of the MCP server guide.

Component resource _meta fields

Additional detail on these resource settings lives in the Advanced section of the MCP server guide.

Set these keys on the resource template that serves your component (registerResource). They help ChatGPT describe and frame the rendered iframe without leaking metadata to other clients.

| Key | Placement | Type | Purpose |
| --- | --- | --- | --- |
| _meta["openai/widgetDescription"] | Resource contents | string | Human-readable summary surfaced to the model when the component loads, reducing redundant assistant narration. |
| _meta["openai/widgetPrefersBorder"] | Resource contents | boolean | Hint that the component should render inside a bordered card when supported. |
| _meta["openai/widgetCSP"] | Resource contents | object | Define allowlists for the widget: connect_domains (network requests), resource_domains (images, fonts, scripts), optional frame_domains (iframe sources), and optional redirect_domains (openExternal redirect targets). |
| _meta["openai/widgetDomain"] | Resource contents | string (origin) | Optional dedicated subdomain for hosted components (defaults to https://web-sandbox.oaiusercontent.com). |

The openai/widgetCSP object supports:

  • connect_domains: string[] – domains the widget may contact via fetch/XHR.
  • resource_domains: string[] – domains for static assets (images, fonts, scripts, styles).
  • frame_domains?: string[] – optional list of origins allowed for iframe embeds. By default, widgets cannot render subframes; adding frame_domains opts in to iframe usage and triggers stricter app review.
  • redirect_domains?: string[] – optional list of origins that can receive openExternal redirects without the safe-link modal. When the destination matches, ChatGPT appends a redirectUrl query parameter pointing back to the current conversation.
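Putting those keys together, a resource _meta block might look like the sketch below. The domains and description are hypothetical placeholders; only the key names and the widgetCSP fields come from the tables above.

```typescript
// Hedged sketch of component-resource _meta with a widget CSP allowlist.
const resourceMeta = {
  "openai/widgetDescription": "Interactive map of nearby cafes",
  "openai/widgetPrefersBorder": true,
  "openai/widgetCSP": {
    connect_domains: ["https://api.example.com"],  // fetch/XHR targets
    resource_domains: ["https://cdn.example.com"], // images, fonts, scripts, styles
    frame_domains: [],                             // no subframes (the default)
    redirect_domains: ["https://example.com"],     // openExternal without the safe-link modal
  },
};
```

You would attach this object as the _meta of the resource registered via registerResource for your component template.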

Tool results

The Advanced section of the MCP server guide provides more guidance on shaping these response fields.

Tool results can contain the following fields:

| Key | Type | Required | Notes |
| --- | --- | --- | --- |
| structuredContent | object | Optional | Surfaced to the model and the component. Must match the declared outputSchema, when provided. |
| content | string or Content[] | Optional | Surfaced to the model and the component. |
| _meta | object | Optional | Delivered only to the component. Hidden from the model. |

Only structuredContent and content appear in the conversation transcript. _meta is forwarded to the component so you can hydrate UI without exposing the data to the model.

Host-provided tool result metadata:

| Key | Placement | Type | Purpose |
| --- | --- | --- | --- |
| _meta["openai/widgetSessionId"] | Tool result _meta (from host) | string | Stable ID for the currently mounted widget instance; use it to correlate logs and tool calls until the widget unmounts. |

Example:

server.registerTool(
  "get_zoo_animals",
  {
    title: "get_zoo_animals",
    inputSchema: { count: z.number().int().min(1).max(20).optional() },
    _meta: { "openai/outputTemplate": "ui://widget/widget.html" },
  },
  async ({ count = 10 }) => {
    const animals = generateZooAnimals(count);

    return {
      structuredContent: { animals },
      content: [{ type: "text", text: `Here are ${animals.length} animals.` }],
      _meta: {
        allAnimalsById: Object.fromEntries(
          animals.map((animal) => [animal.id, animal])
        ),
      },
    };
  }
);

Error tool result

To return an error on the tool result, use the following _meta key:

| Key | Placement | Type | Notes |
| --- | --- | --- | --- |
| _meta["mcp/www_authenticate"] | Error result | string or string[] | RFC 7235 WWW-Authenticate challenges to trigger OAuth. |
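A minimal sketch of such an error result is below. The realm and scope values are placeholders, and the isError flag follows the standard MCP tool-result shape; only the _meta["mcp/www_authenticate"] key comes from the table above.

```typescript
// Hedged sketch: a tool result that signals an OAuth challenge via
// a WWW-Authenticate value in _meta["mcp/www_authenticate"].
function buildAuthChallengeResult() {
  return {
    isError: true,
    content: [{ type: "text", text: "Authorization required." }],
    _meta: {
      "mcp/www_authenticate": 'Bearer realm="mcp", scope="search.read"',
    },
  };
}
```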

_meta fields the client provides

See the Advanced section of the MCP server guide for broader context on these client-supplied hints.

| Key | When provided | Type | Purpose |
| --- | --- | --- | --- |
| _meta["openai/locale"] | Initialize + tool calls | string (BCP 47) | Requested locale (older clients may send _meta["webplus/i18n"]). |
| _meta["openai/userAgent"] | Tool calls | string | User agent hint for analytics or formatting. |
| _meta["openai/userLocation"] | Tool calls | object | Coarse location hint (city, region, country, timezone, longitude, latitude). |
| _meta["openai/subject"] | Tool calls | string | Anonymized user ID sent to MCP servers for rate limiting and identification. |

Operation-phase _meta["openai/userAgent"] and _meta["openai/userLocation"] are hints only; servers should never rely on them for authorization decisions and must tolerate their absence.

Example:

server.registerTool(
  "recommend_cafe",
  {
    title: "Recommend a cafe",
    inputSchema: { type: "object" },
  },
  async (_args, { _meta }) => {
    const locale = _meta?.["openai/locale"] ?? "en";
    const location = _meta?.["openai/userLocation"]?.city;

    return {
      content: [{ type: "text", text: formatIntro(locale, location) }],
      structuredContent: await findNearbyCafes(location),
    };
  }
);