@modelcontextprotocol/sdk

    Interface CreateMessageRequest

    A request from the server to sample an LLM via the client. The client has full discretion over which model to select. The client should also inform the user before beginning sampling, to allow them to inspect the request (human in the loop) and decide whether to approve it.

    interface CreateMessageRequest {
        id: RequestId;
        jsonrpc: "2.0";
        method: "sampling/createMessage";
        params: {
            includeContext?: "none" | "thisServer" | "allServers";
            maxTokens: number;
            messages: SamplingMessage[];
            metadata?: object;
            modelPreferences?: ModelPreferences;
            stopSequences?: string[];
            systemPrompt?: string;
            temperature?: number;
        };
    }
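To make the shape concrete, here is a sketch of a complete `sampling/createMessage` request as it might appear on the wire. The types are re-declared locally (in simplified form, based on the MCP schema) so the snippet is self-contained; in real code they come from `@modelcontextprotocol/sdk`, and the specific prompt text and values are illustrative.

```typescript
// Simplified local re-declarations of the SDK types used below.
type SamplingMessage = {
  role: "user" | "assistant";
  content: { type: "text"; text: string };
};

interface CreateMessageRequest {
  id: string | number;
  jsonrpc: "2.0";
  method: "sampling/createMessage";
  params: {
    includeContext?: "none" | "thisServer" | "allServers";
    maxTokens: number;
    messages: SamplingMessage[];
    metadata?: object;
    stopSequences?: string[];
    systemPrompt?: string;
    temperature?: number;
  };
}

// A hand-built request: the server asks the client to sample an LLM,
// requesting at most 256 tokens and context from the calling server.
const request: CreateMessageRequest = {
  id: 1,
  jsonrpc: "2.0",
  method: "sampling/createMessage",
  params: {
    maxTokens: 256,
    includeContext: "thisServer",
    temperature: 0.7,
    systemPrompt: "You are a concise assistant.",
    messages: [
      {
        role: "user",
        content: { type: "text", text: "Summarize the open file." },
      },
    ],
  },
};

console.log(request.method); // "sampling/createMessage"
```

Note that the client is free to apply its own policy before sampling: it MAY pick a different model, clamp `maxTokens`, or reject the request entirely after showing it to the user.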

    Properties

    jsonrpc: "2.0"
    method: "sampling/createMessage"
    params: {
        includeContext?: "none" | "thisServer" | "allServers";
        maxTokens: number;
        messages: SamplingMessage[];
        metadata?: object;
        modelPreferences?: ModelPreferences;
        stopSequences?: string[];
        systemPrompt?: string;
        temperature?: number;
    }

    Type Declaration

    • Optional includeContext?: "none" | "thisServer" | "allServers"

      A request to include context from one or more MCP servers (including the caller), to be attached to the prompt. The client MAY ignore this request.

    • maxTokens: number

      The requested maximum number of tokens to sample (to prevent runaway completions).

      The client MAY choose to sample fewer tokens than the requested maximum.

    • messages: SamplingMessage[]
    • Optional metadata?: object

      Optional metadata to pass through to the LLM provider. The format of this metadata is provider-specific.

    • Optional modelPreferences?: ModelPreferences

      The server's preferences for which model to select. The client MAY ignore these preferences.

    • Optional stopSequences?: string[]
    • Optional systemPrompt?: string

      An optional system prompt the server wants to use for sampling. The client MAY modify or omit this prompt.

    • Optional temperature?: number
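Several of the parameters above are advisory: the client MAY sample fewer tokens than `maxTokens` and MAY modify or omit the server's `systemPrompt`. The sketch below shows a hypothetical client-side policy gate applying those clauses; the names (`applyClientPolicy`, `CLIENT_MAX_TOKENS`) are illustrative assumptions, not part of the SDK.

```typescript
// A subset of the request params relevant to client-side policy.
interface SamplingParams {
  maxTokens: number;
  systemPrompt?: string;
  temperature?: number;
}

// Assumed local limit enforced by this hypothetical client.
const CLIENT_MAX_TOKENS = 1024;

function applyClientPolicy(
  params: SamplingParams,
  allowSystemPrompt: boolean,
): SamplingParams {
  return {
    ...params,
    // The client MAY sample fewer tokens than the requested maximum.
    maxTokens: Math.min(params.maxTokens, CLIENT_MAX_TOKENS),
    // The client MAY modify or omit the server-supplied system prompt.
    systemPrompt: allowSystemPrompt ? params.systemPrompt : undefined,
  };
}

// A request asking for 4096 tokens is clamped to the client's limit,
// and the system prompt is dropped because local policy forbids it.
const gated = applyClientPolicy(
  { maxTokens: 4096, systemPrompt: "Be terse.", temperature: 0.2 },
  false,
);
console.log(gated.maxTokens); // 1024
```

In a real client the same gate would typically run after the user has reviewed and approved the request, preserving the human-in-the-loop step described above.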