[Diagram: a Prompt Object including template messages, parameters, tools, and response format]

What is a prompt?

A prompt is the instruction you give an LLM. A Prompt Object bundles everything that shapes an LLM call (messages, model settings, tools, and response format) into a single versioned artifact. This gives your team one shared definition to create, test, optimize, and deploy, so results stay reproducible across the workflow.

What do prompts look like in Arize?

The Playground displays a prompt as a stack of messages (for example, system and user) along with optional tools and model parameters. Variables are written in curly braces, like {variable_name}, and are substituted with real values when your app or experiments run against the template.
[Screenshot: prompt editor showing system and user messages with a highlighted template variable for topic where input data is injected]
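Substitution like this can be sketched in a few lines of Python. This is a minimal illustration of the curly-brace convention, not Arize's implementation; the function name render_template and the example variables are hypothetical.

```python
import re

def render_template(template: str, variables: dict) -> str:
    """Replace each {variable_name} placeholder with its value.
    Hypothetical helper for illustration, not an Arize API."""
    return re.sub(r"\{(\w+)\}", lambda m: str(variables[m.group(1)]), template)

user_message = "Write a short summary about {topic} for a {audience} audience."
print(render_template(user_message, {"topic": "vector databases",
                                     "audience": "beginner"}))
# → Write a short summary about vector databases for a beginner audience.
```

At run time, your app or experiment supplies the variable values (for example, rows of input data), and each rendered message is what actually gets sent to the model.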

What is a Prompt Object?

At Arize, we expand the definition of a prompt into a Prompt Object, which captures everything a complete prompt entails.

Prompt template

System, user, and assistant messages that make up the LLM’s context.

Invocation parameters

Temperature, frequency penalty, top p/k, and related model output controls.

Tools and functions

Specific APIs, programmatic functions, or outside functionality the model can call.

Response format

Output schema (structured JSON and similar formats) so responses match what your application expects.

In Arize, Prompt Objects are stored in Prompt Hub, a centralized place where you and your team can manage different versions of your prompts as you iterate over time. Learn more in Save and version prompts.
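The four components above can be sketched as one versioned artifact. The field names below follow common LLM-API conventions (messages, temperature, tools, response_format) and are illustrative only, not necessarily Arize's exact schema.

```python
# Hypothetical sketch of a Prompt Object: template, invocation
# parameters, tools, and response format in a single versioned record.
prompt_object = {
    "version": "v3",
    # Prompt template: system/user/assistant messages with variables
    "messages": [
        {"role": "system", "content": "You are a concise technical writer."},
        {"role": "user", "content": "Summarize {topic} in two sentences."},
    ],
    # Invocation parameters: model output controls
    "invocation_parameters": {
        "temperature": 0.2,
        "top_p": 0.9,
        "frequency_penalty": 0.0,
    },
    # Tools and functions the model may call
    "tools": [
        {
            "name": "search_docs",
            "description": "Search internal documentation.",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
            },
        }
    ],
    # Response format: output schema the application expects
    "response_format": {
        "type": "json_schema",
        "schema": {
            "type": "object",
            "properties": {"summary": {"type": "string"}},
        },
    },
}
```

Because all four pieces live in one object, a single version tag covers the whole call definition, so a change to the temperature or a tool schema is versioned just like a wording change in the template.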

Workflow

Follow this path end to end in the docs: Create a prompt → Test a prompt → Optimize a prompt → Save and version prompts.

Next up

When you are ready to build, open Create a prompt and draft your first Prompt Object in the Playground.