Prompt templates are managed through the `client.prompts` namespace.
Methods Overview
Some SDKs use camel case rather than snake case, depending on the naming convention for the given language.

| Method Name | Parameters | Description |
|---|---|---|
| `get_formatted` | `project_id: string`, `template_name: string`, `environment: string` | Get a formatted prompt template object with variables inserted, messages formatted for the configured LLM provider (e.g., OpenAI, Anthropic), and model parameters for the LLM call. |
| `get` | `project_id: string`, `template_name: string`, `environment: string` | Get a prompt template by environment. Note: the prompt template will not have variables substituted or be formatted for the configured LLM provider. |
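To make the difference between the two methods concrete, here is a toy in-memory stand-in (not the real Freeplay SDK; only the parameter names come from the table above, everything else is illustrative): `get` returns the raw template with `{{placeholders}}` intact, while `get_formatted` also substitutes variables.

```python
import re

# Toy template store keyed the same way the table's parameters suggest:
# (project_id, template_name, environment) -> template messages.
TEMPLATES = {
    ("proj-1", "support-bot", "prod"): [
        {"role": "system", "content": "You help customers of {{company}}."},
        {"role": "user", "content": "{{question}}"},
    ],
}

def get(project_id, template_name, environment):
    """Raw template: placeholders left intact, provider-agnostic roles."""
    return TEMPLATES[(project_id, template_name, environment)]

def get_formatted(project_id, template_name, environment, variables):
    """Template with each {{variable}} placeholder replaced by its value."""
    def fill(text):
        return re.sub(r"\{\{(\w+)\}\}", lambda m: str(variables[m.group(1)]), text)
    return [
        {"role": m["role"], "content": fill(m["content"])}
        for m in get(project_id, template_name, environment)
    ]

raw = get("proj-1", "support-bot", "prod")
formatted = get_formatted(
    "proj-1", "support-bot", "prod",
    {"company": "Acme", "question": "Where is my order?"},
)
```

In the real SDK the formatted result also carries provider-specific message structure and model parameters; this sketch only shows the variable-substitution half of that contract.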
Get a Formatted Prompt
Get a formatted prompt object with variables inserted, messages formatted for the configured LLM provider, and model parameters to use for the LLM call. This is the most convenient method for most prompt-fetching use cases, since formatting is handled for you server side in Freeplay.

The `environment` parameter determines which version of your prompt is fetched. Default environment names are:

- `"prod"`: Production environment
- `"dev"`: Development environment
- `"latest"`: Latest version (use for development/testing)
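The environment-to-version mapping can be sketched like this (a local stand-in, not the Freeplay SDK; which version each environment pins is made up for illustration): each environment name resolves to a pinned version of the template.

```python
# Hypothetical version store: two saved versions of one prompt template.
VERSIONS = {
    "v1": [{"role": "system", "content": "You are terse."}],
    "v2": [{"role": "system", "content": "You are friendly and thorough."}],
}

# Default environment names from the list above; the version each one
# points at here is invented for the sketch.
ENVIRONMENTS = {
    "prod": "v1",    # stable version served to users
    "dev": "v2",     # version being iterated on
    "latest": "v2",  # newest saved version
}

def resolve(environment):
    """Return the template version the given environment currently points at."""
    return VERSIONS[ENVIRONMENTS[environment]]
```

The point of the indirection: redeploying a prompt to `"prod"` changes only the environment mapping, so application code keeps fetching by environment name and never hard-codes a version.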
Get a Prompt Template
Get a prompt template object that does not yet have variables inserted and has messages formatted in a consistent, LLM-provider-agnostic structure. This is particularly useful when you want to reuse a prompt template with different variables in the same execution path, such as in Test Runs. This method gives you more control to handle formatting in your own code rather than server side in Freeplay, but requires a few more lines of code.

Using History with Prompt Templates
`history` is a special object in Freeplay prompt templates for managing state over multiple LLM interactions by passing in previous messages. It accepts an array of prior messages when relevant.

Before using `history` in the SDK, you must configure it on your prompt template. See the Multi-Turn Chat Support section for details.

Once you have `history` configured for a prompt template, you can pass it during the formatting process. The history messages will be inserted wherever you placed the history placeholder in your prompt template.
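The splicing behavior described above can be sketched as follows (a minimal local illustration with hypothetical structures, not the Freeplay SDK's actual template representation): prior turns are inserted at the template's history placeholder.

```python
# Stands in for the history slot configured on the prompt template.
HISTORY_PLACEHOLDER = {"role": "history"}

template = [
    {"role": "system", "content": "You are a helpful chatbot."},
    HISTORY_PLACEHOLDER,
    {"role": "user", "content": "{{new_question}}"},
]

def insert_history(messages, history):
    """Replace the history placeholder with the prior conversation turns."""
    out = []
    for msg in messages:
        if msg is HISTORY_PLACEHOLDER:
            out.extend(history)  # splice prior turns in at the placeholder
        else:
            out.append(msg)
    return out

history = [
    {"role": "user", "content": "Hi, who are you?"},
    {"role": "assistant", "content": "I'm a support chatbot."},
]
final = insert_history(template, history)
```

Because the placeholder's position is part of the template, prompt editors control where prior turns land (e.g., between the system message and the newest user message) without any change to application code.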
See an example of using `history` in the context of a multi-turn chatbot application here.

