Prompt Management
Overview of prompt version control, environments, and deploy process.
Introduction
Freeplay simplifies the process of iterating and testing different versions of your prompts or prompt chains.
Our system gives you a structured way to manage and deploy different versions of "prompt templates" — inclusive of the prompt text, LLM provider/model, and request parameters that together control LLM completions. You control which version is deployed to each environment for your application (e.g. dev/staging/prod). Each completion is then recorded back to Freeplay so you can track and analyze completions from a specific version, and compare different versions to each other.
This guide will walk you through managing the iteration process and controlling which versions of your prompts are active in various environments.
Benefits of Versioning and Environment Management Using Freeplay
- Flexibility: Easily switch between different versions of your Prompt Templates for different use cases or stages in your development process.
- Experimentation: Quickly experiment with prompt and model config changes against real data in an interactive, in-app editor.
- Control: Maintain control over what gets deployed where, ensuring that production environments are stable and only receive thoroughly tested changes.
- Safety: Provides a safety net, allowing you to revert to previous versions if a newly deployed change doesn't perform as expected.
By leveraging the versioning and environment features in Freeplay, you can manage your Prompt Templates with precision, ensuring that the right prompts are used in the right context at the right time.
What is a Prompt Template?
A "prompt template" in the Freeplay platform is a pre-defined structure for creating prompts that are used to interact with Large Language Models (LLMs). It serves as a blueprint for your requests and responses with the model. Here are the key components of a Prompt Template:
- Name: Start by naming your prompt template. This name should provide a quick reference to what the template is meant for. Note that it will be referenced in code, and therefore not easily changed.
- Content: The core of the prompt template is its content. This includes the actual text or instructions that you will be sending to the LLM as a prompt. Note that since it's a template, your content should include one or more Input Variables to be populated by your code.
- Model Selection: Alongside the content, you'll select the LLM that your Prompt Template is intended for. This could be from a variety of providers like OpenAI, Anthropic, Azure, etc.
- Parameters: Additionally, you can specify various request parameters that influence how the model interacts with your prompt. These might include settings like response length (tokens), temperature, or any other model-specific parameters that impact the output.
Note that prompt templates in Freeplay are automatically versioned. Each time you modify and save a prompt template, a new version is created. This allows you to experiment with changes and iterate over time while maintaining a history of past versions.
Prompt templates are a powerful feature in Freeplay, allowing you to standardize and streamline your interactions with LLMs, ensuring consistency, efficiency, and ease of iteration in your development process.
An Example to Illustrate
Let's walk through an example to better understand how a prompt template works within the Freeplay platform. In this instance, we'll create a template named "Transform News Article to a Tweet".
Name: "Transform News Article to a Tweet"
This name clearly indicates the purpose of the prompt template: converting the content of a news article into a concise tweet.
Content:
The content of this template instructs the Large Language Model (LLM) to take the text of a news article and generate a short tweet summarizing it. The key part here is the input variable {{article_text}}. When making the SDK call, the actual text of the news article will be read from our code and passed into this variable.
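Conceptually, filling an input variable is simple template substitution: each `{{name}}` placeholder is replaced with a value supplied by your code. A minimal sketch of that idea (this is an illustration, not the Freeplay SDK):

```python
import re

def render_template(template: str, variables: dict) -> str:
    """Replace each {{name}} placeholder with its value from `variables`."""
    def substitute(match):
        name = match.group(1)
        if name not in variables:
            raise KeyError(f"Missing input variable: {name}")
        return str(variables[name])
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)

# The article text is read from your code and passed in as a variable.
prompt = render_template(
    "Summarize the following news article as a single tweet:\n\n{{article_text}}",
    {"article_text": "NASA announced a new lunar mission today..."},
)
```

In practice the Freeplay SDK handles this substitution for you when you fetch and format a template; the sketch just shows what happens to the `{{article_text}}` placeholder.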

Model: GPT-4
For this task, we're choosing GPT-4, known for its advanced language understanding and generation capabilities.
Parameters:
- Temperature: Default setting (allows for a balance between randomness and predictability in the response).
- Max Tokens: Default setting (ensures the tweet generated is concise and fits within the typical character limits of a tweet).
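Putting the pieces together, the full template can be thought of as one bundle of name, content, model, and parameters. A hedged sketch of that bundle as plain data (the field names here are illustrative assumptions, not Freeplay's actual schema):

```python
# Illustrative only: field names are assumptions, not the Freeplay schema.
tweet_template = {
    "name": "Transform News Article to a Tweet",
    "content": (
        "Summarize the following news article as a tweet "
        "of at most 280 characters:\n\n{{article_text}}"
    ),
    "model": {"provider": "openai", "name": "gpt-4"},
    # Defaults shown explicitly for clarity.
    "params": {"temperature": 1.0, "max_tokens": 100},
}
```

Versioning applies to this whole bundle: changing any field (content, model, or params) and saving produces a new version.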

By setting up this Prompt Template in Freeplay, we have established a standardized way to consistently convert news articles into tweets using GPT-4. This template can be iterated upon and improved over time, with version control allowing us to track changes and revert if necessary.
Versioning & Environments
The Freeplay platform offers robust versioning and environment management capabilities to ensure that you have the utmost flexibility and control over your Prompt Templates. Here's how it works:
Prompt Template Versions
Whenever you make edits and save a prompt template within the Freeplay dashboard, the system automatically generates a new version. This applies to all modifications, whether you're altering the model, tweaking the content, adjusting parameters, or making any other changes. This versioning system ensures that each iteration is captured and preserved, allowing for precise tracking and management of your prompt templates.

Default Behavior: The 'latest' Environment
- By default, the Freeplay SDK is designed to pull the Prompt Template from the `latest` environment. This means that unless you specify otherwise, the SDK will always use the most recently saved version of your Prompt Template.
- This default behavior ensures that you're always working with the latest iterations of your prompts, making it easy to implement changes and updates quickly.
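Under the hood, you can think of each environment as a pointer to one pinned version, with `latest` always tracking the most recent save. A toy model of that lookup behavior (not the actual Freeplay SDK; names here are illustrative):

```python
# All saved versions, oldest first. v1 is the most recent save.
versions = [
    "Summarize {{article_text}} as a tweet.",        # v0
    "Summarize {{article_text}} as a witty tweet.",  # v1 (latest save)
]

# Each environment points at one pinned version.
environments = {"latest": 1, "prod": 0}

def get_prompt(environment: str = "latest") -> str:
    """Defaults to 'latest', mirroring the SDK's default behavior."""
    return versions[environments[environment]]

get_prompt()                    # the most recently saved version
get_prompt(environment="prod")  # the version pinned to prod
```

The real SDK call takes an `environment` argument in the same spirit; consult the Freeplay SDK reference for the exact method signature.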
Specifying Environments
- While the default is convenient, there may be situations where you need more control. In such cases, you can specify which environment you want to pull from in your code. This is done using the syntax `environment='<environment_name>'`.
- A common pattern is to set up environments like `dev`, `staging`, and `prod`. This allows you to have different versions of your Prompt Templates for different stages of your development and deployment process.
  - Dev Environment: Ideal for development and testing. Changes can be made and tested rapidly without affecting other environments.
  - Staging Environment: Acts as a pre-production environment. It's a final testing ground before pushing changes to production.
  - Prod Environment: The production environment. Changes pushed here are live and affect the end users.
Editing and Deploying Prompt Templates
- When you make changes to a Prompt Template, you'll encounter the "Save & Deploy" option. This action will save the updated template to the `latest` environment by default.
- However, you have the flexibility to specify other environments to which you'd like to deploy the updated template. This ensures that you can manage and deploy different versions of your prompts to different environments seamlessly.

Note that you can also access and manage your Environments from Settings > Environments.
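Deploying is effectively repointing an environment at a chosen version, which is also what makes reverting safe: old versions are never destroyed. A minimal sketch of the "Save & Deploy" semantics, assuming environments are just pointers to immutable versions (again an illustration, not the Freeplay API):

```python
# Hypothetical data, illustrating deploy-as-repointing.
versions = ["v1 prompt text", "v2 prompt text"]
environments = {"latest": 1, "prod": 1}

def save(content: str) -> int:
    """'Save & Deploy' always creates a new version and moves 'latest' to it."""
    versions.append(content)
    environments["latest"] = len(versions) - 1
    return environments["latest"]

def deploy(version: int, environment: str) -> None:
    """Pin an existing version to an additional environment."""
    environments[environment] = version

v3 = save("v3 prompt text")  # new version; 'latest' now points at it
deploy(v3, "staging")        # also deploy it to staging
deploy(1, "prod")            # prod stays on (or reverts to) an earlier version
```

Because every version is preserved, rolling back is just another deploy of a previous version.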
Now that we've set up a robust Prompt Template and learned about Version Control and Environments, let's set up some Evaluations.