Model and Key Management
Overview of model and key management in Freeplay.
Configuring Model Access
Note: Only users with the Freeplay admin role have permission to manage model access and keys.
Freeplay has first-class support for calling models from common hosts/providers including OpenAI, Anthropic, Azure OpenAI Service, Amazon Bedrock, Amazon SageMaker, Groq, Baseten and more. These providers can be directly configured in the Freeplay UI, including configuring appropriate endpoints and API keys or other relevant credentials. These models can then be used end-to-end in the Freeplay application, including in our playground UI.
At the same time, our SDKs allow you to call any model you want and record the results with Freeplay. The Freeplay application then lets you configure those models as part of your prompt templates and experiments. An SDK example of calling other models is here.
You can control which of these models and providers your team is able to use and deploy on the Models page.
Navigate to Settings > Models to configure models for your team.
- You can disable Default models (e.g. if your team doesn't have permission to use a given provider)
- You can add your own models and endpoints to default providers, like OpenAI fine-tuned models or Llama 3 on SageMaker
- You can control configurability on prompt templates for any other custom models you might have logged with Freeplay
API Keys & Credentials
Bringing your own keys to Freeplay
Freeplay allows customers to store their LLM provider keys with Freeplay so that Freeplay's application can route requests in the interactive Prompt Editor and for UI-driven Tests. LLM provider keys stored with Freeplay are only used for routing LLM requests, and will not be surfaced in the Freeplay dashboard or via the Freeplay API.
Security matters
Freeplay uses application-level encryption to encrypt customer LLM provider keys both at rest and in transit. Keys are only decrypted prior to routing requests to customer models. This level of encryption is supplementary to transparent data encryption provided by cloud providers. We follow industry best practices of encryption key management including regular key rotation and audit logging of all key access. More details on security here.
Customer key best practices
- Provide a unique API key with finely-scoped access for use by Freeplay. Freeplay's application only needs access to make requests to the inference endpoints for your provider.
  - e.g. for OpenAI, we recommend creating a separate Project with a cost limit, and creating a key with access only to the `/v1/chat/completions` endpoint.
- Use Freeplay's UI to rotate your LLM provider keys following your organization's guidelines. As soon as a key is updated in Freeplay's UI, the old value will be destroyed, and the new value will be used for future requests.
- Monitor usage of your API keys regularly. Freeplay does not set limits on use of customer keys other than those imposed by the LLM providers.
Configuring keys in the dashboard
You can set a default API key for a given provider by clicking on the provider name in the list. This default key will be used for all requests to that provider unless an endpoint-specific key is configured.

For advanced use, you can also link different keys to each endpoint for a given provider, e.g. if you want to use a different key for a fine-tuned model. Set a different key by clicking on the model name.

Configuring OpenAI Fine-Tuned Models
You can configure fine-tuned OpenAI models by navigating to Settings > Models and choosing Add fine-tuned model under the OpenAI Details heading.
Enter the name of your fine-tuned model as provided by OpenAI. You may also optionally enter a more readable display name and specify an associated API key.

Calling your Fine-Tuned Model
When creating or editing Prompt Templates you will now see fine-tuned as an option in the Model dropdown. Model Version will then populate with all your fine-tuned models.

You can call your fine-tuned model from within the prompt editor, and you can use your fine-tuned model in the Freeplay SDK just as you would any other OpenAI model, keying the model name, messages, and model parameters off of the prompt object.
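As a sketch of what "keying off of the prompt object" looks like in practice: the attribute names on `FormattedPrompt` below are illustrative placeholders, not the exact Freeplay SDK interface, and the fine-tuned model name is a made-up example.

```python
from dataclasses import dataclass, field

@dataclass
class FormattedPrompt:
    """Illustrative stand-in for a formatted prompt object returned by
    the Freeplay SDK; real attribute names may differ."""
    model: str
    messages: list
    model_parameters: dict = field(default_factory=dict)

def build_openai_request(prompt: FormattedPrompt) -> dict:
    """Map the prompt object onto the kwargs expected by
    openai.OpenAI().chat.completions.create(**kwargs)."""
    return {
        "model": prompt.model,
        "messages": prompt.messages,
        **prompt.model_parameters,
    }

prompt = FormattedPrompt(
    model="ft:gpt-4o-mini:acme::abc123",  # hypothetical fine-tuned model name
    messages=[{"role": "user", "content": "Summarize this ticket."}],
    model_parameters={"temperature": 0.2},
)
request = build_openai_request(prompt)
# Then: client.chat.completions.create(**request)
```

Because the model name lives on the prompt object, swapping the fine-tuned model in Freeplay requires no code change on the calling side.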
Configuring Azure Endpoints
You can configure Azure Endpoints by going to Settings > Models and selecting Add new endpoint under the Azure OpenAI details heading. You will be prompted to enter your Endpoint URL, Deployment ID, Model and API Key.
You can configure fine-tuned Azure OpenAI models by selecting fine-tuned in the Model drop down. This will prompt you to enter your Fine-tuned model name.

Calling your Azure Endpoint
You can now use your Azure endpoints from within the Freeplay app as you would other models. Find an example of working with an Azure endpoint in the Freeplay SDK here.
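To make the Endpoint URL and Deployment ID fields concrete: Azure OpenAI addresses requests per deployment rather than per model. A minimal sketch of how those two values combine into the request URL (the default api-version shown is an assumption; use whichever version your deployment supports):

```python
def azure_chat_url(endpoint: str, deployment_id: str,
                   api_version: str = "2024-02-01") -> str:
    """Build the chat-completions URL for an Azure OpenAI deployment.
    `endpoint` and `deployment_id` correspond to the Endpoint URL and
    Deployment ID fields in Freeplay's Azure OpenAI settings."""
    return (f"{endpoint.rstrip('/')}/openai/deployments/"
            f"{deployment_id}/chat/completions?api-version={api_version}")

# Placeholder resource and deployment names for illustration:
url = azure_chat_url("https://my-resource.openai.azure.com", "gpt-4o-prod")
```

The API key you enter is sent in the `api-key` header rather than as a Bearer token, which is why Azure endpoints are configured separately from standard OpenAI keys.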
Configuring Amazon Bedrock Models
By default, the Bedrock provider will use Freeplay's credentials to invoke models on Bedrock.
You can configure Amazon Bedrock models using IAM role assumption. This is a more secure method than sharing IAM user credentials directly. Learn more here.
Using Bedrock requires setting up an IAM role inside your AWS account, and allowing Freeplay to act on that account's behalf. Freeplay strongly recommends creating a unique role with minimal permissions for this purpose. The following configuration is required on that role:
Trust relationship (used to allow Freeplay to assume your role)
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "ARN_FROM_FREEPLAY"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "GENERATE_A_UNIQUE_STRING_AND_SET_IN_FREEPLAY_MODELS"
        }
      }
    }
  ]
}
```
Permission policy (required on your role to allow Freeplay to invoke Bedrock models)
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```
Once these permissions are configured, use the Settings > Models page to configure your role for use by Freeplay in the playground, test runs, and evals.
- The IAM role should be the fully qualified ARN of your AWS role created above.
- The External ID should be the sts:ExternalId value configured above.
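Before saving the role in Freeplay, you can sanity-check it from your own environment by assuming it yourself. A sketch using boto3's STS AssumeRole call (the ARN and external ID are placeholders for the values you configured):

```python
def assume_role_params(role_arn: str, external_id: str) -> dict:
    """Build the parameters for sts.assume_role. The ExternalId must
    match the sts:ExternalId condition in the role's trust policy,
    or the call will be denied."""
    return {
        "RoleArn": role_arn,
        "RoleSessionName": "freeplay-verification",
        "ExternalId": external_id,
    }

params = assume_role_params(
    "arn:aws:iam::123456789012:role/freeplay-bedrock",  # placeholder ARN
    "my-unique-external-id",                            # placeholder external ID
)
# With AWS credentials available:
# import boto3
# sts = boto3.client("sts")
# creds = sts.assume_role(**params)["Credentials"]
```

A denied AssumeRole here usually means the trust policy's principal or ExternalId condition doesn't match what you entered.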

Calling Bedrock Endpoints
An example of calling Anthropic models on Bedrock using the Freeplay SDK is here.
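For reference, a sketch of the shape of such a call with boto3 directly. The request body follows Bedrock's Messages API format for Anthropic models; the model ID shown is an example, and the region is an assumption.

```python
import json

def bedrock_anthropic_body(user_message: str, max_tokens: int = 512) -> str:
    """Serialize a request body for Anthropic models on Bedrock.
    Bedrock requires the anthropic_version field in the body."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": user_message}],
    })

body = bedrock_anthropic_body("Hello, Claude.")
# With the role-assumed credentials from above:
# import boto3
# runtime = boto3.client("bedrock-runtime", region_name="us-east-1")
# resp = runtime.invoke_model(
#     modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
#     body=body,
# )
```

Note that both InvokeModel and InvokeModelWithResponseStream are covered by the permission policy above, so streaming responses work with the same role.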
Looking for more setup instructions? Check out the Quick Start guide or move onto Setup & Configuration.