Persistent Storage
Configure persistence for AI conversations with OpenResponses
OpenResponses provides configurable storage mechanisms for AI responses and their associated input items. This enables you to maintain conversation history and reference previous interactions in subsequent requests.
Key Benefits
Unlike OpenAI’s managed service, our Response Store implementation offers:
- Complete Data Control: Store all your conversation data on your own infrastructure, ensuring data sovereignty and compliance with your organization’s data policies.
- Flexible Storage Options: Choose between in-memory storage for development or MongoDB for persistent storage in production environments.
- Customizable Retention: Define your own data retention policies without being constrained by OpenAI’s limits.
- Integration Flexibility: Easily integrate with your existing systems and data pipelines for analytics, auditing, or other business processes.
- Cost Management: Potentially reduce costs by optimizing storage and retrieval according to your specific usage patterns.
Storage Options
OpenResponses supports two storage implementations:
In-Memory Storage
Default option for development and testing. Fast but non-persistent.
MongoDB Storage
Production-ready solution for persistent storage of conversations.
Controlling Storage
Use the `store` parameter in your API requests to control whether responses are stored:
- When `store=true`, the response and its inputs are saved in the configured store.
- When `store=false` or omitted, no data is stored regardless of configuration.
Configuration
In-Memory Store (Default)
The in-memory store is enabled by default and requires no additional configuration. It’s suitable for development and testing environments, but data will be lost when the application restarts.
To explicitly configure the in-memory store:
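A minimal sketch of such a configuration is shown below; the property name and value are assumptions, so check your OpenResponses release for the exact keys it expects:

```properties
# Illustrative only: the exact property key and accepted values may differ
# in your OpenResponses version.
open-responses.store.type=in-memory
```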
MongoDB Store
For production environments where persistence is required, the MongoDB store provides durable storage of response data.
To enable the MongoDB store:
- Add the storage properties to your configuration (an illustrative sketch follows this list).
- Ensure you have MongoDB installed and running, or provide the connection string to your MongoDB instance.
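A sketch of what those properties might look like, assuming illustrative key names (verify the exact keys against your OpenResponses release):

```properties
# Illustrative property names; verify against your OpenResponses version.
open-responses.store.type=mongodb
# Standard MongoDB connection string for your instance or cluster.
open-responses.store.mongodb.uri=mongodb://localhost:27017
open-responses.store.mongodb.database=openresponses
```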
You can also configure the response store using environment variables:
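For example, with variable names that simply mirror the illustrative properties above (adjust them to whatever your deployment actually reads):

```env
# Illustrative variable names; verify against your OpenResponses version.
OPEN_RESPONSES_STORE_TYPE=mongodb
OPEN_RESPONSES_MONGODB_URI=mongodb://localhost:27017
OPEN_RESPONSES_MONGODB_DATABASE=openresponses
```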
Using with Docker Compose
To run OpenResponses with MongoDB for persistent storage:
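A minimal sketch of such a Compose file is shown below; the image name, application port, and environment variable names are assumptions and should be replaced with the values your actual deployment uses:

```yaml
# Illustrative docker-compose.yml; image name, port, and variable names are assumptions.
services:
  open-responses:
    image: masaicai/open-responses:latest   # assumed image name
    ports:
      - "8080:8080"                          # assumed application port
    environment:
      OPEN_RESPONSES_STORE_TYPE: mongodb
      OPEN_RESPONSES_MONGODB_URI: mongodb://mongodb:27017
      OPEN_RESPONSES_MONGODB_DATABASE: openresponses
    depends_on:
      - mongodb

  mongodb:
    image: mongo:7
    volumes:
      - mongo-data:/data/db

volumes:
  mongo-data:
```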
API Usage Examples
Storing a New Conversation
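For example, a request body like the one below, sent to the OpenAI-compatible responses endpoint (typically `POST /v1/responses`), asks OpenResponses to persist the exchange; the model name is only a placeholder:

```json
{
  "model": "gpt-4o",
  "input": "Explain how persistent storage works in OpenResponses.",
  "store": true
}
```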
Sample Response
When you make a request to store a conversation, you’ll receive a response that includes a unique ID for that response. This ID can be used to continue the conversation later:
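An abbreviated, illustrative response is shown below; the field that matters for continuing the conversation is `id`:

```json
{
  "id": "resp_abc123",
  "object": "response",
  "model": "gpt-4o",
  "status": "completed",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "content": [
        { "type": "output_text", "text": "Persistent storage in OpenResponses lets you..." }
      ]
    }
  ]
}
```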
Continuing a Stored Conversation
To reference a previously stored conversation, use the response ID:
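In the OpenAI-compatible request format this is done with the `previous_response_id` field, assuming OpenResponses follows that convention; for example:

```json
{
  "model": "gpt-4o",
  "input": "Can you expand on the MongoDB option?",
  "previous_response_id": "resp_abc123",
  "store": true
}
```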
Python SDK Example
- A usage example built on the Python OpenAI SDK can be found in the linked example; a minimal sketch is also shown below.
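The sketch uses the official `openai` Python package pointed at an OpenResponses deployment; the base URL, port, API key handling, and model name are assumptions for illustration:

```python
from openai import OpenAI

# Point the OpenAI SDK at your OpenResponses deployment (URL and port are assumptions).
client = OpenAI(base_url="http://localhost:8080/v1", api_key="your-provider-api-key")

# First turn: store=True asks OpenResponses to persist the response and its inputs.
first = client.responses.create(
    model="gpt-4o",
    input="Summarize the benefits of persistent storage.",
    store=True,
)

# Follow-up turn: reference the stored response by its ID to continue the conversation.
follow_up = client.responses.create(
    model="gpt-4o",
    input="Now list the trade-offs of the in-memory store.",
    previous_response_id=first.id,
    store=True,
)

print(follow_up.output_text)
```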
Best Practices
- Only enable storage (`store=true`) for conversations that need persistence.
- For ephemeral interactions, omit the `store` parameter or set it to `false`.
- Use MongoDB storage in production for data durability.
- Implement appropriate backup strategies for your MongoDB database.
- Consider GDPR and other data protection regulations when storing conversation data.