Querying Models
Learn how to query models using the Python SDK or REST API
Predibase offers two deployment categories:
- Private Deployments - Dedicated resources with guaranteed availability, recommended for production use
- Shared Endpoints - Pre-deployed models for quick experimentation and development
You can query these deployments using either the standard Predibase method or the OpenAI-compatible method.
Predibase Method
Python SDK
First, install the Predibase Python SDK:
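As a minimal sketch, assuming the SDK is published on PyPI under the name `predibase`:

```bash
pip install -U predibase
```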
Then initialize the client and start generating:
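The snippet below is an illustrative sketch, assuming the SDK exposes a `Predibase` client, a `deployments.client(...)` handle, and a `generate` method; the deployment name, `adapter_id` argument, and response fields are placeholders, so check the SDK reference for the exact interface.

```python
from predibase import Predibase

# Initialize the client with your Predibase API token (assumed parameter name).
pb = Predibase(api_token="<PREDIBASE_API_TOKEN>")

# Get a handle to a deployment: a shared endpoint or one of your private deployments.
client = pb.deployments.client("qwen3-8b")  # illustrative deployment name

# Generate text from the base model.
response = client.generate(
    "What is machine learning?",
    max_new_tokens=256,
)
print(response.generated_text)

# To query a fine-tuned adapter, pass its identifier (illustrative argument name).
response = client.generate(
    "What is machine learning?",
    adapter_id="my-repo/1",
    max_new_tokens=256,
)
print(response.generated_text)
```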
REST API
You can also use our REST API directly:
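Here is a rough sketch using Python's `requests`; the URL pattern, tenant placeholder, and payload fields are assumptions, so substitute the endpoint and request shape shown in your Predibase dashboard and API reference.

```python
import requests

API_TOKEN = "<PREDIBASE_API_TOKEN>"
TENANT_ID = "<PREDIBASE_TENANT_ID>"
DEPLOYMENT = "qwen3-8b"  # illustrative deployment name

# Assumed URL shape; confirm the exact path for your deployment.
url = f"https://serving.app.predibase.com/{TENANT_ID}/deployments/v2/llms/{DEPLOYMENT}/generate"

payload = {
    "inputs": "What is machine learning?",
    "parameters": {
        "max_new_tokens": 256,
        # "adapter_id": "my-repo/1",  # optionally target a fine-tuned adapter
    },
}

resp = requests.post(
    url,
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["generated_text"])
```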
OpenAI-Compatible Method
Predibase supports endpoints compatible with the OpenAI Chat Completions v1 API, making it easy to migrate from OpenAI to Predibase.
Python SDK
Use the OpenAI Python SDK with Predibase’s endpoints:
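A minimal sketch with the official OpenAI Python SDK; the `base_url` pattern below is an assumption, so use the OpenAI-compatible URL shown for your deployment.

```python
from openai import OpenAI

# Point the OpenAI client at the deployment's OpenAI-compatible base URL
# (assumed pattern) and authenticate with your Predibase API token.
client = OpenAI(
    api_key="<PREDIBASE_API_TOKEN>",
    base_url="https://serving.app.predibase.com/<TENANT_ID>/deployments/v2/llms/qwen3-8b/v1",
)

completion = client.chat.completions.create(
    model="qwen3-8b",  # base deployment; an adapter identifier may be passed here instead
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is machine learning?"},
    ],
    max_tokens=256,
)
print(completion.choices[0].message.content)
```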
REST API
Use the OpenAI-compatible REST API:
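A sketch of the same call made directly against the OpenAI-compatible `/v1/chat/completions` route, again using Python's `requests`; the base URL pattern is an assumption.

```python
import requests

API_TOKEN = "<PREDIBASE_API_TOKEN>"
# Assumed URL pattern; confirm the base URL for your deployment.
BASE_URL = "https://serving.app.predibase.com/<TENANT_ID>/deployments/v2/llms/qwen3-8b"

resp = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    json={
        "model": "qwen3-8b",
        "messages": [{"role": "user", "content": "What is machine learning?"}],
        "max_tokens": 256,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```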
Function Calling
Function calling allows models to interact with external tools and APIs in a structured way. To use function calling with Predibase deployments and/or adapters, define your functions and include them in requests made through the OpenAI-compatible Chat Completions v1 method:
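The sketch below uses the standard OpenAI tool-definition format; the weather-lookup function, deployment name, and `base_url` are hypothetical placeholders.

```python
import json
from openai import OpenAI

client = OpenAI(
    api_key="<PREDIBASE_API_TOKEN>",
    base_url="https://serving.app.predibase.com/<TENANT_ID>/deployments/v2/llms/qwen3-8b/v1",  # assumed pattern
)

# Describe the function the model may call (hypothetical weather lookup).
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name, e.g. 'Paris'"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["city"],
            },
        },
    }
]

completion = client.chat.completions.create(
    model="qwen3-8b",
    messages=[{"role": "user", "content": "What's the weather in Paris right now?"}],
    tools=tools,
    tool_choice="auto",
)

# If the model chose to call the function, its structured arguments are returned here.
tool_calls = completion.choices[0].message.tool_calls
if tool_calls:
    call = tool_calls[0]
    print(call.function.name, json.loads(call.function.arguments))
```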
Structured Output
Predibase endpoints allow you to enforce that responses contain only valid JSON and adhere to a provided schema.
The schema can be provided either as JSON Schema (REST API or Python) or as a Pydantic model (Python SDK).
Using Pydantic (Python SDK)
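One way to do this, sketched under the assumption that the generate call accepts a `response_format` parameter containing a JSON schema (derived here from a Pydantic model); the exact parameter shape may differ, so check the SDK reference.

```python
from pydantic import BaseModel
from predibase import Predibase

# Pydantic model describing the desired response shape.
class Character(BaseModel):
    name: str
    age: int
    strength: int

pb = Predibase(api_token="<PREDIBASE_API_TOKEN>")
client = pb.deployments.client("qwen3-8b")  # illustrative deployment name

# Assumed parameter shape: pass the Pydantic-derived JSON schema as the response format.
response = client.generate(
    "Generate a new RPG character.",
    response_format={
        "type": "json_object",
        "schema": Character.model_json_schema(),
    },
    max_new_tokens=256,
)

# Because the output is constrained to valid JSON matching the schema,
# it can be parsed straight back into the Pydantic model.
character = Character.model_validate_json(response.generated_text)
print(character)
```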
Using JSON Schema (REST API)
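The equivalent call against the REST API, sketched with Python's `requests`; the endpoint URL and the placement of `response_format` inside `parameters` are assumptions.

```python
import requests

API_TOKEN = "<PREDIBASE_API_TOKEN>"
# Assumed URL pattern; use the endpoint shown for your deployment.
URL = "https://serving.app.predibase.com/<TENANT_ID>/deployments/v2/llms/qwen3-8b/generate"

# Plain JSON Schema describing the desired response shape.
schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
        "strength": {"type": "integer"},
    },
    "required": ["name", "age", "strength"],
}

resp = requests.post(
    URL,
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    json={
        "inputs": "Generate a new RPG character.",
        "parameters": {
            "max_new_tokens": 256,
            "response_format": {"type": "json_object", "schema": schema},
        },
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["generated_text"])
```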
Complex Schemas
You can define more complex schemas with nested objects and arrays:
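For example, nested Pydantic models produce a JSON Schema with nested objects and arrays, which can be passed as the response format exactly as in the simpler examples above (model names here are illustrative).

```python
from typing import List
from pydantic import BaseModel

# Nested models: a party contains a list of characters, each with an inventory.
class Item(BaseModel):
    name: str
    quantity: int

class Character(BaseModel):
    name: str
    age: int
    inventory: List[Item]

class Party(BaseModel):
    party_name: str
    members: List[Character]

# The resulting schema includes nested object and array definitions.
print(Party.model_json_schema())
```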