
pb.adapters.create

Create a new adapter by starting a new fine-tuning job

Parameters:

   config: FinetuningConfig, default None
Configuration for the fine-tuning job, including the base model and training hyperparameters

   dataset: str, default None
The dataset to use for fine-tuning

   repo: str, default None
Name of the adapter repo to store the newly created adapter

   description: str, default None
Description for the adapter

Returns:

   Adapter

Examples:

Example 1: Create an adapter with defaults

# Create an adapter repository
repo = pb.repos.create(name="news-summarizer-model", description="TLDR News Summarizer Experiments", exists_ok=True)

# Start a fine-tuning job; this call blocks until training is finished
adapter = pb.adapters.create(
    config=FinetuningConfig(
        base_model="mistral-7b"
    ),
    dataset="tldr_news",
    repo=repo,
    description="initial model with defaults"
)

Example 2: Create a new adapter with customized parameters

# Create an adapter repository
repo = pb.repos.create(name="news-summarizer-model", description="TLDR News Summarizer Experiments", exists_ok=True)

# Start a fine-tuning job with custom parameters; this call blocks until training is finished
adapter = pb.adapters.create(
    config=FinetuningConfig(
        base_model="mistral-7b",
        epochs=1,  # default: 3
        rank=8,  # default: 16
        learning_rate=0.0001,  # default: 0.0002
        target_modules=["q_proj", "v_proj", "k_proj"],  # default: None (infers [q_proj, v_proj] for mistral-7b)
    ),
    dataset="tldr_news",
    repo=repo,
    description="changing epochs, rank, and learning rate"
)
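
The comments in Example 2 note the default values that apply when a parameter is omitted. As a minimal sketch of that override behavior (illustrative only, not the Predibase SDK; the `DEFAULTS` dict and `effective_config` helper are hypothetical):

```python
# Defaults noted in Example 2's comments; illustrative values, not SDK internals
DEFAULTS = {"epochs": 3, "rank": 16, "learning_rate": 0.0002}

def effective_config(**overrides):
    """Merge user-supplied overrides onto the defaults,
    the way omitted FinetuningConfig fields fall back to defaults."""
    return {**DEFAULTS, **overrides}

# Omitted fields keep their defaults; supplied fields replace them
custom = effective_config(epochs=1, rank=8)
```

Here `custom["learning_rate"]` stays at the default 0.0002 because it was not overridden.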