# Getting Started with LLMPromptKit

This guide will help you get started with LLMPromptKit, a comprehensive library for managing, versioning, testing, and evaluating LLM prompts.

## Installation

```bash
pip install llmpromptkit
```

## Basic Usage

### Initialize Components

```python
from llmpromptkit import PromptManager, VersionControl, PromptTesting, Evaluator

# Initialize with default storage location
prompt_manager = PromptManager()

# Or specify a custom storage location
# prompt_manager = PromptManager("/path/to/storage")

# Initialize other components
version_control = VersionControl(prompt_manager)
testing = PromptTesting(prompt_manager)
evaluator = Evaluator(prompt_manager)
```

### Create and Manage Prompts

```python
# Create a prompt
prompt = prompt_manager.create(
    content="Translate the following text from {source_language} to {target_language}: {text}",
    name="Translation Prompt",
    description="A prompt for translating text between languages",
    tags=["translation", "multilingual"]
)

# The prompt.id property contains a unique identifier (e.g., "a1b2c3d4e5")
prompt_id = prompt.id

# Get a prompt by ID
retrieved_prompt = prompt_manager.get(prompt_id)

# Update a prompt
prompt_manager.update(
    prompt_id,
    content="Please translate the following text from {source_language} to {target_language}:\n\n{text}"
)

# Search prompts by tags
translation_prompts = prompt_manager.search_by_tags(["translation"])

# List all prompts
all_prompts = prompt_manager.list_all()
```

### Version Control

```python
# Create a version snapshot
version_control.commit(
    prompt_id=prompt_id,
    commit_message="Initial version"
)

# Update the prompt and create another version
prompt_manager.update(
    prompt_id,
    content="Please provide a translation of the following text from {source_language} to {target_language}:\n\n{text}\n\nMaintain the original formatting and tone."
)

version_control.commit(
    prompt_id=prompt_id,
    commit_message="Added formatting instructions"
)

# List all versions
versions = version_control.list_versions(prompt_id)

# Compare versions
diff = version_control.diff(prompt_id, 1, 2)

# Revert to a previous version
version_control.checkout(prompt_id, 1)
```

### Using Prompts with Variables

```python
# Get a prompt
prompt = prompt_manager.get(prompt_id)

# Render with variables
rendered_prompt = prompt.render(
    source_language="English",
    target_language="Spanish",
    text="Hello, how are you today?"
)

# Now use rendered_prompt with your LLM API
```

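The rendered prompt is a plain string, so you can pass it to any LLM client. As a minimal sketch, here is how that might look with the OpenAI Python SDK; the client setup, model name, and message format below are illustrative assumptions rather than part of LLMPromptKit, so adapt them to whichever provider you use.

```python
# Illustrative only: assumes the openai package is installed and
# OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name; substitute your own
    messages=[{"role": "user", "content": rendered_prompt}],
)

print(response.choices[0].message.content)
```
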
## Next Steps

- See the [CLI Usage](./cli_usage.md) guide for command-line operations
- Explore [Advanced Features](./advanced_features.md) for templating and custom metrics
- Check [Integration Examples](./integration_examples.md) for real-world use cases