# Getting Started with LLMPromptKit

This guide will help you get started with LLMPromptKit, a library for creating, versioning, testing, and evaluating LLM prompts.

## Installation

```bash
pip install llmpromptkit
```
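To confirm the install worked, you can try importing the package. This is just a quick sanity check; it assumes only that the `llmpromptkit` package is importable after installation.

```python
# Quick sanity check that the package installed correctly.
import llmpromptkit  # noqa: F401

print("LLMPromptKit imported successfully")
```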

## Basic Usage

### Initialize Components

```python
from llmpromptkit import PromptManager, VersionControl, PromptTesting, Evaluator

# Initialize with default storage location
prompt_manager = PromptManager()

# Or specify a custom storage location
# prompt_manager = PromptManager("/path/to/storage")

# Initialize other components
version_control = VersionControl(prompt_manager)
testing = PromptTesting(prompt_manager)
evaluator = Evaluator(prompt_manager)
```

### Create and Manage Prompts

```python
# Create a prompt
prompt = prompt_manager.create(
    content="Translate the following text from {source_language} to {target_language}: {text}",
    name="Translation Prompt",
    description="A prompt for translating text between languages",
    tags=["translation", "multilingual"]
)

# The prompt.id property contains a unique identifier (e.g., "a1b2c3d4e5")
prompt_id = prompt.id

# Get a prompt by ID
retrieved_prompt = prompt_manager.get(prompt_id)

# Update a prompt
prompt_manager.update(
    prompt_id,
    content="Please translate the following text from {source_language} to {target_language}:\n\n{text}"
)

# Search prompts by tags
translation_prompts = prompt_manager.search_by_tags(["translation"])

# List all prompts
all_prompts = prompt_manager.list_all()
```

### Version Control

```python
# Create a version snapshot
version_control.commit(
    prompt_id=prompt_id,
    commit_message="Initial version"
)

# Update the prompt and create another version
prompt_manager.update(
    prompt_id,
    content="Please provide a translation of the following text from {source_language} to {target_language}:\n\n{text}\n\nMaintain the original formatting and tone."
)

version_control.commit(
    prompt_id=prompt_id,
    commit_message="Added formatting instructions"
)

# List all versions
versions = version_control.list_versions(prompt_id)

# Compare versions
diff = version_control.diff(prompt_id, 1, 2)

# Revert to a previous version
version_control.checkout(prompt_id, 1)
```

### Using Prompts with Variables

```python
# Get a prompt
prompt = prompt_manager.get(prompt_id)

# Render with variables
rendered_prompt = prompt.render(
    source_language="English",
    target_language="Spanish",
    text="Hello, how are you today?"
)

# Now use rendered_prompt with your LLM API
```
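The rendered string can be passed to whichever LLM client you use. The snippet below is a minimal sketch, assuming the official OpenAI Python client (`pip install openai`), an `OPENAI_API_KEY` environment variable, and a `gpt-4o-mini` model; these are illustrative choices, not part of LLMPromptKit, so substitute your own provider and model as needed.

```python
# Example only: sending the rendered prompt to an LLM via the OpenAI client.
# Assumes OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[{"role": "user", "content": rendered_prompt}],
)

print(response.choices[0].message.content)
```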

## Next Steps

- See the [CLI Usage](./cli_usage.md) guide for command-line operations
- Explore [Advanced Features](./advanced_features.md) for templating and custom metrics
- Check [Integration Examples](./integration_examples.md) for real-world use cases