---
tags:
- yoda
- chatbot
- conversational
- transformers
---

# Yoda Chatbot Model

This model is fine-tuned to respond like Yoda from Star Wars. It is based on the `microsoft/Phi-3-mini-4k-instruct` model.

## Model Description

This model is designed to generate responses in the style of Yoda. It uses a PEFT (Parameter-Efficient Fine-Tuning) approach with LoRA (Low-Rank Adaptation).
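As a rough illustration of why LoRA is parameter-efficient: for a frozen weight matrix of shape d×k, LoRA trains two low-rank factors of shapes d×r and r×k, so the trainable count drops from d·k to r·(d+k). A minimal sketch (the layer dimensions below are illustrative, not the actual Phi-3 shapes):

```python
def lora_trainable_params(d: int, k: int, r: int) -> int:
    """Trainable parameters LoRA adds for a single d x k weight at rank r."""
    return r * (d + k)

# Illustrative numbers only, not the real Phi-3 projection sizes.
full_finetune = 3072 * 3072                      # updating the whole matrix
lora = lora_trainable_params(3072, 3072, r=8)    # updating two rank-8 factors
print(f"full: {full_finetune}, LoRA: {lora}, ratio: {lora / full_finetune:.4f}")
```

At rank 8 the adapter trains well under 1% of the parameters of the matrix it adapts, which is what makes fine-tuning a model of this size practical on modest hardware.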

## Intended Use

This model is intended for entertainment purposes and to generate text in the style of Yoda. It should not be used for real-world applications where accurate or sensitive information is required.

## Limitations

- The model's responses are conditioned only on the input text and may not always be factually accurate or contextually appropriate.
- The model may produce biased or offensive content, as it is trained on data that could contain such biases.

## Example

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# If this repository contains only LoRA adapters (not merged weights),
# load the base model and attach the adapters with peft instead.
model_name = "your-username/yoda_chatbot_model"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Build the prompt; the tokenizer adds the BOS token (<s>) automatically.
prompt = (
    "You are master Yoda from Star Wars. Answer the questions and chat like him.\n"
    "How many planets are there in the Milky Way?\n"
)
input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"]

# Perform inference
outputs = model.generate(
    input_ids=input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
)

print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```
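Since the base model is an instruct-tuned Phi-3 checkpoint, the prompt can also be written with Phi-3-style role markers instead of a raw string. The helper below is a sketch of that format; it assumes the fine-tuned tokenizer keeps the base model's `<|user|>`/`<|assistant|>`/`<|end|>` markers and, because some Phi-3 releases have no separate system slot, folds the system instruction into the user turn:

```python
def build_phi3_prompt(system: str, user: str) -> str:
    """Format a single-turn prompt with Phi-3-style role markers.

    Assumption: the system instruction is prepended to the user turn,
    since some Phi-3-mini releases do not define a system role.
    """
    return f"<|user|>\n{system}\n{user}<|end|>\n<|assistant|>\n"

prompt = build_phi3_prompt(
    "You are master Yoda from Star Wars. Answer the questions and chat like him.",
    "How many planets are there in the Milky Way?",
)
print(prompt)
```

In practice, `tokenizer.apply_chat_template` is the safer choice when the tokenizer ships a chat template, since it reproduces the exact formatting the model was trained on.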