suriya7 committed (verified)
Commit 44f5e16 · 1 Parent(s): 1f76003

Update README.md

Files changed (1):
  1. README.md +32 -1
README.md CHANGED
@@ -14,4 +14,35 @@ inference:
datasets:
- databricks/databricks-dolly-15k
- VMware/open-instruct
---
## MaxMini-Instruct-248M

## Overview
MaxMini-Instruct-248M is a T5 (Text-To-Text Transfer Transformer) model fine-tuned on a variety of instruction-following tasks. It is designed to follow natural-language instructions and generate responses for a wide range of inputs.

## Model Details
- Model Name: MaxMini-Instruct-248M
- Model Type: T5 (Text-To-Text Transfer Transformer)
- Model Size: 248M parameters
- Fine-tuned on: Instructional tasks
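
As a quick sanity check on the parameter count listed above, the short sketch below loads the checkpoint and counts its weights. This is a minimal example, and it assumes the model is hosted on the Hugging Face Hub as `suriya7/MaxMini-Instruct-248M` (the owner prefix is inferred from the commit author, not stated in this README).

```python
from transformers import T5ForConditionalGeneration

# Assumed Hub repo id; adjust the namespace if the model lives elsewhere.
model = T5ForConditionalGeneration.from_pretrained("suriya7/MaxMini-Instruct-248M")

# Should print a value close to 248M.
num_params = sum(p.numel() for p in model.parameters())
print(f"Parameters: {num_params / 1e6:.0f}M")
```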

## Usage

#### Installation
You can install the required libraries via pip:
```bash
pip install transformers
pip install torch
```
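
The inference example below uses the slow `T5Tokenizer`, which relies on the SentencePiece library; if it is not already available in your environment, it can be installed the same way:

```bash
pip install sentencepiece
```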

## Inference

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "MaxMini-Instruct-248M"

# Load the tokenizer and model (prefix the repo id with the owner namespace,
# e.g. "suriya7/MaxMini-Instruct-248M", if loading from the Hugging Face Hub).
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

my_question = "what is depression?"
inputs = "Please answer to this question: " + my_question

inputs = tokenizer(inputs, return_tensors="pt")

generated_ids = model.generate(**inputs, max_new_tokens=250, do_sample=True)
decoded = tokenizer.decode(generated_ids[0], skip_special_tokens=True)
print(f"Generated Output: {decoded}")
```
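
Because `do_sample=True` draws tokens stochastically, repeated runs may return different answers for the same prompt. If deterministic output is preferred, the `generate` call above can be replaced with a greedy-decoding variant, for example:

```python
# Greedy decoding: deterministic output for a given prompt.
generated_ids = model.generate(**inputs, max_new_tokens=250, do_sample=False)
```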