---
license: mit
datasets:
- Abhishekcr448/Hinglish-Everyday-Conversations-1M
language:
- hi
- en
base_model:
- google/gemma-3-1b-it
pipeline_tag: text-generation
tags:
- hinglish
library_name: keras-hub
---

# RLM_hingu

<p align="center">
  <a href="https://www.buymeacoffee.com/rudrashah" target="_blank">
    <img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;">
  </a>
</p>

**RLM_hingu** is a fine-tuned version of the [Gemma 3 1B Instruct](https://huggingface.co/google/gemma-3-1b-it) model, adapted for casual Hinglish (Hindi-English) conversation using the `keras-nlp` framework (now part of KerasHub). It targets lightweight conversational tasks in Hinglish and runs most efficiently on the `JAX` backend.


## Model Overview

- **Base model**: `gemma3_instruct_1b`
- **Library**: [`keras-nlp`](https://github.com/keras-team/keras-nlp)
- **Backend**: JAX (recommended for best performance)
- **Sampling Method**: Top-K (k=10)
- **Use Case**: Conversational Hinglish response generation
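Since JAX is the recommended backend, one detail worth noting: Keras reads its backend from the `KERAS_BACKEND` environment variable at import time, so it must be set before the first Keras import. A minimal sketch:

```python
import os

# Keras picks its backend from this variable when it is first imported,
# so set it before any `import keras` / `import keras_nlp` statement.
os.environ["KERAS_BACKEND"] = "jax"
```

Setting the variable in the shell (`export KERAS_BACKEND=jax`) works just as well.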

## Usage

``` python
from keras_nlp.models import Gemma3CausalLM
from keras_nlp.samplers import TopKSampler

# Download the fine-tuned preset from the Hugging Face Hub.
model = Gemma3CausalLM.from_preset("hf://rudrashah/RLM_hingu")

# Use Top-K sampling (k=10), matching the model's intended decoding setup.
model.compile(sampler=TopKSampler(k=10))

# The model expects the Question/Answer template it was fine-tuned on.
template = "Question:\n{question}\n\nAnswer:\n{answer}"
prompt = template.format(
    question="Rudra acha ladka hai?",  # "Is Rudra a good boy?"
    answer="",
)

output = model.generate(prompt, max_length=256)
print(output)
```
```output
Question:
Rudra acha ladka hai?

Answer:
haan, sabse best hai.
```
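Because `generate` echoes the prompt back, downstream code usually wants just the answer text. A small sketch of hypothetical helpers (`build_prompt` and `extract_answer` are illustrative names, not part of the model's API) that wrap the template above and strip the echoed prompt:

```python
TEMPLATE = "Question:\n{question}\n\nAnswer:\n{answer}"

def build_prompt(question: str) -> str:
    """Format a Hinglish question into the Question/Answer template
    the model was fine-tuned on, leaving the answer slot empty."""
    return TEMPLATE.format(question=question, answer="")

def extract_answer(generated: str, prompt: str) -> str:
    """Drop the echoed prompt prefix so only the model's answer remains."""
    if generated.startswith(prompt):
        return generated[len(prompt):].strip()
    return generated.strip()
```

For example, `extract_answer(output, prompt)` on the sample run above would return just `"haan, sabse best hai."`.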
To run RLM_hingu, paste the snippet above into a Python session; the first call to `from_preset` downloads the weights, so the initial run may take a while.