|
--- |
|
license: mit |
|
datasets: |
|
- Abhishekcr448/Hinglish-Everyday-Conversations-1M |
|
language: |
|
- hi |
|
- en |
|
base_model: |
|
- google/gemma-3-1b-it |
|
pipeline_tag: text-generation |
|
tags: |
|
- hinglish |
|
library_name: keras-hub |
|
--- |
|
|
|
# RLM_hingu |
|
|
|
<p align="center"> |
|
<a href="https://www.buymeacoffee.com/rudrashah" target="_blank"> |
|
<img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;"> |
|
</a> |
|
</p> |
|
|
|
**RLM_hingu** is a fine-tuned version of the [Gemma 3 1B Instruct](https://huggingface.co/google/gemma-3-1b-it) model, adapted for casual Hinglish (Hindi-English) conversation with the `keras-nlp` library (now published as `keras-hub`). It targets lightweight conversational use in Hinglish, and the `JAX` backend is recommended for best performance.
|
|
|
|
|
## Model Overview |
|
|
|
- **Base model**: `gemma3_instruct_1b` |
|
- **Library**: [`keras-nlp`](https://github.com/keras-team/keras-nlp) |
|
- **Backend**: JAX (recommended for best performance; see the backend-selection sketch after this list)
|
- **Sampling Method**: Top-K (k=10) |
|
- **Use Case**: Conversational Hinglish response generation |
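
Keras 3 selects its compute backend from the `KERAS_BACKEND` environment variable, which must be set before Keras (and therefore `keras-nlp`) is imported. A minimal sketch for enabling the recommended JAX backend, assuming Keras 3 and JAX are already installed:

```python
import os

# Must be set before keras / keras_nlp are imported.
os.environ["KERAS_BACKEND"] = "jax"

from keras_nlp.models import Gemma3CausalLM  # now loads on the JAX backend
```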
|
|
|
## Usage |
|
|
|
```python
from keras_nlp.models import Gemma3CausalLM
from keras_nlp.samplers import TopKSampler

# Load the fine-tuned weights from the Hugging Face Hub.
model = Gemma3CausalLM.from_preset("hf://rudrashah/RLM_hingu")

# Decode with Top-K sampling (k=10), as listed in the overview above.
model.compile(sampler=TopKSampler(k=10))

# Question/answer prompt format used by this model card.
template = "Question:\n{question}\n\nAnswer:\n{answer}"
prompt = template.format(
    question="Rudra acha ladka hai?",
    answer="",
)

output = model.generate(prompt, max_length=256)
print(output)
```
|
```output
Question:
Rudra acha ladka hai?

Answer:
haan, sabse best hai.
```
|
To run RLM_hingu, install `keras-nlp` with the JAX backend and run the snippet above; the preset weights are downloaded from the Hugging Face Hub on first use.
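
`generate()` also accepts a list of prompts, so several Hinglish questions can be answered in a single call. The sketch below builds on the same template; the questions are only illustrative:

```python
from keras_nlp.models import Gemma3CausalLM

model = Gemma3CausalLM.from_preset("hf://rudrashah/RLM_hingu")

template = "Question:\n{question}\n\nAnswer:\n{answer}"

# Illustrative Hinglish questions; replace with your own.
questions = ["Aaj ka plan kya hai?", "Chai ya coffee?"]
prompts = [template.format(question=q, answer="") for q in questions]

# generate() returns one completion per prompt.
outputs = model.generate(prompts, max_length=256)
for text in outputs:
    print(text)
```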