# A6B Assistant Model
This is a fine-tuned version of the DeepSeek Coder model, trained to act as an AI assistant named Bala.
## Model Details
- Base Model: deepseek-ai/deepseek-coder-1.3b-base
- Fine-tuning: LoRA (Low-Rank Adaptation; see the adapter-loading sketch after this list)
- Purpose: AI assistant for the A6B company
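If this repository hosts only the LoRA adapter weights rather than fully merged weights (an assumption, since the card does not say), the adapter can be applied on top of the base model with the `peft` library. A minimal sketch under that assumption, reusing the repo id and base model named above:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model, then apply the LoRA adapter from this repo
# (only needed if the repo contains adapter files rather than merged weights)
base = AutoModelForCausalLM.from_pretrained("deepseek-ai/deepseek-coder-1.3b-base")
model = PeftModel.from_pretrained(base, "bala00712200502/A6B4B")
tokenizer = AutoTokenizer.from_pretrained("bala00712200502/A6B4B")
```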
## Quick Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and tokenizer
model = AutoModelForCausalLM.from_pretrained("bala00712200502/A6B4B")
tokenizer = AutoTokenizer.from_pretrained("bala00712200502/A6B4B")

# Format the input with the assistant prompt template and generate a reply
def get_response(user_input):
    prompt = f"<|system|>You are Bala, an AI assistant developed by Balakarthikeyan from A6B.</s><|user|>{user_input}</s><|assistant|>"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_length=512)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Example
response = get_response("What can you help me with?")
print(response)
```
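Note that decoding `outputs[0]` returns the full sequence, so the string above includes the prompt text. If only the assistant's reply is wanted, a variant like the following (a sketch reusing the same prompt template and loaded model) decodes just the newly generated tokens:

```python
# Decode only the newly generated tokens so the reply excludes the prompt
def get_reply_only(user_input):
    prompt = f"<|system|>You are Bala, an AI assistant developed by Balakarthikeyan from A6B.</s><|user|>{user_input}</s><|assistant|>"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

print(get_reply_only("What can you help me with?"))
```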
## Training
- Trained on a custom A6B assistant dataset
- Fine-tuned with LoRA for parameter-efficient training (a setup sketch follows this list)
- Optimized for helpful and accurate responses
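The exact training configuration is not published in this card. The following is a minimal sketch of how a LoRA fine-tune of the base model is typically set up with the `peft` library; every hyperparameter shown is an illustrative assumption, not a value confirmed for this model:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load the base model and wrap it with a LoRA adapter.
# All hyperparameters below are illustrative assumptions.
base = AutoModelForCausalLM.from_pretrained("deepseek-ai/deepseek-coder-1.3b-base")
lora_config = LoraConfig(
    r=16,                                  # adapter rank (assumed)
    lora_alpha=32,                         # scaling factor (assumed)
    target_modules=["q_proj", "v_proj"],   # attention projections (assumed)
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only a small fraction of weights are trainable
```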
## Creator
Developed by Balakarthikeyan at A6B