# Aya-8B

## Model Description

This is the Aya-8B model, originally packaged for Ollama and converted for compatibility with the Hugging Face Transformers library. Aya is an open-source language model known for its conversational ability and text-generation capabilities.

## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("Danna8/aya-8b")
model = AutoModelForCausalLM.from_pretrained("Danna8/aya-8b")

# Encode a prompt, generate up to 100 total tokens, and decode the result
inputs = tokenizer("Hello, how are you today?", return_tensors="pt")
outputs = model.generate(**inputs, max_length=100)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```
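
If the converted tokenizer ships a chat template, multi-turn prompts can be formatted with `apply_chat_template`. The following is a minimal sketch that assumes the template survived the Ollama conversion; verify against this checkpoint before relying on it:

```python
# Sketch of chat-style generation via the tokenizer's chat template.
# Assumes this checkpoint's tokenizer defines one (not confirmed).
messages = [{"role": "user", "content": "Hello, how are you today?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(input_ids, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```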
## Model Details
- Model Type: Transformer-based language model
- Size: 8 billion parameters
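
At 8 billion parameters, the model needs roughly 32 GB of memory in float32 (4 bytes per parameter), so it is common to load it in half precision. A hedged sketch of such a load; `device_map="auto"` additionally requires the `accelerate` package:

```python
import torch
from transformers import AutoModelForCausalLM

# ~8B params * 2 bytes (bfloat16) ≈ 16 GB, versus ~32 GB in float32
model = AutoModelForCausalLM.from_pretrained(
    "Danna8/aya-8b",
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires `accelerate`; places weights on available devices
)
```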
## Limitations and Biases
Like all language models, Aya-8B may reproduce biases present in its training data. Users should be aware of these limitations when deploying the model.
## License