```python
from transformers import AutoModel, AutoTokenizer
from peft import PeftModel

# Load the base model and tokenizer
base_model_name = "intfloat/multilingual-e5-small"
tokenizer = AutoTokenizer.from_pretrained(base_model_name)
base_model = AutoModel.from_pretrained(base_model_name)

# Load the LoRA adapter directly
adapter_repo = "IslamQA/multilingual-e5-small-finetuned"
model = PeftModel.from_pretrained(base_model, adapter_repo)
```
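Once the adapter is loaded, embeddings are produced the same way as with the underlying E5 encoder: inputs get a `query:` or `passage:` prefix, token embeddings are average-pooled over non-padding positions, and the result is L2-normalised. The prefix and pooling convention here is that of the base `intfloat/multilingual-e5-small` model; whether the fine-tune preserves it is an assumption. A minimal pooling sketch:

```python
import torch
import torch.nn.functional as F


def average_pool(last_hidden_states: torch.Tensor,
                 attention_mask: torch.Tensor) -> torch.Tensor:
    # Zero out padding positions, then average over the sequence dimension
    masked = last_hidden_states.masked_fill(
        ~attention_mask.bool().unsqueeze(-1), 0.0
    )
    return masked.sum(dim=1) / attention_mask.sum(dim=1, keepdim=True)


# With the model/tokenizer loaded above, embedding a query would look like:
# batch = tokenizer(["query: what breaks wudu?"],
#                   padding=True, truncation=True, return_tensors="pt")
# outputs = model(**batch)
# emb = F.normalize(average_pool(outputs.last_hidden_state,
#                                batch["attention_mask"]), dim=-1)
```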
Model Card for IslamQA/multilingual-e5-small-finetuned
An embedding model optimized for retrieving passages that answer questions about Islam. The passages are inherently multilingual, as they contain quotes from the Quran and Hadith. They often include preambles like "Bismillah" in various languages and follow a specific writing style.
Model Details
Model Sources
- https://islamqa.info/
- https://islamweb.net/
- https://hadithanswers.com/
- https://askimam.org/
- https://sorularlaislamiyet.com/
Uses
- embedding
- retrieval
- islam
- multilingual
- q&a
Model tree for IslamQA/multilingual-e5-small-finetuned
- Base model: intfloat/multilingual-e5-small