import transformers
import torch

model_id = "meta-llama/Meta-Llama-3.1-8B"

# Text-generation pipeline: load the weights in bfloat16 and let the
# device map place them automatically across available hardware.
pipeline = transformers.pipeline(
    "text-generation", model=model_id, model_kwargs={"torch_dtype": torch.bfloat16}, device_map="auto"
)

pipeline("Hey how are you doing today?")

