Model Card for Darong/BluePaper

An English -> Korean translation model specialized for translating academic papers.

Model Description

Darong/BlueT ๋ชจ๋ธ์„ ๊ธฐ๋ฐ˜์œผ๋กœ ๋…ผ๋ฌธ์— ๋Œ€ํ•ด ๋ฏธ์„ธ์กฐ์ •ํ•œ ๋ฒˆ์—ญ ๋ชจ๋ธ์ž…๋‹ˆ๋‹ค. ์˜์–ด->ํ•œ๊ตญ์–ด ๋ฒˆ์—ญ์„ ์ง€์›ํ•˜๋ฉฐ, ๋ฒˆ์—ญ ์‹œ ๋†’์ž„๋ง๋„ ์„ค์ •ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

  • Developed by: BlueAI
  • Model type: t5.1.1.base
  • Language(s) (NLP): Korean
  • License: MIT
  • Finetuned from model: Darong/BlueT

Uses

This model is intended for translating English text from academic papers into Korean.

How to Get Started with the Model

Use the code below to get started with the model.

from transformers import pipeline, T5TokenizerFast

# Tokenizer comes from the base pko-t5 model; the fine-tuned weights live in Darong/BluePaper.
tokenizer_name = "paust/pko-t5-base"
tokenizer = T5TokenizerFast.from_pretrained(tokenizer_name)
model_path = "Darong/BluePaper"
translator = pipeline("translation", model=model_path, tokenizer=tokenizer, max_length=255)

# English -> Korean: prepend the "E2K: " task prefix to the source sentence.
prefix = "E2K: "
source = "This model is an English-Korean translation model."
target = translator(prefix + source)
print(target[0]["translation_text"])
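
If you prefer to call the model directly instead of through pipeline(), the sketch below shows equivalent usage with generate(). It assumes the checkpoint loads as a standard T5ForConditionalGeneration, which matches the t5.1.1.base model type listed above; the generation settings are illustrative choices, not values provided by the authors.

from transformers import T5ForConditionalGeneration, T5TokenizerFast

# Same tokenizer/checkpoint pair as in the pipeline example above.
tokenizer = T5TokenizerFast.from_pretrained("paust/pko-t5-base")
model = T5ForConditionalGeneration.from_pretrained("Darong/BluePaper")

# English -> Korean: prepend the "E2K: " task prefix.
source = "E2K: This model is an English-Korean translation model."
inputs = tokenizer(source, return_tensors="pt")
outputs = model.generate(**inputs, max_length=255, num_beams=4)  # beam size is an illustrative choice
print(tokenizer.decode(outputs[0], skip_special_tokens=True))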
