
Moss-base-7b is a pretrained language model with 7 billion parameters. It can serve as a base model for downstream training such as SFT (supervised fine-tuning).
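To illustrate what SFT on a base model optimizes, here is a minimal sketch of the objective: next-token cross-entropy on a training sequence. The toy model and token IDs below are hypothetical stand-ins, not the actual Moss architecture; a real SFT run would fine-tune fnlp/moss-base-7b itself (e.g. with the Hugging Face Trainer).

```python
import torch
import torch.nn as nn

# Toy stand-in for a causal LM: embedding -> linear head (hypothetical sizes).
vocab_size, hidden = 32, 16
model = nn.Sequential(nn.Embedding(vocab_size, hidden), nn.Linear(hidden, vocab_size))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# One synthetic "instruction + response" sequence of token IDs.
seq = torch.tensor([[1, 5, 9, 3, 7, 2]])

losses = []
for _ in range(50):
    logits = model(seq[:, :-1])  # predict each next token from its prefix
    loss = loss_fn(logits.reshape(-1, vocab_size), seq[:, 1:].reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    losses.append(loss.item())

print(losses[-1] < losses[0])  # the toy loss decreases over training steps
```

The same shifted-target cross-entropy is what full-scale SFT applies to the 7B model, typically with the loss masked to the response tokens only.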

Import from Transformers

To load the Moss 7B model using Transformers, use the following code:

>>> from transformers import AutoTokenizer, AutoModelForCausalLM
>>> tokenizer = AutoTokenizer.from_pretrained("fnlp/moss-base-7b", trust_remote_code=True)
>>> model = AutoModelForCausalLM.from_pretrained("fnlp/moss-base-7b", trust_remote_code=True).cuda()
>>> model = model.eval()
>>> inputs = tokenizer(["流浪地球的导演是"], return_tensors="pt")
>>> for k, v in inputs.items():
...     inputs[k] = v.cuda()
>>> outputs = model.generate(**inputs, do_sample=True, temperature=0.8, top_p=0.8, repetition_penalty=1.1, max_new_tokens=256)
>>> response = tokenizer.decode(outputs[0][inputs.input_ids.shape[1]:], skip_special_tokens=True)
>>> print(response)
郭帆
主演分别是吴京和屈楚萧 还有李光洁刘德华等等
这电影可以说是目前国内科幻片的天花板了
票房也是突破50亿大关啦
小编真的非常期待这部电影呀
所以呢今天就给大家整理了关于影片中的很多细节图哦~
不知道大家有没有注意到呢
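Note the slicing in the decode step above: `generate` returns the prompt token IDs followed by the newly generated ones, so the example drops the first `inputs.input_ids.shape[1]` IDs to decode only the continuation. A small sketch with hypothetical token IDs:

```python
# generate() output = prompt IDs + newly generated IDs (hypothetical values).
prompt_ids = [11, 42, 7]
generated = prompt_ids + [99, 100]

# Equivalent of outputs[0][inputs.input_ids.shape[1]:] in the example above:
new_tokens = generated[len(prompt_ids):]
print(new_tokens)  # [99, 100]
```

Without this slice, `tokenizer.decode` would echo the prompt back at the start of the response.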
