PaddleNLP
Install Python 3.11:
conda create -n python311 python=3.11
Activate the environment:
conda activate python311
Install PaddlePaddle and PaddleNLP:
conda install paddlepaddle==3.0.0b0 -c paddle
pip install paddlenlp==3.0.0b0 -U -i https://pypi.tuna.tsinghua.edu.cn/simple
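To confirm the installation, a quick sanity check can be run (a minimal sketch; paddle.utils.run_check() is PaddlePaddle's built-in self-test):

import paddle
import paddlenlp

print(paddle.__version__)     # expect 3.0.0b0
print(paddlenlp.__version__)  # expect 3.0.0b0
paddle.utils.run_check()      # runs a small program to verify the install works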
On Windows, the following error may appear:
AttributeError: module 'mmap' has no attribute 'MAP_PRIVATE'
Fix: open
e:\anaconda3\envs\python311\lib\site-packages\paddlenlp\utils\safetensors.py
and change line 280 from
self.file_mmap = mmap.mmap(self.file.fileno(), 0, access=mmap.MAP_PRIVATE)
to
self.file_mmap = mmap.mmap(self.file.fileno(), 0, access=mmap.ACCESS_READ)
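The reason is that mmap.MAP_PRIVATE only exists on POSIX builds of Python's mmap module; on Windows the module exposes the ACCESS_* constants instead, and ACCESS_READ gives the read-only mapping that safetensors loading needs. A minimal cross-platform sketch of the same idea (model.safetensors is a hypothetical file name):

import mmap

with open("model.safetensors", "rb") as f:          # hypothetical weights file
    file_mmap = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)  # works on Windows and Linux
    header_bytes = file_mmap[:8]                     # read a slice without copying the whole file
    file_mmap.close()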
Error message:
RuntimeError: (NotFound) The kernel with key (CPU, Undefined(AnyLayout), float16) of kernel `multiply` is not registered. Selected wrong DataType `float16`. Paddle support following DataTypes: complex64, bool, bfloat16, complex128, float32, int32, float64, int64
Cause:
On CPU, the model supports dtype float32 or float64.
On GPU (pre-Ampere architectures), the model supports dtype float16, float32, or float64.
On GPU (Ampere and later architectures), the model supports dtype bfloat16, float16, float32, or float64.
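A hedged sketch for choosing a dtype that matches the runtime environment, assuming paddle.device.cuda.get_device_capability() reports the GPU compute capability (Ampere is 8.x and later):

import paddle
from paddlenlp.transformers import AutoModelForCausalLM

if not paddle.is_compiled_with_cuda():
    dtype = "float32"                                   # CPU: float32/float64 only
else:
    major, _ = paddle.device.cuda.get_device_capability()
    dtype = "bfloat16" if major >= 8 else "float16"     # Ampere (8.x+) adds bfloat16

model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-0.5B", dtype=dtype)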
Test code:
import os
from modelscope import snapshot_download
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"      # use the HF mirror for any Hub downloads
os.environ["TF_ENABLE_ONEDNN_OPTS"] = "0"                 # silence the oneDNN notice
from paddlenlp.transformers import AutoTokenizer, AutoModelForCausalLM

model_dir = snapshot_download("qwen/Qwen2-0.5B")          # pre-download the weights from ModelScope
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2-0.5B", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-0.5B", dtype="float32")  # float32 so CPU works

input_features = tokenizer("你好!请自我介绍一下。", return_tensors="pd")
outputs = model.generate(**input_features, max_length=128)
text = tokenizer.batch_decode(outputs[0])
print(text)
# ['我是一个AI语言模型,我可以回答各种问题,包括但不限于:天气、新闻、历史、文化、科学、教育、娱乐等。请问您有什么需要了解的吗?']