GPT-2 and Text Generation
This article collects GPT-2 and Chinese language model resources: free text-generation tools (and their sensitive-content filters), dialogue generation, a Twitter generator inspired by influencers, and Tsinghua's 130-billion-parameter GLM-130B model, with links to demos and repositories for further exploration.
Free GPT text generation: 彩云小梦 (Caiyun Xiaomeng) and its overseas edition
Xiaomeng runs Chinese text through a detector for politically sensitive material, so sensitive content cannot be fed into it.
Dialogue generation
https://huggingface.co/thu-coai/CDial-GPT_LCCC-large/tree/main
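A minimal usage sketch, assuming the HuggingFace transformers library; per the model card, CDial-GPT checkpoints are GPT-based and pair with a BERT-style tokenizer. The prompt and decoding parameters are illustrative, and the full CDial-GPT input format also interleaves speaker tokens, which this sketch omits (see the repo for details).

```python
# Minimal dialogue-generation sketch for CDial-GPT (assumes transformers
# is installed; decoding parameters are illustrative, not from the repo).
from transformers import BertTokenizer, OpenAIGPTLMHeadModel

name = "thu-coai/CDial-GPT_LCCC-large"
tokenizer = BertTokenizer.from_pretrained(name)
model = OpenAIGPTLMHeadModel.from_pretrained(name)

# Encode one dialogue turn; the real format interleaves speaker tokens.
input_ids = tokenizer.encode("你好，最近在忙什么？", return_tensors="pt")
output = model.generate(
    input_ids,
    max_length=64,
    do_sample=True,   # sample for varied replies
    top_p=0.9,        # nucleus sampling
    pad_token_id=tokenizer.pad_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```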
Twitter generator inspired by influencers:
https://github.com/gdemos01/TwitterInfluencerAI
Chinese language models
Tsinghua's 130B-parameter model, GLM-130B
Demo: https://huggingface.co/spaces/THUDM/GLM-130B
Model repository: https://github.com/THUDM/GLM-130B
Chinese GPT-2 training code (a generation sketch follows this list): https://github.com/Morizeyao/GPT2-Chinese
https://zhuanlan.zhihu.com/p/352028922
https://github.com/TsinghuaAI/CPM-1-Finetune
https://github.com/TsinghuaAI/CPM-1-Generate
https://github.com/TsinghuaAI/CPM-2-Pretrain
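A minimal generation sketch for a Chinese GPT-2 with transformers. The UER checkpoint named below is an assumption for illustration, not part of the GPT2-Chinese repo itself; note that Chinese GPT-2 checkpoints pair with a BERT-style tokenizer rather than the English GPT-2 BPE tokenizer.

```python
# Chinese GPT-2 generation via transformers; the UER checkpoint name is
# an assumption for illustration.
from transformers import BertTokenizer, GPT2LMHeadModel, TextGenerationPipeline

name = "uer/gpt2-chinese-cluecorpussmall"
tokenizer = BertTokenizer.from_pretrained(name)
model = GPT2LMHeadModel.from_pretrained(name)

generator = TextGenerationPipeline(model, tokenizer)
# Continue a Chinese prompt; do_sample=True gives varied outputs.
print(generator("这是很久之前的事情了", max_length=100, do_sample=True))
```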
GPT-2/CPM tutorial (a fine-tuning sketch follows the link):
https://www.cnblogs.com/wwj99/p/12503545.html
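As a companion to the tutorial, a minimal causal-LM fine-tuning sketch using transformers' Trainer. The checkpoint, the corpus path train.txt, and the hyperparameters are all assumptions for demonstration, not taken from the tutorial; TextDataset is the simple legacy helper, with the datasets library being the current route.

```python
# Minimal causal-LM fine-tuning sketch with transformers' Trainer; the
# checkpoint, corpus path "train.txt", and hyperparameters are all
# assumptions for demonstration.
from transformers import (
    BertTokenizer, GPT2LMHeadModel,
    TextDataset, DataCollatorForLanguageModeling,
    Trainer, TrainingArguments,
)

name = "uer/gpt2-chinese-cluecorpussmall"  # assumed Chinese GPT-2 checkpoint
tokenizer = BertTokenizer.from_pretrained(name)
model = GPT2LMHeadModel.from_pretrained(name)

# Plain-text corpus; TextDataset chunks it into fixed-length LM examples.
train_dataset = TextDataset(tokenizer=tokenizer, file_path="train.txt", block_size=128)
# mlm=False -> causal language modeling (labels are the shifted inputs).
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gpt2-chinese-finetuned",
        num_train_epochs=1,
        per_device_train_batch_size=4,
    ),
    data_collator=collator,
    train_dataset=train_dataset,
)
trainer.train()
```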