This tutorial provides an overview of different natural language processing (NLP) techniques for tasks such as named entity recognition, text classification, and text generation. It covers a range of methods, including BERT-NER, ALBERT-NER, GPT2-generation, BiLSTM+Attention, TextCNN, and TextGCN. These techniques are applicable to both Chinese and English, making this a valuable resource for developers working with diverse language datasets.

GAN for NLP text generation

GAN Journey:

https://github.com/nutllwhy/gan-journey

NLPGNN:

https://github.com/kyzhouhzau/NLPGNN

Examples (See tests for more details):

BERT-NER (Chinese and English Version)

BERT-CRF-NER (Chinese and English Version)
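BERT-CRF-NER adds a CRF layer on top of BERT's per-token tag scores, and at inference the best tag sequence is recovered with Viterbi decoding over emission and transition scores. A minimal sketch of that decoding step, with toy scores standing in for BERT's outputs (not the repo's actual API):

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Find the highest-scoring tag sequence.

    emissions:   (seq_len, num_tags) per-token tag scores (e.g. from BERT)
    transitions: (num_tags, num_tags) score of moving from tag i to tag j
    """
    seq_len, num_tags = emissions.shape
    score = emissions[0].copy()                    # best score ending in each tag
    backptr = np.zeros((seq_len, num_tags), dtype=int)
    for t in range(1, seq_len):
        # score of every (prev_tag, cur_tag) pair at step t
        cand = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # follow back-pointers from the best final tag
    best = [int(score.argmax())]
    for t in range(seq_len - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]

# Toy example with 3 tags: 0=O, 1=B-PER, 2=I-PER
emissions = np.array([[0.1, 2.0, 0.0],
                      [0.2, 0.1, 1.5],
                      [1.0, 0.3, 0.2]])
transitions = np.array([[0.5, 0.1, -2.0],   # O -> {O, B, I}: O->I penalized
                        [0.0, -1.0, 1.0],   # B -> I encouraged
                        [0.2, -1.0, 0.5]])  # I -> I allowed
print(viterbi_decode(emissions, transitions))  # [1, 2, 0] i.e. B-PER I-PER O
```

The transition matrix is what plain BERT-NER lacks: it lets the model reject invalid sequences such as O followed directly by I-PER.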

BERT-CLS (Chinese and English Version)

ALBERT-NER (Chinese and English Version)

ALBERT-CLS (Chinese and English Version)

GPT2-generation (English Version)
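GPT2-generation decodes text autoregressively: feed the tokens generated so far, take the model's next-token distribution, append a token, and repeat. A toy greedy-decoding loop that shows the control flow (the bigram table stands in for GPT-2's logits; all names here are illustrative, not the repo's API):

```python
def greedy_generate(next_logits, prompt, max_new_tokens=5, eos="<eos>"):
    """Autoregressive greedy decoding over a toy next-token model.

    next_logits(tokens) returns a {token: score} dict for the next token,
    standing in for a forward pass through GPT-2.
    """
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        scores = next_logits(tokens)
        best = max(scores, key=scores.get)  # greedy: always take the argmax
        if best == eos:
            break
        tokens.append(best)
    return tokens

# Toy "model": a bigram table conditioned only on the last token.
BIGRAMS = {
    "the": {"cat": 2.0, "dog": 1.0},
    "cat": {"sat": 1.5, "<eos>": 0.5},
    "sat": {"<eos>": 3.0, "down": 1.0},
}
toy_model = lambda toks: BIGRAMS.get(toks[-1], {"<eos>": 1.0})
print(greedy_generate(toy_model, ["the"]))  # ['the', 'cat', 'sat']
```

Real GPT-2 decoding usually replaces the argmax with sampling (top-k or nucleus) to avoid repetitive output, but the loop structure is the same.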

BiLSTM+Attention (Chinese and English Version)
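BiLSTM+Attention classifies a sentence by running a bidirectional LSTM over the tokens and then attention-pooling the per-token hidden states into a single sentence vector. The pooling step can be sketched in NumPy (random vectors stand in for the BiLSTM outputs; shapes are illustrative):

```python
import numpy as np

def attention_pool(hidden, query):
    """Collapse (seq_len, dim) hidden states into one (dim,) sentence vector.

    hidden: BiLSTM outputs, one vector per token
    query:  learned attention vector of shape (dim,)
    """
    scores = hidden @ query                  # one relevance scalar per token
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # softmax over tokens
    return weights @ hidden                  # weighted sum of hidden states

rng = np.random.default_rng(0)
hidden = rng.normal(size=(6, 8))   # 6 tokens, dim-8 states (stand-in values)
query = rng.normal(size=8)
sentence_vec = attention_pool(hidden, query)
print(sentence_vec.shape)          # (8,) -- fed to the classifier layer
```

Compared with taking only the final LSTM state, attention pooling lets every token contribute, weighted by its learned relevance to the classification decision.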

TextCNN (Chinese and English Version)
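TextCNN slides convolution filters of several widths over the word-embedding matrix and max-pools each feature map, so each filter fires on the strongest n-gram match anywhere in the sentence. One filter's convolve-and-pool step, sketched in NumPy (random embeddings stand in for real ones):

```python
import numpy as np

def conv_maxpool(embeddings, filt):
    """Apply one TextCNN filter and max-pool over time.

    embeddings: (seq_len, emb_dim) word vectors
    filt:       (width, emb_dim) convolution filter
    """
    width = filt.shape[0]
    seq_len = embeddings.shape[0]
    # one activation per n-gram window of size `width`
    feats = [np.sum(embeddings[i:i + width] * filt)
             for i in range(seq_len - width + 1)]
    return max(feats)  # max-over-time pooling

rng = np.random.default_rng(1)
emb = rng.normal(size=(10, 4))                          # 10 tokens, dim-4
filters = [rng.normal(size=(w, 4)) for w in (2, 3, 4)]  # bi/tri/4-gram filters
features = [conv_maxpool(emb, f) for f in filters]
print(len(features))  # one pooled feature per filter, concatenated downstream
```

In the full model the pooled features from all filters are concatenated and fed to a softmax classifier; mixing filter widths is what lets TextCNN capture n-grams of different lengths at once.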

GCN, GAN, GIN, GraphSAGE (based on message passing)
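These graph layers share the message-passing pattern: each node aggregates its neighbours' feature vectors and passes the result through a learned linear map; the variants differ in how they weight and combine the messages. A single-layer sketch of the GCN variant, with its symmetric normalization, in NumPy (toy graph and random weights, not the library's API):

```python
import numpy as np

def gcn_layer(adj, features, weight):
    """One GCN message-passing step: ReLU(A_hat @ X @ W).

    adj:      (n, n) adjacency matrix without self-loops
    features: (n, in_dim) node features X
    weight:   (in_dim, out_dim) learned weights W
    """
    a = adj + np.eye(adj.shape[0])         # add self-loops
    deg = a.sum(axis=1)
    d_inv_sqrt = np.diag(deg ** -0.5)
    a_hat = d_inv_sqrt @ a @ d_inv_sqrt    # symmetric normalization D^-1/2 A D^-1/2
    return np.maximum(a_hat @ features @ weight, 0.0)  # ReLU

# Toy 3-node path graph: 0 - 1 - 2
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
rng = np.random.default_rng(2)
x = rng.normal(size=(3, 4))   # dim-4 input features per node
w = rng.normal(size=(4, 2))   # project to dim-2
out = gcn_layer(adj, x, w)
print(out.shape)              # (3, 2): new features for each node
```

GraphSAGE instead samples and aggregates a fixed number of neighbours, and GIN sums messages before an MLP, but all fit this aggregate-then-transform template.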

TextGCN and TextSAGE for text classification
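TextGCN casts text classification as node classification on a single heterogeneous graph: document nodes and word nodes, with document-word edges weighted by TF-IDF and word-word edges weighted by co-occurrence PMI. The document-word part of the adjacency can be sketched as follows (toy corpus; the weighting shown is a plain TF-IDF, illustrative rather than the paper's exact formula):

```python
import math
from collections import Counter

def doc_word_edges(docs):
    """TF-IDF-weighted edges between document nodes and word nodes."""
    n_docs = len(docs)
    df = Counter()                        # document frequency per word
    for doc in docs:
        df.update(set(doc))
    edges = {}
    for d, doc in enumerate(docs):
        tf = Counter(doc)
        for word, count in tf.items():
            idf = math.log(n_docs / df[word])
            edges[(d, word)] = (count / len(doc)) * idf  # tf * idf
    return edges

docs = [["graph", "text", "text"], ["graph", "net"], ["text", "net"]]
edges = doc_word_edges(docs)
# "graph" appears in 2 of 3 docs, so its edges carry idf = log(3/2)
print(round(edges[(0, "graph")], 3))  # 0.135
```

Running a GCN (or GraphSAGE, for TextSAGE) over this graph lets label information flow from labelled documents through shared words to unlabelled documents.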
