Vaccine Tweets Report
Created on October 22 · Last edited on October 22
Language Model Pre-Training
BERT-large-uncased fine-tuned on the following numbers of vaccine-related tweets:
- 500K
- 2M (de-duplicated from 4.7M)
- 3.6M (de-duplicated from 10.9M)
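The counts above distinguish raw from de-duplicated tweet totals (e.g. 2M kept out of 4.7M). The report does not say how de-duplication was performed; the sketch below is one plausible normalize-and-hash approach, where the normalization rules (lowercasing, stripping URLs and @-mentions) are assumptions chosen to catch near-identical retweets.

```python
import hashlib
import re

def normalize(tweet: str) -> str:
    """Lowercase, strip URLs and @-mentions, collapse whitespace, so
    near-identical retweets hash to the same key. (The exact rules used
    for the report's de-duplication are an assumption.)"""
    t = tweet.lower()
    t = re.sub(r"https?://\S+", "", t)   # drop URLs
    t = re.sub(r"@\w+", "", t)           # drop mentions
    t = re.sub(r"\s+", " ", t).strip()   # collapse whitespace
    return t

def dedupe(tweets):
    """Keep the first occurrence of each normalized tweet."""
    seen, kept = set(), []
    for tw in tweets:
        key = hashlib.sha1(normalize(tw).encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            kept.append(tw)
    return kept

tweets = [
    "Get vaccinated! https://t.co/abc",
    "get vaccinated!  https://t.co/xyz",   # same text, different URL
    "@who Get vaccinated!",                # same text, different mention
    "Vaccines save lives.",
]
print(dedupe(tweets))  # only the first and last tweets survive
```

At the scale above (tens of millions of tweets), the same hash-set idea works streaming over the corpus, since only the digests need to stay in memory.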
RoBERTa
- 20M (de-duplicated from 64M full dataset)
CT-BERT
- 20M
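Continued pre-training on in-domain tweets like this is typically done with the masked-language-modelling (MLM) objective. The report gives no training details, so the sketch below only illustrates the standard BERT 80/10/10 masking recipe plus a single hypothetical update step; the checkpoint name, learning rate, and all other hyperparameters are assumptions.

```python
import random

MASK_PROB = 0.15  # standard BERT masking rate

def mask_for_mlm(token_ids, mask_id, vocab_size, rng):
    """BERT-style masking: select ~15% of positions; of those, 80% become
    [MASK], 10% become a random token, 10% stay unchanged.
    Returns (masked_ids, labels); labels are -100 (ignored by the loss)
    at unselected positions."""
    masked, labels = [], []
    for tid in token_ids:
        if rng.random() < MASK_PROB:
            labels.append(tid)            # predict the original token here
            r = rng.random()
            if r < 0.8:
                masked.append(mask_id)
            elif r < 0.9:
                masked.append(rng.randrange(vocab_size))
            else:
                masked.append(tid)
        else:
            labels.append(-100)
            masked.append(tid)
    return masked, labels

def continued_pretraining_step(text, ckpt="bert-large-uncased"):
    """One hypothetical MLM update on a single tweet; ckpt and lr are
    placeholders, not values from the report."""
    import torch
    from transformers import AutoModelForMaskedLM, AutoTokenizer
    tok = AutoTokenizer.from_pretrained(ckpt)
    model = AutoModelForMaskedLM.from_pretrained(ckpt)
    opt = torch.optim.AdamW(model.parameters(), lr=2e-5)  # assumed lr
    ids = tok(text)["input_ids"]
    masked, labels = mask_for_mlm(ids, tok.mask_token_id, tok.vocab_size,
                                  random.Random(0))
    loss = model(input_ids=torch.tensor([masked]),
                 labels=torch.tensor([labels])).loss
    loss.backward()
    opt.step()
    return float(loss)
```

`continued_pretraining_step` downloads the checkpoint on first use, so it is defined here but not invoked.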
[Panel grid: Run set 2 (5 runs)]
Vaccine Sentiment Dataset
The fine-tuned models are used for classification on the Vaccine Sentiment dataset.
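A typical way to use the pre-trained checkpoints above for this task is to attach a classification head and fine-tune with cross-entropy. The sketch below assumes a three-way sentiment label set and the `bert-large-uncased` checkpoint name as placeholders; the report specifies neither.

```python
LABELS = ["negative", "neutral", "positive"]   # assumed label set
LABEL2ID = {name: i for i, name in enumerate(LABELS)}

def encode_label(name: str) -> int:
    """Map a sentiment label name to the integer id the classifier head uses."""
    return LABEL2ID[name]

def finetune_step(texts, label_names, ckpt="bert-large-uncased", lr=2e-5):
    """One supervised step: tokenize a batch, compute cross-entropy against
    the labels, and update the weights. Returns the batch loss.
    ckpt and lr are placeholders, not values from the report."""
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer
    tok = AutoTokenizer.from_pretrained(ckpt)
    model = AutoModelForSequenceClassification.from_pretrained(
        ckpt, num_labels=len(LABELS))
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    batch = tok(list(texts), padding=True, truncation=True,
                return_tensors="pt")
    labels = torch.tensor([encode_label(n) for n in label_names])
    loss = model(**batch, labels=labels).loss
    loss.backward()
    opt.step()
    return float(loss)
```

`finetune_step` downloads the checkpoint on first use, so it is defined here but not invoked; a real run would loop it over shuffled mini-batches of the dataset.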
[Panel grid: Run set 2 (25 runs)]
UN Vaccine Dataset
The fine-tuned models are used for classification on the UN Vaccine dataset.
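At inference time, a fine-tuned classifier scores each UN-dataset tweet. The report does not name its saved checkpoints, so the sketch below takes the checkpoint path as a parameter and shows logits converted to class probabilities with a softmax.

```python
import math

def softmax(logits):
    """Convert raw classifier logits to probabilities (max-subtracted
    for numerical stability)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def classify(texts, ckpt):
    """Score a batch of tweets with a fine-tuned checkpoint; returns one
    probability vector per tweet. ckpt is a path to a saved model,
    not a name from the report."""
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer
    tok = AutoTokenizer.from_pretrained(ckpt)
    model = AutoModelForSequenceClassification.from_pretrained(ckpt).eval()
    batch = tok(list(texts), padding=True, truncation=True,
                return_tensors="pt")
    with torch.no_grad():
        logits = model(**batch).logits
    return [softmax(row.tolist()) for row in logits]
```

`classify` needs a trained checkpoint on disk, so it is defined here but not invoked.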
[Panel grid: Run set (23 runs)]