Individual Submission Summary

PolBERT: A Suite of Pre-Trained Large Language Models for Political Science Research

Sat, April 15, 4:45 to 6:15pm CDT, TBA

Brief Overview

We introduce PolBERT, a set of transformer models pre-trained on political text corpora, and show how domain-specific pre-training improves classification performance.

Authors