Yi Yang | FinBERT



I maintain the FinBERT project with Prof. Allen Huang. FinBERT is a financial domain-specific language model based on Google's BERT and pre-trained on 4.9 billion tokens of financial text. The goal of the project is to advance financial NLP research and practice.

Impact: We have presented the FinBERT project to practitioners and regulators, including the Hong Kong Monetary Authority (HKMA), the Society of Quantitative Analysts (SQA), AllianceBernstein, and J.P. Morgan, as well as at academic conferences in finance and accounting.

Media coverage: We have also developed FinSent, a browser-based financial sentiment analysis dashboard for U.S. and Hong Kong firms built on FinBERT. FinSent's development is sponsored by HKUST CBSA, and the dashboard has received media coverage from Hong Kong Economic Times, Wen Wei Po, Headline Daily, Mingpao, BastillePost, and Singtao.

Publication: Huang, Allen H., Hui Wang, and Yi Yang. “FinBERT: A Large Language Model for Extracting Information from Financial Text.” Contemporary Accounting Research (2022).

FinBERT comprises a family of models for financial NLP tasks. Code samples can be found in the FinBERT GitHub repository.
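
For illustration, below is a minimal sketch of running financial sentiment classification with a FinBERT checkpoint through the Hugging Face transformers library. The model identifier yiyanghkust/finbert-tone, the three-class label set, and the example sentence are assumptions for the sketch, not details confirmed on this page; see the FinBERT GitHub repository for the official usage examples.

    # Minimal sketch: financial sentiment classification with a FinBERT checkpoint.
    # Assumes the transformers library is installed and that a FinBERT model is
    # published on the Hugging Face Hub under the identifier below (illustrative).
    from transformers import BertTokenizer, BertForSequenceClassification, pipeline

    model_name = "yiyanghkust/finbert-tone"  # assumed Hub identifier
    tokenizer = BertTokenizer.from_pretrained(model_name)
    # Assumes a three-class head (e.g., positive / negative / neutral tone).
    model = BertForSequenceClassification.from_pretrained(model_name, num_labels=3)

    # Build a text-classification pipeline and score a sample financial sentence.
    classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
    print(classifier("Revenue growth was strong and liquidity remains ample."))

The pipeline returns a label and confidence score for each input sentence, which is the same kind of sentence-level tone signal that a dashboard such as FinSent aggregates across firms.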