The Skipped Beat: A Study of Sociopragmatic Understanding in LLMs for 64 Languages
Chiyu Zhang, Khai Duy Doan, Qisheng Liao, Muhammad Abdul-Mageed
The University of British Columbia, Mohamed bin Zayed University of Artificial Intelligence
Published at the Main Conference of EMNLP 2023
Checkpoints of Models Pre-Trained with InfoDCL
We further pretrain XLM-R and RoBERTa with the InfoDCL framework of Zhang et al. (2023). A minimal loading sketch follows the checkpoint list below.
Multilingual Model:
- InfoDCL-XLMR trained on the multilingual TweetEmoji-multi dataset: https://huggingface.co/UBC-NLP/InfoDCL-Emoji-XLMR-Base
English Models:
- InfoDCL-RoBERTa trained on TweetEmoji-EN: https://huggingface.co/UBC-NLP/InfoDCL-emoji
- InfoDCL-RoBERTa trained on TweetHashtag-EN: https://huggingface.co/UBC-NLP/InfoDCL-hashtag
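These checkpoints are further-pretrained encoders intended for downstream fine-tuning on sociopragmatic tasks. Below is a minimal sketch of loading one with the Hugging Face `transformers` library; it assumes the checkpoint is stored as a standard XLM-R-base encoder (swap the model ID for any checkpoint listed above).

```python
# Minimal loading sketch (assumption: the checkpoint is a standard
# encoder in Hugging Face format; the example input is illustrative).
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "UBC-NLP/InfoDCL-Emoji-XLMR-Base"  # any checkpoint listed above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a tweet-like input and take the first-token representation,
# a common starting point before task-specific fine-tuning.
inputs = tokenizer("Feeling great today! 😄", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
cls_embedding = outputs.last_hidden_state[:, 0]  # shape: (1, hidden_size)
print(cls_embedding.shape)
```

For a downstream classifier, the same model ID can be passed to `AutoModelForSequenceClassification.from_pretrained` with the number of labels for your task.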
Citation
Please cite us if you find our data or models useful.
@inproceedings{zhang-etal-2023-skipped,
    title = "The Skipped Beat: A Study of Sociopragmatic Understanding in LLMs for 64 Languages",
    author = "Zhang, Chiyu and
      Doan, Khai Duy and
      Liao, Qisheng and
      Abdul-Mageed, Muhammad",
    booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
    year = "2023",
    publisher = "Association for Computational Linguistics",
}