Introduction
#1 by dannoncaffeine · opened
GPT2-124M-wikitext-v0.1
This is a practical, hands-on result that helped me build a foundation with 🤗 Transformers and 🤗 Datasets. I fine-tuned GPT-2 (124M) on WikiText-103 (wikitext-103-raw-v1) using a T4 GPU. It's an interesting starting point.
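For anyone curious, below is a minimal sketch of this kind of run using the standard 🤗 Transformers / 🤗 Datasets causal-LM recipe. The hyperparameters (block size, batch size, epochs) are illustrative assumptions, not necessarily what was used here.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Load the raw WikiText-103 corpus and the 124M GPT-2 checkpoint.
dataset = load_dataset("wikitext", "wikitext-103-raw-v1")
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

block_size = 512  # assumption: a length that fits comfortably in T4 memory

def tokenize(examples):
    return tokenizer(examples["text"])

def group_texts(examples):
    # Concatenate all tokens, then split into fixed-length blocks
    # so every training example is a full context window.
    concatenated = {k: sum(examples[k], []) for k in examples}
    total_len = (len(concatenated["input_ids"]) // block_size) * block_size
    return {
        k: [v[i : i + block_size] for i in range(0, total_len, block_size)]
        for k, v in concatenated.items()
    }

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])
lm_dataset = tokenized.map(group_texts, batched=True)

args = TrainingArguments(
    output_dir="gpt2-124m-wikitext",   # hypothetical output path
    per_device_train_batch_size=4,     # small batch + accumulation for a T4
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    fp16=True,                         # mixed precision helps on a T4
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=lm_dataset["train"],
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The `group_texts` step is the usual trick for causal-LM fine-tuning: rather than padding short WikiText lines, it packs the whole corpus into contiguous blocks so no compute is wasted on padding tokens.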