transformers

Notes

  • Installation
  • Quickstart
  • Glossary
  • Pretrained models
  • Usage
  • Model upload and sharing
  • Examples
  • The Big Table of Tasks
  • Transformers Notebooks
  • Loading Google AI or OpenAI pre-trained weights or PyTorch dump
  • Serialization best-practices
  • Converting TensorFlow Checkpoints
  • Migrating from previous packages
  • BERTology
  • TorchScript
  • Multi-lingual models
  • Benchmarks

Main classes

  • Configuration
  • Models
  • Tokenizer
  • Pipelines
  • Optimizer
  • Schedules
  • Gradient Strategies
  • Processors

Package Reference

  • AutoModels
  • Encoder Decoder Models
  • BERT
  • OpenAI GPT
  • Transformer XL
  • OpenAI GPT2
  • XLM
  • XLNet
  • RoBERTa
  • DistilBERT
  • CTRL
  • CamemBERT
  • ALBERT
  • XLM-RoBERTa
  • FlauBERT
  • Bart
  • T5
  • ELECTRA
  • DialoGPT
  • Reformer
  • MarianMT
  • Longformer

© Copyright 2020, huggingface
