This can be done by directly feeding the sentence to the tokenizer, which leverages the Rust implementation of 🤗 Tokenizers for peak performance.

```python
inputs = tokenizer(sequence)
```

The tokenizer returns a dictionary with all the arguments necessary for its corresponding model to work properly.
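For illustration, here is a minimal, self-contained sketch of what that dictionary looks like. The checkpoint name and the example sentence are assumptions for demonstration only; the exact keys returned depend on the model.

```python
from transformers import AutoTokenizer

# Hypothetical checkpoint chosen for the example; any supported checkpoint works.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

# Example sentence (an assumption, not taken from the original text).
sequence = "Hello, how are you?"

# Calling the tokenizer directly returns a dict-like object.
inputs = tokenizer(sequence)

# Typically contains input_ids, and for BERT-like models also
# token_type_ids and attention_mask.
print(inputs.keys())
print(inputs["input_ids"])
```

By default, `AutoTokenizer.from_pretrained` loads the fast, Rust-backed tokenizer when one is available for the checkpoint.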