---
library_name: transformers.js
base_model:
- cardiffnlp/tweet-topic-21-multi
---
# tweet-topic-21-multi (ONNX)
This is an ONNX version of [cardiffnlp/tweet-topic-21-multi](https://huggingface.co/cardiffnlp/tweet-topic-21-multi). It was automatically converted and uploaded using [this space](https://huggingface.co/spaces/onnx-community/convert-to-onnx).
## Usage (Transformers.js)
If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@huggingface/transformers) using:
```bash
npm i @huggingface/transformers
```
**Example:** Tweet topic classification.
```js
import { pipeline } from '@huggingface/transformers';

// Create a text-classification pipeline backed by this ONNX model
const classifier = await pipeline('text-classification', 'onnx-community/tweet-topic-21-multi-ONNX');

// Classify a piece of text and print the predicted topic label and score
const output = await classifier('I love transformers!');
console.log(output);
```
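Transformers.js v3 also lets you choose a specific quantization variant of the ONNX weights via the `dtype` option, and the text-classification pipeline accepts a `top_k` option to return more than the single best label. The snippet below is a minimal sketch assuming an 8-bit (`q8`) variant has been uploaded for this repository; the example tweet is purely illustrative.

```js
import { pipeline } from '@huggingface/transformers';

// Load an explicit quantization variant (assumes a q8 ONNX file exists in this repository)
const classifier = await pipeline('text-classification', 'onnx-community/tweet-topic-21-multi-ONNX', {
  dtype: 'q8',
});

// Request the 5 highest-scoring topics instead of only the top one
const output = await classifier('Kicked off the weekend with a 10k run and a football match!', {
  top_k: 5,
});
console.log(output);
```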