Update README.md
README.md CHANGED

````diff
@@ -148,7 +148,7 @@ model = AutoModelForSequenceClassification.from_pretrained(model_name)
 model.eval()
 
 # 🧪 Example input
-text = "Turn
+text = "Turn on the fan"
 
 # ✂️ Tokenize the input
 inputs = tokenizer(text, return_tensors="pt")
@@ -169,8 +169,8 @@ print(f"Predicted intent: {labels[pred]} (Confidence: {probs[0][pred]:.4f})")
 
 **Output**:
 ```plaintext
-Text: Turn
-Predicted intent:
+Text: Turn on the fan
+Predicted intent: ON (Confidence: 0.7824)
 ```
 
 *Note*: Fine-tune the model for specific classification tasks to improve accuracy.
````
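The confidence figure printed by the patched snippet comes from a softmax over the model's logits followed by an argmax. A minimal, self-contained sketch of just that step, using dummy logits in place of a real model output (the `labels` list here is a hypothetical example label map, not taken from the diff; in practice read it from the model config's `id2label`):

```python
import math

# Dummy logits standing in for model(**inputs).logits[0];
# real values come from AutoModelForSequenceClassification.
logits = [-1.2, 1.3]
labels = ["OFF", "ON"]  # hypothetical label map for illustration

# Softmax: exponentiate, then normalize so probabilities sum to 1
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# Argmax picks the highest-probability class as the predicted intent
pred = max(range(len(probs)), key=probs.__getitem__)
print(f"Predicted intent: {labels[pred]} (Confidence: {probs[pred]:.4f})")
```

The same arithmetic underlies the `probs[0][pred]:.4f` readout in the README's torch version, where `torch.softmax` and `argmax` operate on the batched logits tensor.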