πŸ€– Intent Detection using Fine-Tuned BERT

This project fine-tunes BERT (bert-base-uncased), an encoder-only transformer, for intent classification: it detects a user's intent from a text input (e.g., a chatbot query) and maps it to one of a set of predefined categories spanning domains such as banking, travel, and finance.

The model is trained on the CLINC150 (clinc_oos) dataset and evaluated using accuracy as the primary metric.
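
The sketch below shows one way such a fine-tuning and accuracy evaluation could be set up with the Hugging Face `Trainer`; the `clinc_oos` config name ("plus", which includes the out-of-scope examples) and the hyperparameters are illustrative assumptions, not the exact training recipe behind this checkpoint.

```python
# Minimal fine-tuning sketch (assumed setup, not the exact training recipe).
import numpy as np
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset("clinc_oos", "plus")  # CLINC150, incl. out-of-scope examples
num_labels = dataset["train"].features["intent"].num_classes

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=num_labels)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

encoded = dataset.map(tokenize, batched=True).rename_column("intent", "labels")

def compute_metrics(eval_pred):
    # Accuracy: fraction of utterances whose predicted intent matches the label.
    logits, labels = eval_pred
    return {"accuracy": (np.argmax(logits, axis=-1) == labels).mean()}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-intention-classifier",
                           num_train_epochs=3,          # assumed value
                           per_device_train_batch_size=32),
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,               # enables dynamic padding via the default collator
    compute_metrics=compute_metrics,
)
trainer.train()
print(trainer.evaluate())              # reports eval accuracy
```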


πŸ“Š Dataset: CLINC150

The project uses the CLINC150 dataset, a benchmark dataset for intent classification in task-oriented dialogue systems.


🧾 Dataset Overview

  • Total intents: 150 unique user intents
  • Domains: 10 real-world domains (e.g., banking, travel, weather, small talk)
  • Examples: ~22,500 utterances
  • Language: English
  • Out-of-scope (OOS): Includes OOS examples to test robustness

πŸ“¦ Source

The dataset is available on the Hugging Face Hub under the `clinc_oos` identifier.
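
A quick way to peek at the splits and label set; the "plus" config (the variant that includes the OOS examples) is assumed here:

```python
# Inspect CLINC150 from the Hub; the "plus" config includes the OOS examples.
from datasets import load_dataset

dataset = load_dataset("clinc_oos", "plus")
print(dataset)                                    # train / validation / test splits

intent_names = dataset["train"].features["intent"].names
print(len(intent_names))                          # 150 in-scope intents + "oos"

example = dataset["train"][0]
print(example["text"], "->", intent_names[example["intent"]])
```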

πŸš€ Example

Request: "I want to book a flight"

Response: "book_flight"
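
A minimal inference sketch using the transformers pipeline API is shown below; it assumes the published checkpoint's id2label config maps class indices to the CLINC150 intent names.

```python
# Classify an utterance with the fine-tuned checkpoint (sketch; assumes the
# model's id2label config maps class indices to CLINC150 intent names).
from transformers import pipeline

classifier = pipeline("text-classification",
                      model="SaherMuhamed/bert-intention-classifier")

print(classifier("I want to book a flight"))
# -> [{'label': 'book_flight', 'score': ...}]
```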

