---
license: mit
language:
- en
library_name: transformers
---
# Model Card for danswer/intent-model

<!-- Provide a quick summary of what the model is/does. -->

This model classifies user intent for the Danswer project; see https://github.com/danswer-ai/danswer.

## Model Details
A multiclass classifier built on top of distilbert-base-uncased.

### Model Description

<!-- Provide a longer summary of what this model is. -->
Classifies the intent of user queries into categories including:

- 0: Keyword Search
- 1: Semantic Search
- 2: Direct Question Answering
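
For illustration, a minimal sketch of the label-id mapping with hypothetical example queries for each class (the example queries are assumptions for illustration only, not taken from the training data):

```python
# Hypothetical examples only: illustrative queries per intent class.
# The id-to-label mapping follows the class list above.
INTENT_LABELS = {
    0: "Keyword Search",             # e.g. "danswer docker compose file"
    1: "Semantic Search",            # e.g. "documents about onboarding new engineers"
    2: "Direct Question Answering",  # e.g. "How do I configure SSO in Danswer?"
}
```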


- **Developed by:** [DanswerAI]
- **License:** [MIT]
- **Finetuned from model [optional]:** [distilbert-base-uncased]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [https://github.com/danswer-ai/danswer]
- **Demo [optional]:** [Upcoming!]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

This model is intended to be used in the Danswer Question-Answering System.
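
As a rough illustration of how the predicted intent could drive retrieval behavior in a QA pipeline (a sketch only; the function and the routing strings below are hypothetical and not part of the Danswer codebase):

```python
# Hypothetical routing sketch: dispatch a query to a retrieval strategy
# based on the predicted intent class (0, 1, or 2).
def route_query(query: str, predicted_class: int) -> str:
    if predicted_class == 0:
        return f"keyword search for: {query}"
    elif predicted_class == 1:
        return f"semantic (vector) search for: {query}"
    else:
        return f"retrieve passages and answer directly: {query}"
```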


## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

This model was trained on a very small dataset maintained by DanswerAI. If interested, reach out to [email protected].

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

This model is intended to be used within the Danswer QA system; its behavior outside that context has not been evaluated.

## How to Get Started with the Model

```python
from transformers import AutoTokenizer, TFDistilBertForSequenceClassification
import tensorflow as tf

model = TFDistilBertForSequenceClassification.from_pretrained("danswer/intent-model")
tokenizer = AutoTokenizer.from_pretrained("danswer/intent-model")

class_semantic_mapping = {
    0: "Keyword Search",
    1: "Semantic Search",
    2: "Question Answer",
}

# Get user input
user_query = "How do I set up Danswer to run on my local environment?"

# Encode the user input
inputs = tokenizer(user_query, return_tensors="tf", truncation=True, padding=True)

# Get model predictions (logits over the three intent classes)
logits = model(inputs).logits

# Get predicted class
predicted_class = int(tf.math.argmax(logits, axis=-1)[0])

print(f"Predicted class: {class_semantic_mapping[predicted_class]}")
```
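
Batching works the same way; a minimal sketch, assuming the same `model`, `tokenizer`, and `class_semantic_mapping` as above (the example queries are placeholders):

```python
# Classify several queries in one forward pass.
queries = [
    "danswer slack bot setup",
    "How do I add a new connector?",
]
batch = tokenizer(queries, return_tensors="tf", truncation=True, padding=True)
logits = model(batch).logits
predicted_classes = tf.math.argmax(logits, axis=-1).numpy()

for query, cls in zip(queries, predicted_classes):
    print(f"{query!r} -> {class_semantic_mapping[int(cls)]}")
```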