---
license: mit
language:
- en
library_name: transformers
---

# Model Card for danswer/intent-model

This model classifies user intent for the Danswer project: https://github.com/danswer-ai/danswer.

## Model Details

Multiclass classifier on top of distilbert-base-uncased.

### Model Description

Classifies the intent of user queries into the following categories:

- 0: Keyword Search
- 1: Semantic Search
- 2: Direct Question Answering

- **Developed by:** DanswerAI
- **License:** MIT
- **Finetuned from model:** distilbert-base-uncased

### Model Sources

- **Repository:** https://github.com/danswer-ai/danswer
- **Demo:** Upcoming!

## Uses

This model is intended to be used in the Danswer question-answering system.

## Bias, Risks, and Limitations

This model was finetuned on a very small dataset maintained by DanswerAI. If interested, reach out to danswer.dev@gmail.com.

### Recommendations

This model is intended to be used in the Danswer QA system.

## How to Get Started with the Model

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFDistilBertForSequenceClassification

model = TFDistilBertForSequenceClassification.from_pretrained("danswer/intent-model")
tokenizer = AutoTokenizer.from_pretrained("danswer/intent-model")

class_semantic_mapping = {
    0: "Keyword Search",
    1: "Semantic Search",
    2: "Question Answer"
}

# Get user input
user_query = "How do I set up Danswer to run on my local environment?"

# Encode the user input
inputs = tokenizer(user_query, return_tensors="tf", truncation=True, padding=True)

# Get model predictions (logits over the three intent classes)
predictions = model(inputs)[0]

# Get predicted class index (argmax over the logits)
predicted_class = tf.math.argmax(predictions, axis=-1)[0]

print(f"Predicted class: {class_semantic_mapping[int(predicted_class)]}")
```
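
If you prefer PyTorch, the same inference flow looks roughly like the sketch below. This is an assumption-laden sketch, not part of the official usage: it presumes the `danswer/intent-model` checkpoint can be loaded into the PyTorch `DistilBertForSequenceClassification` class (if the repository only ships TensorFlow weights, pass `from_tf=True` to `from_pretrained`).

```python
# Hedged PyTorch sketch of the same intent classification flow.
# Assumption: the checkpoint loads into the PyTorch class; if only
# TensorFlow weights are published, add from_tf=True below.
import torch
from transformers import AutoTokenizer, DistilBertForSequenceClassification

model = DistilBertForSequenceClassification.from_pretrained("danswer/intent-model")
tokenizer = AutoTokenizer.from_pretrained("danswer/intent-model")

class_semantic_mapping = {
    0: "Keyword Search",
    1: "Semantic Search",
    2: "Question Answer"
}

user_query = "How do I set up Danswer to run on my local environment?"
inputs = tokenizer(user_query, return_tensors="pt", truncation=True, padding=True)

# Run inference without tracking gradients
with torch.no_grad():
    logits = model(**inputs).logits

# Argmax over the three intent classes
predicted_class = int(torch.argmax(logits, dim=-1))
print(f"Predicted class: {class_semantic_mapping[predicted_class]}")
```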