# MARTINI_enrich_BERTopic_taylorhudak
This is a BERTopic model. BERTopic is a flexible and modular topic modeling framework that allows for the generation of easily interpretable topics from large datasets.
## Usage
To use this model, please install BERTopic:

```
pip install -U bertopic
```
You can use the model as follows:

```python
from bertopic import BERTopic

topic_model = BERTopic.load("AIDA-UPM/MARTINI_enrich_BERTopic_taylorhudak")

topic_model.get_topic_info()
```
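Once the model is loaded, the standard BERTopic inspection methods are available. A minimal sketch using `get_topic_info` and `get_topic`; the topic ID passed to `get_topic` is just an example and corresponds to the "Topic ID" column in the table below:

```python
from bertopic import BERTopic

# Load the fitted topic model from the Hugging Face Hub
topic_model = BERTopic.load("AIDA-UPM/MARTINI_enrich_BERTopic_taylorhudak")

# DataFrame with one row per topic: ID, size, and auto-generated label
info = topic_model.get_topic_info()
print(info.head())

# Top keywords and their c-TF-IDF weights for a single topic, e.g. topic 0
print(topic_model.get_topic(0))
```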
## Topic overview
- Number of topics: 8
- Number of training documents: 564
| Topic ID | Topic Keywords | Topic Frequency | Label |
|---|---|---|---|
| -1 | vaccine - pandemic - deaths - lies - 2021 | 29 | -1_vaccine_pandemic_deaths_lies |
| 0 | wikileaks - extradited - julian - supreme - belmarsh | 168 | 0_wikileaks_extradited_julian_supreme |
| 1 | fauci - symposium - orsolya - catherine - cardiologist | 106 | 1_fauci_symposium_orsolya_catherine |
| 2 | vaccine - modrna - myocarditis - autopsies - autoimmune | 67 | 2_vaccine_modrna_myocarditis_autopsies |
| 3 | amtsgericht - bhakdi - conviction - holocaust - vaccination | 56 | 3_amtsgericht_bhakdi_conviction_holocaust |
| 4 | fed - monetary - coup - ecb - catherine | 51 | 4_fed_monetary_coup_ecb |
| 5 | mariupol - lugansk - zelensky - vladimir - propaganda | 49 | 5_mariupol_lugansk_zelensky_vladimir |
| 6 | eu - disinformation - censor - regulatory - headlines | 38 | 6_eu_disinformation_censor_regulatory |
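To assign one of these topics to new text, the loaded model's `transform` method can be used. This is a minimal sketch: it assumes the saved pipeline bundles its embedding model (otherwise one can be supplied via the `embedding_model` argument of `BERTopic.load`), and the example document is purely hypothetical:

```python
from bertopic import BERTopic

# If the embedding model is not bundled with the saved pipeline, pass one
# explicitly, e.g. BERTopic.load(..., embedding_model="all-MiniLM-L6-v2")
topic_model = BERTopic.load("AIDA-UPM/MARTINI_enrich_BERTopic_taylorhudak")

docs = ["Hypothetical new document about an extradition hearing."]
topics, probs = topic_model.transform(docs)

# Topic IDs correspond to the "Topic ID" column in the table above;
# -1 denotes the outlier topic.
print(topics)
```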
## Training hyperparameters
- calculate_probabilities: True
- language: None
- low_memory: False
- min_topic_size: 10
- n_gram_range: (1, 1)
- nr_topics: None
- seed_topic_list: None
- top_n_words: 10
- verbose: False
- zeroshot_min_similarity: 0.7
- zeroshot_topic_list: None
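These hyperparameters correspond one-to-one to `BERTopic` constructor arguments. Below is a minimal sketch of how a comparable model could be re-instantiated and trained on a corpus of your own; the embedding model and the 20 Newsgroups corpus are illustrative assumptions and are not stated on this card:

```python
from bertopic import BERTopic
from sklearn.datasets import fetch_20newsgroups

# Re-instantiate a model with the hyperparameters listed above.
# `language: None` suggests an explicit embedding model was used during
# training; which one is not stated here, so the choice below is a placeholder.
topic_model = BERTopic(
    embedding_model="sentence-transformers/all-MiniLM-L6-v2",  # assumption, not from the card
    calculate_probabilities=True,
    language=None,
    low_memory=False,
    min_topic_size=10,
    n_gram_range=(1, 1),
    nr_topics=None,
    seed_topic_list=None,
    top_n_words=10,
    verbose=False,
    zeroshot_min_similarity=0.7,
    zeroshot_topic_list=None,
)

# Illustrative stand-in corpus; replace with your own documents.
docs = fetch_20newsgroups(subset="train", remove=("headers", "footers", "quotes"))["data"]
topics, probs = topic_model.fit_transform(docs)
```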
## Framework versions
- Numpy: 1.26.4
- HDBSCAN: 0.8.40
- UMAP: 0.5.7
- Pandas: 2.2.3
- Scikit-Learn: 1.5.2
- Sentence-transformers: 3.3.1
- Transformers: 4.46.3
- Numba: 0.60.0
- Plotly: 5.24.1
- Python: 3.10.12
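If you want to mirror this environment, the versions above can be pinned at install time. A sketch, assuming the standard PyPI distributions (e.g. `umap-learn` for UMAP); the Python version refers to the interpreter itself and is not installed via pip:

```
pip install bertopic numpy==1.26.4 hdbscan==0.8.40 umap-learn==0.5.7 pandas==2.2.3 \
    scikit-learn==1.5.2 sentence-transformers==3.3.1 transformers==4.46.3 \
    numba==0.60.0 plotly==5.24.1
```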