jbeno committed
Commit eef716b · Parent: d45d6f8

Renamed files and classes, updated README instructions to use pip
README.md CHANGED
@@ -10,15 +10,8 @@ tags:
 
 # Electra Base Classifier for Sentiment Analysis
 
-This model is a fine-tuned [ELECTRA base model](https://huggingface.co/google/electra-base-discriminator) for sentiment analysis, enhanced with custom pooling and a SwishGLU activation function. It classifies text into three sentiment categories: negative, neutral, and positive.
 
-## Model Architecture
-
-- **Base Model**: ELECTRA base (`google/electra-base-discriminator`)
-- **Custom Components**:
-  - **Pooling Layer**: Custom pooling layer supporting 'cls', 'mean', and 'max' pooling types.
-  - **Activation Function**: Custom SwishGLU activation function.
-  - **Classifier**: Custom classifier with configurable hidden dimensions, number of layers, and dropout rate.
 
 ## Labels
 
@@ -30,34 +23,148 @@ The model predicts the following labels:
 
 ## How to Use
 
-To use this model, you need to download the `electra_base_classifier_sentiment.py` file from this repository and place it in your working directory. This file contains the custom classes required to load and use the model.
 
 ```python
 import torch
 from transformers import AutoTokenizer
-from electra_base_classifier_sentiment import ElectraBaseClassifierSentiment  # Ensure this file is in your working directory
 
 model_name = "jbeno/electra-base-classifier-sentiment"
 tokenizer = AutoTokenizer.from_pretrained(model_name)
-model = ElectraBaseClassifierSentiment.from_pretrained(model_name)
 model.eval()
 
-# Example usage
-text = "I love this product!"
 inputs = tokenizer(text, return_tensors="pt")
 with torch.no_grad():
     logits = model(**inputs)
 predicted_class_id = torch.argmax(logits, dim=1).item()
-predicted_label = model.config.id2label[str(predicted_class_id)]
 print(f"Predicted label: {predicted_label}")
 ```
 
 ## Requirements
 - Python 3.7+
 - PyTorch
-- Transformers library
-- Additional Files:
-  - Download **electra_base_classifier_sentiment.py** from this repository and place it in your working directory.
 
 ## Custom Model Components
 
@@ -148,35 +255,6 @@ The model's configuration (config.json) includes custom parameters:
 - `dropout_rate`: Dropout rate used in the classifier.
 - `pooling`: Pooling strategy used ('mean').
 
-## Training Details
-
-### Dataset
-
-The model was trained on the [Sentiment Merged](https://huggingface.co/datasets/jbeno/sentiment_merged) dataset, which is a mix of Stanford Sentiment Treebank (SST-3), DynaSent Round 1, and DynaSent Round 2.
-
-### Code
-
-The code used to train the model can be found on GitHub: [jbeno/sentiment](https://github.com/jbeno/sentiment)
-
-### Research Paper
-
-The research paper can be found here: [ELECTRA and GPT-4o: Cost-Effective Partners for Sentiment Analysis](https://github.com/jbeno/sentiment/research_paper.pdf)
-
-### Performance
-
-- **Merged Dataset**
-  - Macro Average F1: **79.29**
-  - Accuracy: **79.69**
-- **DynaSent R1**
-  - Macro Average F1: **82.10**
-  - Accuracy: **82.14**
-- **DynaSent R2**
-  - Macro Average F1: **71.83**
-  - Accuracy: **71.94**
-- **SST-3**
-  - Macro Average F1: **69.95**
-  - Accuracy: **78.24**
-
 ## License
 
 This model is licensed under the MIT License.
 
 
 # Electra Base Classifier for Sentiment Analysis
 
+This is an [ELECTRA base discriminator](https://huggingface.co/google/electra-base-discriminator) fine-tuned for sentiment analysis of reviews. It has a mean pooling layer and a classifier head (2 layers of 1024 dimension) with SwishGLU activation and dropout (0.3). It classifies text into three sentiment categories: 'negative' (0), 'neutral' (1), and 'positive' (2). It was fine-tuned on the [Sentiment Merged](https://huggingface.co/datasets/jbeno/sentiment_merged) dataset, which is a merge of Stanford Sentiment Treebank (SST-3) and DynaSent Rounds 1 and 2.
 
 ## Labels
 
 
 
 ## How to Use
 
+### Install package
+
+This model requires the classes in `electra_classifier.py`. You can download the file, or you can install the package from PyPI.
+
+```bash
+pip install electra-classifier
+```
 
+### Load classes and model
 ```python
+# Install the package in a notebook
+!pip install electra-classifier
+
+# Import libraries
 import torch
 from transformers import AutoTokenizer
+from electra_classifier import ElectraClassifier
 
+# Load tokenizer and model
 model_name = "jbeno/electra-base-classifier-sentiment"
 tokenizer = AutoTokenizer.from_pretrained(model_name)
+model = ElectraClassifier.from_pretrained(model_name)
+
+# Set model to evaluation mode
 model.eval()
 
+# Run inference
+text = "I love this restaurant!"
 inputs = tokenizer(text, return_tensors="pt")
+
 with torch.no_grad():
     logits = model(**inputs)
 predicted_class_id = torch.argmax(logits, dim=1).item()
+predicted_label = model.config.id2label[predicted_class_id]
 print(f"Predicted label: {predicted_label}")
 ```
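
An editor's aside (not part of the commit): `model.config.id2label` maps the predicted class id to the label names documented in the introduction above, i.e.:

```python
# Label mapping for this model, as stated in the introduction above
id2label = {0: "negative", 1: "neutral", 2: "positive"}
```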
 
 ## Requirements
 - Python 3.7+
 - PyTorch
+- Transformers
+- [electra-classifier](https://pypi.org/project/electra-classifier/) - Install with pip, or download electra_classifier.py
+
+## Training Details
+
+### Dataset
+
+The model was trained on the [Sentiment Merged](https://huggingface.co/datasets/jbeno/sentiment_merged) dataset, which is a mix of Stanford Sentiment Treebank (SST-3), DynaSent Round 1, and DynaSent Round 2.
+
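
Not part of the commit, but a minimal sketch of how one might peek at this dataset with the Hugging Face `datasets` library (split and column names are assumptions; check the dataset card):

```python
from datasets import load_dataset

# Load the merged sentiment dataset from the Hugging Face Hub
dataset = load_dataset("jbeno/sentiment_merged")

# Inspect available splits and columns (names may differ; see the dataset card)
print(dataset)
```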
+### Code
+
+The code used to train the model can be found on GitHub:
+- [jbeno/sentiment](https://github.com/jbeno/sentiment)
+- [jbeno/electra-classifier](https://github.com/jbeno/electra-classifier)
+
+### Research Paper
+
+The research paper can be found here: [ELECTRA and GPT-4o: Cost-Effective Partners for Sentiment Analysis](https://github.com/jbeno/sentiment/research_paper.pdf)
+
+### Performance
+
+- **Merged Dataset**
+  - Macro Average F1: **79.29**
+  - Accuracy: **79.69**
+- **DynaSent R1**
+  - Macro Average F1: **82.10**
+  - Accuracy: **82.14**
+- **DynaSent R2**
+  - Macro Average F1: **71.83**
+  - Accuracy: **71.94**
+- **SST-3**
+  - Macro Average F1: **69.95**
+  - Accuracy: **78.24**
+
+## Model Architecture
+
+- **Base Model**: ELECTRA base discriminator (`google/electra-base-discriminator`)
+- **Pooling Layer**: Custom pooling layer supporting 'cls', 'mean', and 'max' pooling types.
+- **Classifier**: Custom classifier with configurable hidden dimensions, number of layers, and dropout rate.
+- **Activation Function**: Custom SwishGLU activation function.
+
+```
+ElectraClassifier(
+  (electra): ElectraModel(
+    (embeddings): ElectraEmbeddings(
+      (word_embeddings): Embedding(30522, 768, padding_idx=0)
+      (position_embeddings): Embedding(512, 768)
+      (token_type_embeddings): Embedding(2, 768)
+      (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+      (dropout): Dropout(p=0.1, inplace=False)
+    )
+    (encoder): ElectraEncoder(
+      (layer): ModuleList(
+        (0-11): 12 x ElectraLayer(
+          (attention): ElectraAttention(
+            (self): ElectraSelfAttention(
+              (query): Linear(in_features=768, out_features=768, bias=True)
+              (key): Linear(in_features=768, out_features=768, bias=True)
+              (value): Linear(in_features=768, out_features=768, bias=True)
+              (dropout): Dropout(p=0.1, inplace=False)
+            )
+            (output): ElectraSelfOutput(
+              (dense): Linear(in_features=768, out_features=768, bias=True)
+              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+              (dropout): Dropout(p=0.1, inplace=False)
+            )
+          )
+          (intermediate): ElectraIntermediate(
+            (dense): Linear(in_features=768, out_features=3072, bias=True)
+            (intermediate_act_fn): GELUActivation()
+          )
+          (output): ElectraOutput(
+            (dense): Linear(in_features=3072, out_features=768, bias=True)
+            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+            (dropout): Dropout(p=0.1, inplace=False)
+          )
+        )
+      )
+    )
+  )
+  (pooling): PoolingLayer()
+  (classifier): Classifier(
+    (layers): Sequential(
+      (0): Linear(in_features=768, out_features=1024, bias=True)
+      (1): SwishGLU(
+        (projection): Linear(in_features=1024, out_features=2048, bias=True)
+        (activation): SiLU()
+      )
+      (2): Dropout(p=0.3, inplace=False)
+      (3): Linear(in_features=1024, out_features=1024, bias=True)
+      (4): SwishGLU(
+        (projection): Linear(in_features=1024, out_features=2048, bias=True)
+        (activation): SiLU()
+      )
+      (5): Dropout(p=0.3, inplace=False)
+      (6): Linear(in_features=1024, out_features=3, bias=True)
+    )
+  )
+)
+```
+
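The block above is simply the PyTorch module repr; with the model loaded as in the usage example earlier, the same dump can be reproduced with:

```python
# Print the module hierarchy of the loaded model
print(model)
```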
 
 ## Custom Model Components
 
 - `dropout_rate`: Dropout rate used in the classifier.
 - `pooling`: Pooling strategy used ('mean').
 
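Not part of the commit: the custom fields listed above ride along on the standard config object, so they can be read back after loading (a sketch; attribute names follow the config.json keys documented here, and the expected values come from this model card):

```python
from transformers import AutoConfig

# Custom keys from config.json are exposed as attributes on the loaded config
config = AutoConfig.from_pretrained("jbeno/electra-base-classifier-sentiment")
print(config.pooling)       # 'mean' per the model card
print(config.dropout_rate)  # 0.3 per the model card
```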
 ## License
 
 This model is licensed under the MIT License.
__init__.py CHANGED
@@ -1 +1 @@
-from .electra_base_classifier_sentiment import ElectraBaseClassifierSentiment
+from .electra_classifier import ElectraClassifier
config.json CHANGED
@@ -1,6 +1,6 @@
 {
   "architectures": [
-    "ElectraBaseClassifierSentiment"
+    "ElectraClassifier"
   ],
   "attention_probs_dropout_prob": 0.1,
   "classifier_dropout": null,
electra_base_classifier_sentiment.py → electra_classifier.py RENAMED
@@ -1,23 +1,21 @@
 import torch
-import torch.nn as nn
 from transformers import ElectraPreTrainedModel, ElectraModel
 
 class SwishGLU(nn.Module):
     def __init__(self, input_dim: int, output_dim: int):
         super(SwishGLU, self).__init__()
-        # Linear projection to 2 * output_dim to split for gate and projection
         self.projection = nn.Linear(input_dim, 2 * output_dim)
-        self.activation = nn.SiLU()  # Swish activation
 
     def forward(self, x):
-        # Apply linear projection
         x_proj_gate = self.projection(x)
-        # Split the projection into two parts
         projected, gate = x_proj_gate.tensor_split(2, dim=-1)
-        # Apply Swish activation and multiply
         return projected * self.activation(gate)
 
-
 class PoolingLayer(nn.Module):
     def __init__(self, pooling_type='cls'):
         super().__init__()
@@ -35,6 +33,8 @@ class PoolingLayer(nn.Module):
         else:
             raise ValueError(f"Unknown pooling method: {self.pooling_type}")
 
 class Classifier(nn.Module):
     def __init__(self, input_dim, hidden_dim, hidden_activation, num_layers, n_classes, dropout_rate=0.0):
         super().__init__()
@@ -56,7 +56,9 @@ class Classifier(nn.Module):
     def forward(self, x):
         return self.layers(x)
 
-class ElectraBaseClassifierSentiment(ElectraPreTrainedModel):
     def __init__(self, config):
         super().__init__(config)
         self.electra = ElectraModel(config)
@@ -87,7 +89,6 @@ class ElectraBaseClassifierSentiment(ElectraPreTrainedModel):
         )
         self.init_weights()
 
-
     def forward(self, input_ids=None, attention_mask=None, **kwargs):
         outputs = self.electra(input_ids, attention_mask=attention_mask, **kwargs)
         pooled_output = self.pooling(outputs.last_hidden_state, attention_mask)
 
 import torch
+from torch import nn
 from transformers import ElectraPreTrainedModel, ElectraModel
 
+# Custom activation function
 class SwishGLU(nn.Module):
     def __init__(self, input_dim: int, output_dim: int):
         super(SwishGLU, self).__init__()
         self.projection = nn.Linear(input_dim, 2 * output_dim)
+        self.activation = nn.SiLU()
 
     def forward(self, x):
         x_proj_gate = self.projection(x)
         projected, gate = x_proj_gate.tensor_split(2, dim=-1)
         return projected * self.activation(gate)
 
+
+# Custom pooling layer
 class PoolingLayer(nn.Module):
     def __init__(self, pooling_type='cls'):
         super().__init__()

         else:
             raise ValueError(f"Unknown pooling method: {self.pooling_type}")
 
+
+# Custom classifier
 class Classifier(nn.Module):
     def __init__(self, input_dim, hidden_dim, hidden_activation, num_layers, n_classes, dropout_rate=0.0):
         super().__init__()

     def forward(self, x):
         return self.layers(x)
 
+
+# Custom Electra classifier model
+class ElectraClassifier(ElectraPreTrainedModel):
     def __init__(self, config):
         super().__init__(config)
         self.electra = ElectraModel(config)

         )
         self.init_weights()
 
     def forward(self, input_ids=None, attention_mask=None, **kwargs):
         outputs = self.electra(input_ids, attention_mask=attention_mask, **kwargs)
         pooled_output = self.pooling(outputs.last_hidden_state, attention_mask)
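
Editor's note (not part of the commit): a quick sanity check of the renamed module, assuming `electra_classifier.py` is on the Python path (for example, downloaded into the working directory as the README suggests). It exercises the SwishGLU block shown above, which projects to `2 * output_dim`, splits the result into a projection and a gate, and returns `projected * SiLU(gate)`:

```python
import torch
from electra_classifier import SwishGLU

# 768-dim input, 1024-dim output: the internal projection is 768 -> 2048,
# split into two 1024-dim halves (projected, gate)
layer = SwishGLU(input_dim=768, output_dim=1024)
x = torch.randn(2, 768)
print(layer(x).shape)  # torch.Size([2, 1024])
```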