Fill-Mask · Transformers · PyTorch · Joblib · DNA · biology · genomics · custom_code
bernardo-de-almeida committed (verified)
Commit c0f0359 · 1 Parent(s): e59521e

Fix import for latest Transformers compatibility


- Updated `modeling_esm.py` to import `find_pruneable_heads_and_indices` and `prune_linear_layer` from `transformers.pytorch_utils` instead of `transformers.modeling_utils`.
- Ensures compatibility with Transformers >= 4.30.
- Enables `AutoModelForSequenceClassification` to load the model without an `ImportError` (see the loading sketch below).
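
For reference, a minimal loading sketch along the lines the change description assumes. The repository ID below is a placeholder rather than the actual model repo, and the tokenizer/head details are assumptions, not part of this commit:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder repo ID; substitute the actual model repository.
model_id = "your-username/your-dna-model"

# trust_remote_code=True is needed because the model ships custom code
# (modeling_esm.py) instead of relying on the built-in ESM classes.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained(model_id, trust_remote_code=True)

# With Transformers >= 4.30, the pruning helpers resolve from
# transformers.pytorch_utils and no ImportError is raised at load time.
```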

Files changed (1)
  1. modeling_esm.py +2 -5
modeling_esm.py CHANGED
```diff
@@ -33,11 +33,8 @@ from transformers.modeling_outputs import (
     SequenceClassifierOutput,
     TokenClassifierOutput,
 )
-from transformers.modeling_utils import (
-    PreTrainedModel,
-    find_pruneable_heads_and_indices,
-    prune_linear_layer,
-)
+from transformers.modeling_utils import PreTrainedModel
+from transformers.pytorch_utils import find_pruneable_heads_and_indices, prune_linear_layer
 from transformers.utils import logging
 
 from .esm_config import EsmConfig
```
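
As a side note, if the custom code also had to run on older Transformers releases (where these helpers were still importable from `transformers.modeling_utils`, as the pre-fix code assumed), a guarded import would be one way to cover both. This is only a sketch of an alternative, not what the commit above applies:

```python
from transformers.modeling_utils import PreTrainedModel

try:
    # Newer Transformers releases expose the pruning helpers here.
    from transformers.pytorch_utils import (
        find_pruneable_heads_and_indices,
        prune_linear_layer,
    )
except ImportError:
    # Older releases exported them from transformers.modeling_utils,
    # which is where the pre-fix code imported them from.
    from transformers.modeling_utils import (
        find_pruneable_heads_and_indices,
        prune_linear_layer,
    )
```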