doncamilom committed on
Commit b94f375 · 1 Parent(s): b76709a

update readme

Files changed (1):
  1. README.md +15 -6
README.md CHANGED
@@ -7,9 +7,9 @@ datasets:
 - uspto
 ---
 
-# Adapter `doncamilom/OChemSegm-flan-T5-large` for None
+# Adapter `doncamilom/OChemSegm-flan-T5-large` for google/flan-t5-large
 
-An [adapter](https://adapterhub.ml) for the `None` model that was trained on the [chemistry](https://adapterhub.ml/explore/chemistry/) dataset.
+An [adapter](https://adapterhub.ml) for the `google/flan-t5-large` model that was trained on the [USPTO-segment](www.tobedone.undone) dataset.
 
 This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.
 
@@ -25,10 +25,19 @@ _Note: adapter-transformers is a fork of transformers that acts as a drop-in rep
 Now, the adapter can be loaded and activated like this:
 
 ```python
-from transformers import AutoAdapterModel
+from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
 
-model = AutoAdapterModel.from_pretrained("None")
-adapter_name = model.load_adapter("doncamilom/OChemSegm-flan-T5-large", source="hf", set_active=True)
+adapter = 'doncamilom/OChemSegm-flan-T5-large'
+
+model = AutoModelForSeq2SeqLM.from_pretrained(
+    'google/flan-t5-large',
+)
+
+# Load adapter
+adapter_name = model.load_adapter(adapter, source='hf', set_active=True)
+
+# Load tokenizer
+tokenizer = AutoTokenizer.from_pretrained(adapter)
 ```
 
 ## Architecture & Training
@@ -41,4 +50,4 @@ adapter_name = model.load_adapter("doncamilom/OChemSegm-flan-T5-large", source="hf", set_active=True)
 
 ## Citation
 
-<!-- Add some description here -->
+<!-- Add some description here -->
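
Once the model, adapter, and tokenizer from the README's snippet are loaded, inference follows the standard seq2seq pattern. A minimal sketch, assuming the adapter-transformers fork is installed; the input string and generation parameters are hypothetical examples, not from the commit:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

adapter = 'doncamilom/OChemSegm-flan-T5-large'

# Load base model and activate the adapter (requires adapter-transformers)
model = AutoModelForSeq2SeqLM.from_pretrained('google/flan-t5-large')
model.load_adapter(adapter, source='hf', set_active=True)
tokenizer = AutoTokenizer.from_pretrained(adapter)

# Hypothetical input: a chemistry paragraph to segment
text = "To a solution of the aldehyde in THF was added NaBH4 at 0 C."
inputs = tokenizer(text, return_tensors='pt')

# Generate the segmented output with the adapter active
outputs = model.generate(**inputs, max_new_tokens=256)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```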