tomaarsen (HF Staff) committed
Commit aab7c05 · verified · 1 parent: 06bb80f

Add new CrossEncoder model

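The `openvino/` files added below are an OpenVINO export of the CrossEncoder. As a minimal sketch (assuming a recent sentence-transformers release with OpenVINO backend support for CrossEncoder, and using a placeholder repository id rather than this repo's actual id), the exported model could be loaded like this:

```python
from sentence_transformers import CrossEncoder

# Placeholder repo id; substitute the actual model repository.
# backend="openvino" makes sentence-transformers pick up
# openvino/openvino_model.xml + openvino/openvino_model.bin.
model = CrossEncoder("cross-encoder/some-model", backend="openvino")

# predict() takes (query, passage) pairs and returns relevance scores.
scores = model.predict([
    ("How many people live in Berlin?", "Berlin has about 3.7 million inhabitants."),
    ("How many people live in Berlin?", "Berlin is known for its museums."),
])
print(scores)
```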
openvino/openvino_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3d8e9b82bb01146aad23dc6c274615affb4543b1be8f6a88d003f452b44d7a2f
+ size 17548448
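Since the `.bin` weights are stored via Git LFS, the commit records only the pointer above. A quick integrity check after download, using the sha256 from that pointer:

```python
import hashlib

# Hash value copied from the LFS pointer in this commit.
EXPECTED = "3d8e9b82bb01146aad23dc6c274615affb4543b1be8f6a88d003f452b44d7a2f"

with open("openvino/openvino_model.bin", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED, f"checksum mismatch: {digest}"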
openvino/openvino_model.xml ADDED
@@ -0,0 +1,2829 @@
+ <?xml version="1.0"?>
+ <net name="Model1210" version="11">
+ <layers>
+ <layer id="2" name="input_ids" type="Parameter" version="opset1">
+ <data shape="?,?" element_type="i64" />
+ <output>
+ <port id="0" precision="I64" names="input_ids">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="1" name="attention_mask" type="Parameter" version="opset1">
+ <data shape="?,?" element_type="i64" />
+ <output>
+ <port id="0" precision="I64" names="attention_mask">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="0" name="token_type_ids" type="Parameter" version="opset1">
+ <data shape="?,?" element_type="i64" />
+ <output>
+ <port id="0" precision="I64" names="token_type_ids">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
[diff truncated: the remaining added lines of this hunk define the rest of the FP32 OpenVINO IR graph — BERT word/token-type/position embedding tables (30522×128, 2×128, 512×128) with MVN-based LayerNorm (eps ≈ 1e-12), the attention-mask expansion and inversion, and the encoder layers (2-head self-attention with head size 64 via ScaledDotProductAttention, GELU (ERF) feed-forward with intermediate size 512). The visible hunk cuts off at added line 1827 of 2829.]
+ </layer>
1828
+ <layer id="124" name="__module.bert.encoder.layer.1.attention.self/aten::permute/Transpose_1" type="Transpose" version="opset1">
1829
+ <input>
1830
+ <port id="0" precision="FP32">
1831
+ <dim>-1</dim>
1832
+ <dim>-1</dim>
1833
+ <dim>2</dim>
1834
+ <dim>64</dim>
1835
+ </port>
1836
+ <port id="1" precision="I64">
1837
+ <dim>4</dim>
1838
+ </port>
1839
+ </input>
1840
+ <output>
1841
+ <port id="2" precision="FP32" names="170">
1842
+ <dim>-1</dim>
1843
+ <dim>2</dim>
1844
+ <dim>-1</dim>
1845
+ <dim>64</dim>
1846
+ </port>
1847
+ </output>
1848
+ </layer>
1849
+ <layer id="125" name="self.bert.encoder.layer.1.attention.self.value.weight" type="Const" version="opset1">
1850
+ <data element_type="f32" shape="128, 128" offset="16820896" size="65536" />
1851
+ <output>
1852
+ <port id="0" precision="FP32" names="self.bert.encoder.layer.1.attention.self.value.weight">
1853
+ <dim>128</dim>
1854
+ <dim>128</dim>
1855
+ </port>
1856
+ </output>
1857
+ </layer>
1858
+ <layer id="126" name="__module.bert.encoder.layer.1.attention.self.value/aten::linear/MatMul" type="MatMul" version="opset1">
1859
+ <data transpose_a="false" transpose_b="true" />
1860
+ <input>
1861
+ <port id="0" precision="FP32">
1862
+ <dim>-1</dim>
1863
+ <dim>-1</dim>
1864
+ <dim>128</dim>
1865
+ </port>
1866
+ <port id="1" precision="FP32">
1867
+ <dim>128</dim>
1868
+ <dim>128</dim>
1869
+ </port>
1870
+ </input>
1871
+ <output>
1872
+ <port id="2" precision="FP32">
1873
+ <dim>-1</dim>
1874
+ <dim>-1</dim>
1875
+ <dim>128</dim>
1876
+ </port>
1877
+ </output>
1878
+ </layer>
1879
+ <layer id="127" name="Constant_996333" type="Const" version="opset1">
1880
+ <data element_type="f32" shape="1, 1, 128" offset="16886432" size="512" />
1881
+ <output>
1882
+ <port id="0" precision="FP32">
1883
+ <dim>1</dim>
1884
+ <dim>1</dim>
1885
+ <dim>128</dim>
1886
+ </port>
1887
+ </output>
1888
+ </layer>
1889
+ <layer id="128" name="__module.bert.encoder.layer.1.attention.self.value/aten::linear/Add" type="Add" version="opset1">
1890
+ <data auto_broadcast="numpy" />
1891
+ <input>
1892
+ <port id="0" precision="FP32">
1893
+ <dim>-1</dim>
1894
+ <dim>-1</dim>
1895
+ <dim>128</dim>
1896
+ </port>
1897
+ <port id="1" precision="FP32">
1898
+ <dim>1</dim>
1899
+ <dim>1</dim>
1900
+ <dim>128</dim>
1901
+ </port>
1902
+ </input>
1903
+ <output>
1904
+ <port id="2" precision="FP32" names="173,x.21">
1905
+ <dim>-1</dim>
1906
+ <dim>-1</dim>
1907
+ <dim>128</dim>
1908
+ </port>
1909
+ </output>
1910
+ </layer>
1911
+ <layer id="129" name="__module.bert.encoder.layer.1.attention.self/prim::ListConstruct/Concat_2" type="Const" version="opset1">
1912
+ <data element_type="i64" shape="4" offset="15961624" size="32" />
1913
+ <output>
1914
+ <port id="0" precision="I64">
1915
+ <dim>4</dim>
1916
+ </port>
1917
+ </output>
1918
+ </layer>
1919
+ <layer id="130" name="__module.bert.encoder.layer.1.attention.self/aten::view/Reshape_2" type="Reshape" version="opset1">
1920
+ <data special_zero="true" />
1921
+ <input>
1922
+ <port id="0" precision="FP32">
1923
+ <dim>-1</dim>
1924
+ <dim>-1</dim>
1925
+ <dim>128</dim>
1926
+ </port>
1927
+ <port id="1" precision="I64">
1928
+ <dim>4</dim>
1929
+ </port>
1930
+ </input>
1931
+ <output>
1932
+ <port id="2" precision="FP32" names="177,x">
1933
+ <dim>-1</dim>
1934
+ <dim>-1</dim>
1935
+ <dim>2</dim>
1936
+ <dim>64</dim>
1937
+ </port>
1938
+ </output>
1939
+ </layer>
1940
+ <layer id="131" name="Constant_992822" type="Const" version="opset1">
1941
+ <data element_type="i64" shape="4" offset="15961656" size="32" />
1942
+ <output>
1943
+ <port id="0" precision="I64" names="178">
1944
+ <dim>4</dim>
1945
+ </port>
1946
+ </output>
1947
+ </layer>
1948
+ <layer id="132" name="__module.bert.encoder.layer.1.attention.self/aten::permute/Transpose_2" type="Transpose" version="opset1">
1949
+ <input>
1950
+ <port id="0" precision="FP32">
1951
+ <dim>-1</dim>
1952
+ <dim>-1</dim>
1953
+ <dim>2</dim>
1954
+ <dim>64</dim>
1955
+ </port>
1956
+ <port id="1" precision="I64">
1957
+ <dim>4</dim>
1958
+ </port>
1959
+ </input>
1960
+ <output>
1961
+ <port id="2" precision="FP32" names="179">
1962
+ <dim>-1</dim>
1963
+ <dim>2</dim>
1964
+ <dim>-1</dim>
1965
+ <dim>64</dim>
1966
+ </port>
1967
+ </output>
1968
+ </layer>
1969
+ <layer id="133" name="__module.bert.encoder.layer.1.attention.self/aten::scaled_dot_product_attention/ScaledDotProductAttention" type="ScaledDotProductAttention" version="opset13">
1970
+ <data causal="false" />
1971
+ <input>
1972
+ <port id="0" precision="FP32">
1973
+ <dim>-1</dim>
1974
+ <dim>2</dim>
1975
+ <dim>-1</dim>
1976
+ <dim>64</dim>
1977
+ </port>
1978
+ <port id="1" precision="FP32">
1979
+ <dim>-1</dim>
1980
+ <dim>2</dim>
1981
+ <dim>-1</dim>
1982
+ <dim>64</dim>
1983
+ </port>
1984
+ <port id="2" precision="FP32">
1985
+ <dim>-1</dim>
1986
+ <dim>2</dim>
1987
+ <dim>-1</dim>
1988
+ <dim>64</dim>
1989
+ </port>
1990
+ <port id="3" precision="FP32">
1991
+ <dim>-1</dim>
1992
+ <dim>1</dim>
1993
+ <dim>-1</dim>
1994
+ <dim>-1</dim>
1995
+ </port>
1996
+ </input>
1997
+ <output>
1998
+ <port id="4" precision="FP32" names="180,attn_output.5">
1999
+ <dim>-1</dim>
2000
+ <dim>2</dim>
2001
+ <dim>-1</dim>
2002
+ <dim>64</dim>
2003
+ </port>
2004
+ </output>
2005
+ </layer>
2006
+ <layer id="134" name="__module.bert.encoder.layer.1.attention.self/aten::transpose/ScatterElementsUpdate" type="Const" version="opset1">
2007
+ <data element_type="i32" shape="4" offset="16093816" size="16" />
2008
+ <output>
2009
+ <port id="0" precision="I32">
2010
+ <dim>4</dim>
2011
+ </port>
2012
+ </output>
2013
+ </layer>
2014
+ <layer id="135" name="__module.bert.encoder.layer.1.attention.self/aten::transpose/Transpose" type="Transpose" version="opset1">
2015
+ <input>
2016
+ <port id="0" precision="FP32">
2017
+ <dim>-1</dim>
2018
+ <dim>2</dim>
2019
+ <dim>-1</dim>
2020
+ <dim>64</dim>
2021
+ </port>
2022
+ <port id="1" precision="I32">
2023
+ <dim>4</dim>
2024
+ </port>
2025
+ </input>
2026
+ <output>
2027
+ <port id="2" precision="FP32" names="181,attn_output">
2028
+ <dim>-1</dim>
2029
+ <dim>-1</dim>
2030
+ <dim>2</dim>
2031
+ <dim>64</dim>
2032
+ </port>
2033
+ </output>
2034
+ </layer>
2035
+ <layer id="136" name="Constant_996392" type="Const" version="opset1">
2036
+ <data element_type="i64" shape="3" offset="16093832" size="24" />
2037
+ <output>
2038
+ <port id="0" precision="I64">
2039
+ <dim>3</dim>
2040
+ </port>
2041
+ </output>
2042
+ </layer>
2043
+ <layer id="137" name="__module.bert.encoder.layer.1.attention.self/aten::reshape/Reshape" type="Reshape" version="opset1">
2044
+ <data special_zero="true" />
2045
+ <input>
2046
+ <port id="0" precision="FP32">
2047
+ <dim>-1</dim>
2048
+ <dim>-1</dim>
2049
+ <dim>2</dim>
2050
+ <dim>64</dim>
2051
+ </port>
2052
+ <port id="1" precision="I64">
2053
+ <dim>3</dim>
2054
+ </port>
2055
+ </input>
2056
+ <output>
2057
+ <port id="2" precision="FP32" names="183">
2058
+ <dim>-1</dim>
2059
+ <dim>-1</dim>
2060
+ <dim>128</dim>
2061
+ </port>
2062
+ </output>
2063
+ </layer>
2064
+ <layer id="138" name="self.bert.encoder.layer.1.attention.output.dense.weight" type="Const" version="opset1">
2065
+ <data element_type="f32" shape="128, 128" offset="16886944" size="65536" />
2066
+ <output>
2067
+ <port id="0" precision="FP32" names="self.bert.encoder.layer.1.attention.output.dense.weight">
2068
+ <dim>128</dim>
2069
+ <dim>128</dim>
2070
+ </port>
2071
+ </output>
2072
+ </layer>
2073
+ <layer id="139" name="__module.bert.encoder.layer.1.attention.output.dense/aten::linear/MatMul" type="MatMul" version="opset1">
2074
+ <data transpose_a="false" transpose_b="true" />
2075
+ <input>
2076
+ <port id="0" precision="FP32">
2077
+ <dim>-1</dim>
2078
+ <dim>-1</dim>
2079
+ <dim>128</dim>
2080
+ </port>
2081
+ <port id="1" precision="FP32">
2082
+ <dim>128</dim>
2083
+ <dim>128</dim>
2084
+ </port>
2085
+ </input>
2086
+ <output>
2087
+ <port id="2" precision="FP32">
2088
+ <dim>-1</dim>
2089
+ <dim>-1</dim>
2090
+ <dim>128</dim>
2091
+ </port>
2092
+ </output>
2093
+ </layer>
2094
+ <layer id="140" name="Constant_996334" type="Const" version="opset1">
2095
+ <data element_type="f32" shape="1, 1, 128" offset="16952480" size="512" />
2096
+ <output>
2097
+ <port id="0" precision="FP32">
2098
+ <dim>1</dim>
2099
+ <dim>1</dim>
2100
+ <dim>128</dim>
2101
+ </port>
2102
+ </output>
2103
+ </layer>
2104
+ <layer id="141" name="__module.bert.encoder.layer.1.attention.output.dense/aten::linear/Add" type="Add" version="opset1">
2105
+ <data auto_broadcast="numpy" />
2106
+ <input>
2107
+ <port id="0" precision="FP32">
2108
+ <dim>-1</dim>
2109
+ <dim>-1</dim>
2110
+ <dim>128</dim>
2111
+ </port>
2112
+ <port id="1" precision="FP32">
2113
+ <dim>1</dim>
2114
+ <dim>1</dim>
2115
+ <dim>128</dim>
2116
+ </port>
2117
+ </input>
2118
+ <output>
2119
+ <port id="2" precision="FP32" names="188,input.7">
2120
+ <dim>-1</dim>
2121
+ <dim>-1</dim>
2122
+ <dim>128</dim>
2123
+ </port>
2124
+ </output>
2125
+ </layer>
2126
+ <layer id="142" name="__module.bert.encoder.layer.1.attention.output/aten::add/Add" type="Add" version="opset1">
2127
+ <data auto_broadcast="numpy" />
2128
+ <input>
2129
+ <port id="0" precision="FP32">
2130
+ <dim>-1</dim>
2131
+ <dim>-1</dim>
2132
+ <dim>128</dim>
2133
+ </port>
2134
+ <port id="1" precision="FP32">
2135
+ <dim>-1</dim>
2136
+ <dim>-1</dim>
2137
+ <dim>128</dim>
2138
+ </port>
2139
+ </input>
2140
+ <output>
2141
+ <port id="2" precision="FP32" names="190">
2142
+ <dim>-1</dim>
2143
+ <dim>-1</dim>
2144
+ <dim>128</dim>
2145
+ </port>
2146
+ </output>
2147
+ </layer>
2148
+ <layer id="143" name="__module.bert.encoder.layer.1.attention.output.LayerNorm/aten::layer_norm/Multiply" type="Const" version="opset1">
2149
+ <data element_type="i32" shape="1" offset="15894548" size="4" />
2150
+ <output>
2151
+ <port id="0" precision="I32">
2152
+ <dim>1</dim>
2153
+ </port>
2154
+ </output>
2155
+ </layer>
2156
+ <layer id="144" name="__module.bert.encoder.layer.1.attention.output.LayerNorm/aten::layer_norm/MVN" type="MVN" version="opset6">
2157
+ <data eps="9.999999960041972e-13" normalize_variance="true" eps_mode="INSIDE_SQRT" />
2158
+ <input>
2159
+ <port id="0" precision="FP32">
2160
+ <dim>-1</dim>
2161
+ <dim>-1</dim>
2162
+ <dim>128</dim>
2163
+ </port>
2164
+ <port id="1" precision="I32">
2165
+ <dim>1</dim>
2166
+ </port>
2167
+ </input>
2168
+ <output>
2169
+ <port id="2" precision="FP32">
2170
+ <dim>-1</dim>
2171
+ <dim>-1</dim>
2172
+ <dim>128</dim>
2173
+ </port>
2174
+ </output>
2175
+ </layer>
2176
+ <layer id="145" name="Constant_996335" type="Const" version="opset1">
2177
+ <data element_type="f32" shape="1, 1, 128" offset="16952992" size="512" />
2178
+ <output>
2179
+ <port id="0" precision="FP32">
2180
+ <dim>1</dim>
2181
+ <dim>1</dim>
2182
+ <dim>128</dim>
2183
+ </port>
2184
+ </output>
2185
+ </layer>
2186
+ <layer id="146" name="__module.bert.encoder.layer.1.attention.output.LayerNorm/aten::layer_norm/Multiply_1" type="Multiply" version="opset1">
2187
+ <data auto_broadcast="numpy" />
2188
+ <input>
2189
+ <port id="0" precision="FP32">
2190
+ <dim>-1</dim>
2191
+ <dim>-1</dim>
2192
+ <dim>128</dim>
2193
+ </port>
2194
+ <port id="1" precision="FP32">
2195
+ <dim>1</dim>
2196
+ <dim>1</dim>
2197
+ <dim>128</dim>
2198
+ </port>
2199
+ </input>
2200
+ <output>
2201
+ <port id="2" precision="FP32">
2202
+ <dim>-1</dim>
2203
+ <dim>-1</dim>
2204
+ <dim>128</dim>
2205
+ </port>
2206
+ </output>
2207
+ </layer>
2208
+ <layer id="147" name="Constant_996336" type="Const" version="opset1">
2209
+ <data element_type="f32" shape="1, 1, 128" offset="16953504" size="512" />
2210
+ <output>
2211
+ <port id="0" precision="FP32">
2212
+ <dim>1</dim>
2213
+ <dim>1</dim>
2214
+ <dim>128</dim>
2215
+ </port>
2216
+ </output>
2217
+ </layer>
2218
+ <layer id="148" name="__module.bert.encoder.layer.1.attention.output.LayerNorm/aten::layer_norm/Add" type="Add" version="opset1">
2219
+ <data auto_broadcast="numpy" />
2220
+ <input>
2221
+ <port id="0" precision="FP32">
2222
+ <dim>-1</dim>
2223
+ <dim>-1</dim>
2224
+ <dim>128</dim>
2225
+ </port>
2226
+ <port id="1" precision="FP32">
2227
+ <dim>1</dim>
2228
+ <dim>1</dim>
2229
+ <dim>128</dim>
2230
+ </port>
2231
+ </input>
2232
+ <output>
2233
+ <port id="2" precision="FP32" names="194,input_tensor">
2234
+ <dim>-1</dim>
2235
+ <dim>-1</dim>
2236
+ <dim>128</dim>
2237
+ </port>
2238
+ </output>
2239
+ </layer>
2240
+ <layer id="149" name="self.bert.encoder.layer.1.intermediate.dense.weight" type="Const" version="opset1">
2241
+ <data element_type="f32" shape="512, 128" offset="16954016" size="262144" />
2242
+ <output>
2243
+ <port id="0" precision="FP32" names="self.bert.encoder.layer.1.intermediate.dense.weight">
2244
+ <dim>512</dim>
2245
+ <dim>128</dim>
2246
+ </port>
2247
+ </output>
2248
+ </layer>
2249
+ <layer id="150" name="__module.bert.encoder.layer.1.intermediate.dense/aten::linear/MatMul" type="MatMul" version="opset1">
2250
+ <data transpose_a="false" transpose_b="true" />
2251
+ <input>
2252
+ <port id="0" precision="FP32">
2253
+ <dim>-1</dim>
2254
+ <dim>-1</dim>
2255
+ <dim>128</dim>
2256
+ </port>
2257
+ <port id="1" precision="FP32">
2258
+ <dim>512</dim>
2259
+ <dim>128</dim>
2260
+ </port>
2261
+ </input>
2262
+ <output>
2263
+ <port id="2" precision="FP32">
2264
+ <dim>-1</dim>
2265
+ <dim>-1</dim>
2266
+ <dim>512</dim>
2267
+ </port>
2268
+ </output>
2269
+ </layer>
2270
+ <layer id="151" name="Constant_996337" type="Const" version="opset1">
2271
+ <data element_type="f32" shape="1, 1, 512" offset="17216160" size="2048" />
2272
+ <output>
2273
+ <port id="0" precision="FP32">
2274
+ <dim>1</dim>
2275
+ <dim>1</dim>
2276
+ <dim>512</dim>
2277
+ </port>
2278
+ </output>
2279
+ </layer>
2280
+ <layer id="152" name="__module.bert.encoder.layer.1.intermediate.dense/aten::linear/Add" type="Add" version="opset1">
2281
+ <data auto_broadcast="numpy" />
2282
+ <input>
2283
+ <port id="0" precision="FP32">
2284
+ <dim>-1</dim>
2285
+ <dim>-1</dim>
2286
+ <dim>512</dim>
2287
+ </port>
2288
+ <port id="1" precision="FP32">
2289
+ <dim>1</dim>
2290
+ <dim>1</dim>
2291
+ <dim>512</dim>
2292
+ </port>
2293
+ </input>
2294
+ <output>
2295
+ <port id="2" precision="FP32" names="198">
2296
+ <dim>-1</dim>
2297
+ <dim>-1</dim>
2298
+ <dim>512</dim>
2299
+ </port>
2300
+ </output>
2301
+ </layer>
2302
+ <layer id="153" name="__module.bert.encoder.layer.1.intermediate.intermediate_act_fn/aten::gelu/Gelu" type="Gelu" version="opset7">
2303
+ <data approximation_mode="ERF" />
2304
+ <input>
2305
+ <port id="0" precision="FP32">
2306
+ <dim>-1</dim>
2307
+ <dim>-1</dim>
2308
+ <dim>512</dim>
2309
+ </port>
2310
+ </input>
2311
+ <output>
2312
+ <port id="1" precision="FP32" names="199">
2313
+ <dim>-1</dim>
2314
+ <dim>-1</dim>
2315
+ <dim>512</dim>
2316
+ </port>
2317
+ </output>
2318
+ </layer>
2319
+ <layer id="154" name="self.bert.encoder.layer.1.output.dense.weight" type="Const" version="opset1">
2320
+ <data element_type="f32" shape="128, 512" offset="17218208" size="262144" />
2321
+ <output>
2322
+ <port id="0" precision="FP32" names="self.bert.encoder.layer.1.output.dense.weight">
2323
+ <dim>128</dim>
2324
+ <dim>512</dim>
2325
+ </port>
2326
+ </output>
2327
+ </layer>
2328
+ <layer id="155" name="__module.bert.encoder.layer.1.output.dense/aten::linear/MatMul" type="MatMul" version="opset1">
2329
+ <data transpose_a="false" transpose_b="true" />
2330
+ <input>
2331
+ <port id="0" precision="FP32">
2332
+ <dim>-1</dim>
2333
+ <dim>-1</dim>
2334
+ <dim>512</dim>
2335
+ </port>
2336
+ <port id="1" precision="FP32">
2337
+ <dim>128</dim>
2338
+ <dim>512</dim>
2339
+ </port>
2340
+ </input>
2341
+ <output>
2342
+ <port id="2" precision="FP32">
2343
+ <dim>-1</dim>
2344
+ <dim>-1</dim>
2345
+ <dim>128</dim>
2346
+ </port>
2347
+ </output>
2348
+ </layer>
2349
+ <layer id="156" name="Constant_996338" type="Const" version="opset1">
2350
+ <data element_type="f32" shape="1, 1, 128" offset="17480352" size="512" />
2351
+ <output>
2352
+ <port id="0" precision="FP32">
2353
+ <dim>1</dim>
2354
+ <dim>1</dim>
2355
+ <dim>128</dim>
2356
+ </port>
2357
+ </output>
2358
+ </layer>
2359
+ <layer id="157" name="__module.bert.encoder.layer.1.output.dense/aten::linear/Add" type="Add" version="opset1">
2360
+ <data auto_broadcast="numpy" />
2361
+ <input>
2362
+ <port id="0" precision="FP32">
2363
+ <dim>-1</dim>
2364
+ <dim>-1</dim>
2365
+ <dim>128</dim>
2366
+ </port>
2367
+ <port id="1" precision="FP32">
2368
+ <dim>1</dim>
2369
+ <dim>1</dim>
2370
+ <dim>128</dim>
2371
+ </port>
2372
+ </input>
2373
+ <output>
2374
+ <port id="2" precision="FP32" names="204,input.9">
2375
+ <dim>-1</dim>
2376
+ <dim>-1</dim>
2377
+ <dim>128</dim>
2378
+ </port>
2379
+ </output>
2380
+ </layer>
2381
+ <layer id="158" name="__module.bert.encoder.layer.1.output/aten::add/Add" type="Add" version="opset1">
2382
+ <data auto_broadcast="numpy" />
2383
+ <input>
2384
+ <port id="0" precision="FP32">
2385
+ <dim>-1</dim>
2386
+ <dim>-1</dim>
2387
+ <dim>128</dim>
2388
+ </port>
2389
+ <port id="1" precision="FP32">
2390
+ <dim>-1</dim>
2391
+ <dim>-1</dim>
2392
+ <dim>128</dim>
2393
+ </port>
2394
+ </input>
2395
+ <output>
2396
+ <port id="2" precision="FP32" names="206">
2397
+ <dim>-1</dim>
2398
+ <dim>-1</dim>
2399
+ <dim>128</dim>
2400
+ </port>
2401
+ </output>
2402
+ </layer>
2403
+ <layer id="159" name="__module.bert.encoder.layer.1.output.LayerNorm/aten::layer_norm/Multiply" type="Const" version="opset1">
2404
+ <data element_type="i32" shape="1" offset="15894548" size="4" />
2405
+ <output>
2406
+ <port id="0" precision="I32">
2407
+ <dim>1</dim>
2408
+ </port>
2409
+ </output>
2410
+ </layer>
2411
+ <layer id="160" name="__module.bert.encoder.layer.1.output.LayerNorm/aten::layer_norm/MVN" type="MVN" version="opset6">
2412
+ <data eps="9.999999960041972e-13" normalize_variance="true" eps_mode="INSIDE_SQRT" />
2413
+ <input>
2414
+ <port id="0" precision="FP32">
2415
+ <dim>-1</dim>
2416
+ <dim>-1</dim>
2417
+ <dim>128</dim>
2418
+ </port>
2419
+ <port id="1" precision="I32">
2420
+ <dim>1</dim>
2421
+ </port>
2422
+ </input>
2423
+ <output>
2424
+ <port id="2" precision="FP32">
2425
+ <dim>-1</dim>
2426
+ <dim>-1</dim>
2427
+ <dim>128</dim>
2428
+ </port>
2429
+ </output>
2430
+ </layer>
2431
+ <layer id="161" name="Constant_996339" type="Const" version="opset1">
2432
+ <data element_type="f32" shape="1, 1, 128" offset="17480864" size="512" />
2433
+ <output>
2434
+ <port id="0" precision="FP32">
2435
+ <dim>1</dim>
2436
+ <dim>1</dim>
2437
+ <dim>128</dim>
2438
+ </port>
2439
+ </output>
2440
+ </layer>
2441
+ <layer id="162" name="__module.bert.encoder.layer.1.output.LayerNorm/aten::layer_norm/Multiply_1" type="Multiply" version="opset1">
2442
+ <data auto_broadcast="numpy" />
2443
+ <input>
2444
+ <port id="0" precision="FP32">
2445
+ <dim>-1</dim>
2446
+ <dim>-1</dim>
2447
+ <dim>128</dim>
2448
+ </port>
2449
+ <port id="1" precision="FP32">
2450
+ <dim>1</dim>
2451
+ <dim>1</dim>
2452
+ <dim>128</dim>
2453
+ </port>
2454
+ </input>
2455
+ <output>
2456
+ <port id="2" precision="FP32">
2457
+ <dim>-1</dim>
2458
+ <dim>-1</dim>
2459
+ <dim>128</dim>
2460
+ </port>
2461
+ </output>
2462
+ </layer>
2463
+ <layer id="163" name="Constant_996340" type="Const" version="opset1">
2464
+ <data element_type="f32" shape="1, 1, 128" offset="17481376" size="512" />
2465
+ <output>
2466
+ <port id="0" precision="FP32">
2467
+ <dim>1</dim>
2468
+ <dim>1</dim>
2469
+ <dim>128</dim>
2470
+ </port>
2471
+ </output>
2472
+ </layer>
2473
+ <layer id="164" name="__module.bert.encoder.layer.1.output.LayerNorm/aten::layer_norm/Add" type="Add" version="opset1">
2474
+ <data auto_broadcast="numpy" />
2475
+ <input>
2476
+ <port id="0" precision="FP32">
2477
+ <dim>-1</dim>
2478
+ <dim>-1</dim>
2479
+ <dim>128</dim>
2480
+ </port>
2481
+ <port id="1" precision="FP32">
2482
+ <dim>1</dim>
2483
+ <dim>1</dim>
2484
+ <dim>128</dim>
2485
+ </port>
2486
+ </input>
2487
+ <output>
2488
+ <port id="2" precision="FP32" names="210,212,hidden_states">
2489
+ <dim>-1</dim>
2490
+ <dim>-1</dim>
2491
+ <dim>128</dim>
2492
+ </port>
2493
+ </output>
2494
+ </layer>
2495
+ <layer id="165" name="22" type="Const" version="opset1">
2496
+ <data element_type="i64" shape="" offset="15894532" size="8" />
2497
+ <output>
2498
+ <port id="0" precision="I64" names="22" />
2499
+ </output>
2500
+ </layer>
2501
+ <layer id="166" name="__module.bert.pooler/aten::select/Gather" type="Gather" version="opset8">
2502
+ <data batch_dims="0" />
2503
+ <input>
2504
+ <port id="0" precision="FP32">
2505
+ <dim>-1</dim>
2506
+ <dim>-1</dim>
2507
+ <dim>128</dim>
2508
+ </port>
2509
+ <port id="1" precision="I64" />
2510
+ <port id="2" precision="I64" />
2511
+ </input>
2512
+ <output>
2513
+ <port id="3" precision="FP32" names="213">
2514
+ <dim>-1</dim>
2515
+ <dim>128</dim>
2516
+ </port>
2517
+ </output>
2518
+ </layer>
2519
+ <layer id="167" name="self.bert.pooler.dense.weight" type="Const" version="opset1">
2520
+ <data element_type="f32" shape="128, 128" offset="17481888" size="65536" />
2521
+ <output>
2522
+ <port id="0" precision="FP32" names="self.bert.pooler.dense.weight">
2523
+ <dim>128</dim>
2524
+ <dim>128</dim>
2525
+ </port>
2526
+ </output>
2527
+ </layer>
2528
+ <layer id="168" name="__module.bert.pooler.dense/aten::linear/MatMul" type="MatMul" version="opset1">
2529
+ <data transpose_a="false" transpose_b="true" />
2530
+ <input>
2531
+ <port id="0" precision="FP32">
2532
+ <dim>-1</dim>
2533
+ <dim>128</dim>
2534
+ </port>
2535
+ <port id="1" precision="FP32">
2536
+ <dim>128</dim>
2537
+ <dim>128</dim>
2538
+ </port>
2539
+ </input>
2540
+ <output>
2541
+ <port id="2" precision="FP32">
2542
+ <dim>-1</dim>
2543
+ <dim>128</dim>
2544
+ </port>
2545
+ </output>
2546
+ </layer>
2547
+ <layer id="169" name="Constant_996341" type="Const" version="opset1">
2548
+ <data element_type="f32" shape="1, 128" offset="17547424" size="512" />
2549
+ <output>
2550
+ <port id="0" precision="FP32">
2551
+ <dim>1</dim>
2552
+ <dim>128</dim>
2553
+ </port>
2554
+ </output>
2555
+ </layer>
2556
+ <layer id="170" name="__module.bert.pooler.dense/aten::linear/Add" type="Add" version="opset1">
2557
+ <data auto_broadcast="numpy" />
2558
+ <input>
2559
+ <port id="0" precision="FP32">
2560
+ <dim>-1</dim>
2561
+ <dim>128</dim>
2562
+ </port>
2563
+ <port id="1" precision="FP32">
2564
+ <dim>1</dim>
2565
+ <dim>128</dim>
2566
+ </port>
2567
+ </input>
2568
+ <output>
2569
+ <port id="2" precision="FP32" names="216">
2570
+ <dim>-1</dim>
2571
+ <dim>128</dim>
2572
+ </port>
2573
+ </output>
2574
+ </layer>
2575
+ <layer id="171" name="__module.bert.pooler.activation/aten::tanh/Tanh" type="Tanh" version="opset1">
2576
+ <input>
2577
+ <port id="0" precision="FP32">
2578
+ <dim>-1</dim>
2579
+ <dim>128</dim>
2580
+ </port>
2581
+ </input>
2582
+ <output>
2583
+ <port id="1" precision="FP32" names="217,input">
2584
+ <dim>-1</dim>
2585
+ <dim>128</dim>
2586
+ </port>
2587
+ </output>
2588
+ </layer>
2589
+ <layer id="172" name="self.classifier.weight" type="Const" version="opset1">
2590
+ <data element_type="f32" shape="1, 128" offset="17547936" size="512" />
2591
+ <output>
2592
+ <port id="0" precision="FP32" names="self.classifier.weight">
2593
+ <dim>1</dim>
2594
+ <dim>128</dim>
2595
+ </port>
2596
+ </output>
2597
+ </layer>
2598
+ <layer id="173" name="__module.classifier/aten::linear/Add" type="MatMul" version="opset1">
2599
+ <data transpose_a="false" transpose_b="true" />
2600
+ <input>
2601
+ <port id="0" precision="FP32">
2602
+ <dim>-1</dim>
2603
+ <dim>128</dim>
2604
+ </port>
2605
+ <port id="1" precision="FP32">
2606
+ <dim>1</dim>
2607
+ <dim>128</dim>
2608
+ </port>
2609
+ </input>
2610
+ <output>
2611
+ <port id="2" precision="FP32" names="logits">
2612
+ <dim>-1</dim>
2613
+ <dim>1</dim>
2614
+ </port>
2615
+ </output>
2616
+ </layer>
2617
+ <layer id="174" name="Result_993640" type="Result" version="opset1">
2618
+ <input>
2619
+ <port id="0" precision="FP32">
2620
+ <dim>-1</dim>
2621
+ <dim>1</dim>
2622
+ </port>
2623
+ </input>
2624
+ </layer>
2625
+ </layers>
2626
+ <edges>
2627
+ <edge from-layer="0" from-port="0" to-layer="8" to-port="0" />
2628
+ <edge from-layer="1" from-port="0" to-layer="58" to-port="0" />
2629
+ <edge from-layer="2" from-port="0" to-layer="4" to-port="0" />
2630
+ <edge from-layer="2" from-port="0" to-layer="15" to-port="0" />
2631
+ <edge from-layer="3" from-port="0" to-layer="6" to-port="0" />
2632
+ <edge from-layer="4" from-port="1" to-layer="6" to-port="1" />
2633
+ <edge from-layer="5" from-port="0" to-layer="6" to-port="2" />
2634
+ <edge from-layer="6" from-port="3" to-layer="11" to-port="0" />
2635
+ <edge from-layer="7" from-port="0" to-layer="10" to-port="0" />
2636
+ <edge from-layer="8" from-port="1" to-layer="10" to-port="1" />
2637
+ <edge from-layer="9" from-port="0" to-layer="10" to-port="2" />
2638
+ <edge from-layer="10" from-port="3" to-layer="11" to-port="1" />
2639
+ <edge from-layer="11" from-port="2" to-layer="25" to-port="0" />
2640
+ <edge from-layer="12" from-port="0" to-layer="24" to-port="0" />
2641
+ <edge from-layer="13" from-port="0" to-layer="21" to-port="0" />
2642
+ <edge from-layer="14" from-port="0" to-layer="21" to-port="1" />
2643
+ <edge from-layer="15" from-port="1" to-layer="18" to-port="0" />
2644
+ <edge from-layer="15" from-port="1" to-layer="63" to-port="0" />
2645
+ <edge from-layer="15" from-port="1" to-layer="67" to-port="0" />
2646
+ <edge from-layer="16" from-port="0" to-layer="18" to-port="1" />
2647
+ <edge from-layer="17" from-port="0" to-layer="18" to-port="2" />
2648
+ <edge from-layer="18" from-port="3" to-layer="21" to-port="2" />
2649
+ <edge from-layer="19" from-port="0" to-layer="21" to-port="3" />
2650
+ <edge from-layer="20" from-port="0" to-layer="21" to-port="4" />
2651
+ <edge from-layer="21" from-port="5" to-layer="22" to-port="0" />
2652
+ <edge from-layer="22" from-port="1" to-layer="24" to-port="1" />
2653
+ <edge from-layer="23" from-port="0" to-layer="24" to-port="2" />
2654
+ <edge from-layer="24" from-port="3" to-layer="25" to-port="1" />
2655
+ <edge from-layer="25" from-port="2" to-layer="27" to-port="0" />
2656
+ <edge from-layer="26" from-port="0" to-layer="27" to-port="1" />
2657
+ <edge from-layer="27" from-port="2" to-layer="29" to-port="0" />
2658
+ <edge from-layer="28" from-port="0" to-layer="29" to-port="1" />
2659
+ <edge from-layer="29" from-port="2" to-layer="31" to-port="0" />
2660
+ <edge from-layer="30" from-port="0" to-layer="31" to-port="1" />
2661
+ <edge from-layer="31" from-port="2" to-layer="33" to-port="0" />
2662
+ <edge from-layer="31" from-port="2" to-layer="41" to-port="0" />
2663
+ <edge from-layer="31" from-port="2" to-layer="49" to-port="0" />
2664
+ <edge from-layer="31" from-port="2" to-layer="86" to-port="1" />
2665
+ <edge from-layer="32" from-port="0" to-layer="33" to-port="1" />
2666
+ <edge from-layer="33" from-port="2" to-layer="35" to-port="0" />
2667
+ <edge from-layer="34" from-port="0" to-layer="35" to-port="1" />
2668
+ <edge from-layer="35" from-port="2" to-layer="37" to-port="0" />
2669
+ <edge from-layer="36" from-port="0" to-layer="37" to-port="1" />
2670
+ <edge from-layer="37" from-port="2" to-layer="39" to-port="0" />
2671
+ <edge from-layer="38" from-port="0" to-layer="39" to-port="1" />
2672
+ <edge from-layer="39" from-port="2" to-layer="77" to-port="0" />
2673
+ <edge from-layer="40" from-port="0" to-layer="41" to-port="1" />
2674
+ <edge from-layer="41" from-port="2" to-layer="43" to-port="0" />
2675
+ <edge from-layer="42" from-port="0" to-layer="43" to-port="1" />
2676
+ <edge from-layer="43" from-port="2" to-layer="45" to-port="0" />
2677
+ <edge from-layer="44" from-port="0" to-layer="45" to-port="1" />
2678
+ <edge from-layer="45" from-port="2" to-layer="47" to-port="0" />
2679
+ <edge from-layer="46" from-port="0" to-layer="47" to-port="1" />
2680
+ <edge from-layer="47" from-port="2" to-layer="77" to-port="1" />
2681
+ <edge from-layer="48" from-port="0" to-layer="49" to-port="1" />
2682
+ <edge from-layer="49" from-port="2" to-layer="51" to-port="0" />
2683
+ <edge from-layer="50" from-port="0" to-layer="51" to-port="1" />
2684
+ <edge from-layer="51" from-port="2" to-layer="53" to-port="0" />
2685
+ <edge from-layer="52" from-port="0" to-layer="53" to-port="1" />
2686
+ <edge from-layer="53" from-port="2" to-layer="55" to-port="0" />
2687
+ <edge from-layer="54" from-port="0" to-layer="55" to-port="1" />
2688
+ <edge from-layer="55" from-port="2" to-layer="77" to-port="2" />
2689
+ <edge from-layer="56" from-port="0" to-layer="73" to-port="0" />
2690
+ <edge from-layer="57" from-port="0" to-layer="58" to-port="1" />
2691
+ <edge from-layer="57" from-port="0" to-layer="166" to-port="2" />
2692
+ <edge from-layer="58" from-port="2" to-layer="60" to-port="0" />
2693
+ <edge from-layer="59" from-port="0" to-layer="60" to-port="1" />
2694
+ <edge from-layer="60" from-port="2" to-layer="69" to-port="0" />
2695
+ <edge from-layer="61" from-port="0" to-layer="63" to-port="1" />
2696
+ <edge from-layer="62" from-port="0" to-layer="63" to-port="2" />
2697
+ <edge from-layer="63" from-port="3" to-layer="68" to-port="0" />
2698
+ <edge from-layer="64" from-port="0" to-layer="68" to-port="1" />
2699
+ <edge from-layer="65" from-port="0" to-layer="67" to-port="1" />
2700
+ <edge from-layer="66" from-port="0" to-layer="67" to-port="2" />
2701
+ <edge from-layer="67" from-port="3" to-layer="68" to-port="2" />
2702
+ <edge from-layer="68" from-port="3" to-layer="69" to-port="1" />
2703
+ <edge from-layer="69" from-port="2" to-layer="70" to-port="0" />
2704
+ <edge from-layer="70" from-port="1" to-layer="72" to-port="0" />
2705
+ <edge from-layer="71" from-port="0" to-layer="72" to-port="1" />
2706
+ <edge from-layer="72" from-port="2" to-layer="73" to-port="1" />
2707
+ <edge from-layer="73" from-port="2" to-layer="76" to-port="2" />
2708
+ <edge from-layer="73" from-port="2" to-layer="74" to-port="0" />
2709
+ <edge from-layer="74" from-port="1" to-layer="76" to-port="0" />
2710
+ <edge from-layer="75" from-port="0" to-layer="76" to-port="1" />
2711
+ <edge from-layer="76" from-port="3" to-layer="77" to-port="3" />
2712
+ <edge from-layer="76" from-port="3" to-layer="133" to-port="3" />
2713
+ <edge from-layer="77" from-port="4" to-layer="79" to-port="0" />
2714
+ <edge from-layer="78" from-port="0" to-layer="79" to-port="1" />
2715
+ <edge from-layer="79" from-port="2" to-layer="81" to-port="0" />
2716
+ <edge from-layer="80" from-port="0" to-layer="81" to-port="1" />
2717
+ <edge from-layer="81" from-port="2" to-layer="83" to-port="0" />
2718
+ <edge from-layer="82" from-port="0" to-layer="83" to-port="1" />
2719
+ <edge from-layer="83" from-port="2" to-layer="85" to-port="0" />
2720
+ <edge from-layer="84" from-port="0" to-layer="85" to-port="1" />
2721
+ <edge from-layer="85" from-port="2" to-layer="86" to-port="0" />
2722
+ <edge from-layer="86" from-port="2" to-layer="88" to-port="0" />
2723
+ <edge from-layer="87" from-port="0" to-layer="88" to-port="1" />
2724
+ <edge from-layer="88" from-port="2" to-layer="90" to-port="0" />
2725
+ <edge from-layer="89" from-port="0" to-layer="90" to-port="1" />
2726
+ <edge from-layer="90" from-port="2" to-layer="92" to-port="0" />
2727
+ <edge from-layer="91" from-port="0" to-layer="92" to-port="1" />
2728
+ <edge from-layer="92" from-port="2" to-layer="94" to-port="0" />
2729
+ <edge from-layer="92" from-port="2" to-layer="102" to-port="1" />
2730
+ <edge from-layer="93" from-port="0" to-layer="94" to-port="1" />
2731
+ <edge from-layer="94" from-port="2" to-layer="96" to-port="0" />
2732
+ <edge from-layer="95" from-port="0" to-layer="96" to-port="1" />
2733
+ <edge from-layer="96" from-port="2" to-layer="97" to-port="0" />
2734
+ <edge from-layer="97" from-port="1" to-layer="99" to-port="0" />
2735
+ <edge from-layer="98" from-port="0" to-layer="99" to-port="1" />
2736
+ <edge from-layer="99" from-port="2" to-layer="101" to-port="0" />
2737
+ <edge from-layer="100" from-port="0" to-layer="101" to-port="1" />
2738
+ <edge from-layer="101" from-port="2" to-layer="102" to-port="0" />
2739
+ <edge from-layer="102" from-port="2" to-layer="104" to-port="0" />
2740
+ <edge from-layer="103" from-port="0" to-layer="104" to-port="1" />
2741
+ <edge from-layer="104" from-port="2" to-layer="106" to-port="0" />
2742
+ <edge from-layer="105" from-port="0" to-layer="106" to-port="1" />
2743
+ <edge from-layer="106" from-port="2" to-layer="108" to-port="0" />
2744
+ <edge from-layer="107" from-port="0" to-layer="108" to-port="1" />
2745
+ <edge from-layer="108" from-port="2" to-layer="110" to-port="0" />
2746
+ <edge from-layer="108" from-port="2" to-layer="118" to-port="0" />
2747
+ <edge from-layer="108" from-port="2" to-layer="142" to-port="1" />
2748
+ <edge from-layer="108" from-port="2" to-layer="126" to-port="0" />
2749
+ <edge from-layer="109" from-port="0" to-layer="110" to-port="1" />
2750
+ <edge from-layer="110" from-port="2" to-layer="112" to-port="0" />
2751
+ <edge from-layer="111" from-port="0" to-layer="112" to-port="1" />
2752
+ <edge from-layer="112" from-port="2" to-layer="114" to-port="0" />
2753
+ <edge from-layer="113" from-port="0" to-layer="114" to-port="1" />
2754
+ <edge from-layer="114" from-port="2" to-layer="116" to-port="0" />
2755
+ <edge from-layer="115" from-port="0" to-layer="116" to-port="1" />
2756
+ <edge from-layer="116" from-port="2" to-layer="133" to-port="0" />
2757
+ <edge from-layer="117" from-port="0" to-layer="118" to-port="1" />
2758
+ <edge from-layer="118" from-port="2" to-layer="120" to-port="0" />
2759
+ <edge from-layer="119" from-port="0" to-layer="120" to-port="1" />
2760
+ <edge from-layer="120" from-port="2" to-layer="122" to-port="0" />
2761
+ <edge from-layer="121" from-port="0" to-layer="122" to-port="1" />
2762
+ <edge from-layer="122" from-port="2" to-layer="124" to-port="0" />
2763
+ <edge from-layer="123" from-port="0" to-layer="124" to-port="1" />
2764
+ <edge from-layer="124" from-port="2" to-layer="133" to-port="1" />
2765
+ <edge from-layer="125" from-port="0" to-layer="126" to-port="1" />
2766
+ <edge from-layer="126" from-port="2" to-layer="128" to-port="0" />
2767
+ <edge from-layer="127" from-port="0" to-layer="128" to-port="1" />
2768
+ <edge from-layer="128" from-port="2" to-layer="130" to-port="0" />
2769
+ <edge from-layer="129" from-port="0" to-layer="130" to-port="1" />
2770
+ <edge from-layer="130" from-port="2" to-layer="132" to-port="0" />
2771
+ <edge from-layer="131" from-port="0" to-layer="132" to-port="1" />
2772
+ <edge from-layer="132" from-port="2" to-layer="133" to-port="2" />
2773
+ <edge from-layer="133" from-port="4" to-layer="135" to-port="0" />
2774
+ <edge from-layer="134" from-port="0" to-layer="135" to-port="1" />
2775
+ <edge from-layer="135" from-port="2" to-layer="137" to-port="0" />
2776
+ <edge from-layer="136" from-port="0" to-layer="137" to-port="1" />
2777
+ <edge from-layer="137" from-port="2" to-layer="139" to-port="0" />
2778
+ <edge from-layer="138" from-port="0" to-layer="139" to-port="1" />
2779
+ <edge from-layer="139" from-port="2" to-layer="141" to-port="0" />
2780
+ <edge from-layer="140" from-port="0" to-layer="141" to-port="1" />
2781
+ <edge from-layer="141" from-port="2" to-layer="142" to-port="0" />
2782
+ <edge from-layer="142" from-port="2" to-layer="144" to-port="0" />
2783
+ <edge from-layer="143" from-port="0" to-layer="144" to-port="1" />
2784
+ <edge from-layer="144" from-port="2" to-layer="146" to-port="0" />
2785
+ <edge from-layer="145" from-port="0" to-layer="146" to-port="1" />
2786
+ <edge from-layer="146" from-port="2" to-layer="148" to-port="0" />
2787
+ <edge from-layer="147" from-port="0" to-layer="148" to-port="1" />
2788
+ <edge from-layer="148" from-port="2" to-layer="150" to-port="0" />
2789
+ <edge from-layer="148" from-port="2" to-layer="158" to-port="1" />
2790
+ <edge from-layer="149" from-port="0" to-layer="150" to-port="1" />
2791
+ <edge from-layer="150" from-port="2" to-layer="152" to-port="0" />
2792
+ <edge from-layer="151" from-port="0" to-layer="152" to-port="1" />
2793
+ <edge from-layer="152" from-port="2" to-layer="153" to-port="0" />
2794
+ <edge from-layer="153" from-port="1" to-layer="155" to-port="0" />
2795
+ <edge from-layer="154" from-port="0" to-layer="155" to-port="1" />
2796
+ <edge from-layer="155" from-port="2" to-layer="157" to-port="0" />
2797
+ <edge from-layer="156" from-port="0" to-layer="157" to-port="1" />
2798
+ <edge from-layer="157" from-port="2" to-layer="158" to-port="0" />
2799
+ <edge from-layer="158" from-port="2" to-layer="160" to-port="0" />
2800
+ <edge from-layer="159" from-port="0" to-layer="160" to-port="1" />
2801
+ <edge from-layer="160" from-port="2" to-layer="162" to-port="0" />
2802
+ <edge from-layer="161" from-port="0" to-layer="162" to-port="1" />
2803
+ <edge from-layer="162" from-port="2" to-layer="164" to-port="0" />
2804
+ <edge from-layer="163" from-port="0" to-layer="164" to-port="1" />
2805
+ <edge from-layer="164" from-port="2" to-layer="166" to-port="0" />
2806
+ <edge from-layer="165" from-port="0" to-layer="166" to-port="1" />
2807
+ <edge from-layer="166" from-port="3" to-layer="168" to-port="0" />
2808
+ <edge from-layer="167" from-port="0" to-layer="168" to-port="1" />
2809
+ <edge from-layer="168" from-port="2" to-layer="170" to-port="0" />
2810
+ <edge from-layer="169" from-port="0" to-layer="170" to-port="1" />
2811
+ <edge from-layer="170" from-port="2" to-layer="171" to-port="0" />
2812
+ <edge from-layer="171" from-port="1" to-layer="173" to-port="0" />
2813
+ <edge from-layer="172" from-port="0" to-layer="173" to-port="1" />
2814
+ <edge from-layer="173" from-port="2" to-layer="174" to-port="0" />
2815
+ </edges>
2816
+ <rt_info>
2817
+ <Runtime_version value="2024.4.1-16618-643f23d1318-releases/2024/4" />
2818
+ <conversion_parameters>
2819
+ <framework value="pytorch" />
2820
+ <is_python_object value="True" />
2821
+ </conversion_parameters>
2822
+ <optimum>
2823
+ <optimum_intel_version value="1.20.1" />
2824
+ <optimum_version value="1.24.0" />
2825
+ <pytorch_version value="2.6.0+cu124" />
2826
+ <transformers_version value="4.52.0.dev0" />
2827
+ </optimum>
2828
+ </rt_info>
2829
+ </net>
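
For reference: the IR above ends in a pooler (a Gather that selects the first token, a 128x128 dense layer with tanh) followed by a single-logit classifier, so the exported graph can be scored directly with the OpenVINO runtime. The sketch below is illustrative only, not part of the commit: the local file paths, the example sentence pair, and the input tensor names (input_ids, attention_mask, token_type_ids, the usual BERT inputs) are assumptions that can be verified via model.inputs.

# Minimal sketch: score a (query, passage) pair with the exported IR.
# Assumes the repository files are available locally and that the graph
# exposes the usual BERT inputs -- verify with [inp.any_name for inp in model.inputs].
import openvino as ov
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(".")  # assumes tokenizer files ship alongside the IR
core = ov.Core()
model = core.read_model("openvino/openvino_model.xml")  # weights resolve to openvino_model.bin
compiled = core.compile_model(model, "CPU")

enc = tokenizer(
    "How many people live in Berlin?",
    "Berlin has a population of about 3.7 million.",
    return_tensors="np",
)
result = compiled({name: enc[name] for name in ("input_ids", "attention_mask", "token_type_ids")})
logits = result[compiled.output(0)]  # shape (batch, 1); the output tensor is named "logits" per the Result node above
print(float(logits[0, 0]))

Recent sentence-transformers releases can load the same file through their OpenVINO backend, e.g. CrossEncoder(model_id, backend="openvino"), assuming the installed version supports it; both paths execute the graph defined in this XML.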