SecRoBERTa

This is the pretrained model presented in SecBERT: A Pretrained Language Model for Cyber Security Text, which is a SecRoBERTa model trained on cyber security text.

The training corpus consists of papers drawn from cyber security sources.

SecRoBERTa has its own wordpiece vocabulary (secvocab), built to best match the training corpus.
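As a quick illustration, the secvocab tokenizer can be loaded and inspected with Hugging Face transformers. This is a minimal sketch: the model id is the one for this card, and the example terms (and their exact subword splits) are illustrative.

```python
from transformers import AutoTokenizer

# Load the tokenizer that ships with SecRoBERTa (the "secvocab" vocabulary).
tokenizer = AutoTokenizer.from_pretrained("jackaduma/SecRoBERTa")

# Domain terms should split into fewer subword pieces than under a
# general-purpose vocabulary; the exact splits here are illustrative.
print(tokenizer.tokenize("ransomware exfiltration phishing"))
```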

We trained both SecBERT and SecRoBERTa versions.

Available models include:

- SecBERT
- SecRoBERTa

Fill Mask

We propose to build a language model that works on cyber security text; as a result, it can improve downstream tasks (NER, text classification, semantic understanding, Q&A) in the cyber security domain.
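As a hedged illustration of such downstream use, the sketch below attaches an untrained classification head to SecRoBERTa via transformers. The label count and example input are hypothetical; the head must be fine-tuned on labeled data before its outputs mean anything.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "jackaduma/SecRoBERTa"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Attach a fresh (untrained) 2-way classification head on top of the encoder;
# the label count and the input sentence are hypothetical.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer("Suspicious PowerShell activity detected on host.", return_tensors="pt")
logits = model(**inputs).logits
print(logits)  # meaningless until the head is fine-tuned on labeled data
```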

First, the figure below shows the fill-mask pipeline in Google BERT, AllenAI SciBERT, and our SecBERT.

[Figure: fill-mask results for Google BERT, AllenAI SciBERT, and SecBERT]
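A minimal sketch of running such a fill-mask query with the transformers pipeline; the example sentence is illustrative, and `<mask>` is the RoBERTa-style mask token.

```python
from transformers import pipeline

# A fill-mask pipeline with SecRoBERTa (model id taken from this card).
fill_mask = pipeline("fill-mask", model="jackaduma/SecRoBERTa")

# RoBERTa-style models use "<mask>" as the mask token; the sentence is illustrative.
preds = fill_mask("The malware communicates with its command and control <mask>.")

# Each prediction carries the filled-in token and its probability score.
for p in preds:
    print(f"{p['token_str']:>12}  score={p['score']:.4f}")
```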

The original repo can be found here.
