Text Classification
Transformers
Safetensors
English
bert
fill-mask
BERT
NeuroBERT
transformer
nlp
neurobert
edge-ai
low-resource
micro-nlp
quantized
iot
wearable-ai
offline-assistant
intent-detection
real-time
smart-home
embedded-systems
command-classification
toy-robotics
voice-ai
eco-ai
english
lightweight
mobile-nlp
ner
Update README.md
README.md
CHANGED
@@ -46,7 +46,7 @@ library_name: transformers
 # 🧠 NeuroBERT — The Brain of Lightweight NLP for Real-World Intelligence 🌍
 
 [](https://opensource.org/licenses/MIT)
-[](#)
+[](#)
 [](#)
 [](#)
 
@@ -72,10 +72,10 @@ library_name: transformers
 
 ## Overview
 
-`NeuroBERT` is an **advanced lightweight** NLP model derived from **google/bert-base-uncased**, optimized for **real-time inference** on **resource-constrained devices**. With a quantized size of **~
+`NeuroBERT` is an **advanced lightweight** NLP model derived from **google/bert-base-uncased**, optimized for **real-time inference** on **resource-constrained devices**. With a quantized size of **~57MB** and **~30M parameters**, it delivers powerful contextual language understanding for real-world applications in environments like mobile apps, wearables, microcontrollers, and smart home devices. Designed for **low-latency**, **offline operation**, and **real-world intelligence**, it’s ideal for privacy-first applications requiring robust intent detection, classification, and semantic understanding with limited connectivity.
 
 - **Model Name**: NeuroBERT
-- **Size**: ~
+- **Size**: ~57MB (quantized)
 - **Parameters**: ~30M
 - **Architecture**: Advanced BERT (8 layers, hidden size 256, 4 attention heads)
 - **Description**: Advanced 8-layer, 256-hidden
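A quick way to sanity-check the overview above is the standard `transformers` fill-mask pipeline. This is a minimal sketch, assuming the published checkpoint exposes the usual BERT masked-LM head; the example command is illustrative only:

```python
# Minimal fill-mask sketch; assumes boltuix/NeuroBERT loads like a standard
# BERT masked-LM checkpoint. The example sentence is illustrative only.
from transformers import pipeline

mlm = pipeline("fill-mask", model="boltuix/NeuroBERT")

# Predict the masked word in a smart-home style command.
for prediction in mlm("Turn [MASK] the living room lights."):
    print(prediction["token_str"], round(prediction["score"], 3))
```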
@@ -83,7 +83,7 @@ library_name: transformers
 
 ## Key Features
 
-- ⚡ **Lightweight Powerhouse**: ~
+- ⚡ **Lightweight Powerhouse**: ~57MB footprint fits devices with constrained storage while offering advanced NLP capabilities.
 - 🧠 **Deep Contextual Understanding**: Captures complex semantic relationships with an 8-layer architecture.
 - 📶 **Offline Capability**: Fully functional without internet access.
 - ⚙️ **Real-Time Inference**: Optimized for CPUs, mobile NPUs, and microcontrollers.
@@ -97,13 +97,13 @@ Install the required dependencies:
 pip install transformers torch
 ```
 
-Ensure your environment supports Python 3.6+ and has ~
+Ensure your environment supports Python 3.6+ and has ~57MB of storage for model weights.
 
 ## Download Instructions
 
 1. **Via Hugging Face**:
    - Access the model at [boltuix/NeuroBERT](https://huggingface.co/boltuix/NeuroBERT).
-   - Download the model files (~
+   - Download the model files (~57MB) or clone the repository:
    ```bash
    git clone https://huggingface.co/boltuix/NeuroBERT
    ```
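For the download step above, a local clone can be loaded directly by path, which keeps inference fully offline. A minimal sketch, assuming the repository was cloned into `./NeuroBERT` (the default directory name from `git clone`):

```python
# Load the cloned repository by local path so no network access is needed.
# "./NeuroBERT" is an assumption matching the default `git clone` directory.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

local_dir = "./NeuroBERT"
tokenizer = AutoTokenizer.from_pretrained(local_dir)
model = AutoModelForMaskedLM.from_pretrained(local_dir)
model.eval()  # inference only

inputs = tokenizer("Please [MASK] the door.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Decode the top prediction at the [MASK] position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))
```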
@@ -293,7 +293,7 @@ NeuroBERT is designed for **real-world intelligence** in **edge and IoT scenario
 ## Hardware Requirements
 
 - **Processors**: CPUs, mobile NPUs, or microcontrollers (e.g., Raspberry Pi, ESP32-S3)
-- **Storage**: ~
+- **Storage**: ~57MB for model weights (quantized for reduced footprint)
 - **Memory**: ~120MB RAM for inference
 - **Environment**: Offline or low-connectivity settings
 
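The storage and memory figures above assume quantized weights. The card does not state how the published quantization was produced; purely as an illustration, PyTorch dynamic INT8 quantization of the linear layers is one common way to shrink a BERT-class encoder for CPU-only inference:

```python
# Illustration only: dynamic INT8 quantization of the Linear layers, one
# common way to reduce a BERT-class encoder's CPU footprint. This is not
# necessarily how the published ~57MB weights were produced.
import torch
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("boltuix/NeuroBERT")
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)
torch.save(quantized.state_dict(), "neurobert_int8.pt")  # hypothetical output file
```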
@@ -391,7 +391,7 @@ To adapt NeuroBERT for custom IoT tasks (e.g., specific smart home commands):
 
 | Model | Parameters | Size | Edge/IoT Focus | Tasks Supported |
 |-----------------|------------|--------|----------------|-------------------------|
-| NeuroBERT | ~30M | ~
+| NeuroBERT | ~30M | ~57MB | High | MLM, NER, Classification |
 | NeuroBERT-Small | ~20M | ~50MB | High | MLM, NER, Classification |
 | NeuroBERT-Mini | ~7M | ~35MB | High | MLM, NER, Classification |
 | NeuroBERT-Tiny | ~4M | ~15MB | High | MLM, NER, Classification |
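The comparison table lists classification among the supported tasks, and the card's own fine-tuning section covers adapting the model to custom IoT commands. As a rough sketch only, a sequence-classification head can be attached like this; the head is randomly initialized and the intent labels are hypothetical, so fine-tuning on labeled commands is required before use:

```python
# Sketch of attaching a sequence-classification head for intent detection.
# The head weights are freshly initialized and the labels are placeholders,
# so this must be fine-tuned on labeled commands before it is useful.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

labels = ["lights_on", "lights_off", "set_temperature"]  # hypothetical intents
tokenizer = AutoTokenizer.from_pretrained("boltuix/NeuroBERT")
model = AutoModelForSequenceClassification.from_pretrained(
    "boltuix/NeuroBERT",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
)
# ...fine-tune with transformers' Trainer or a custom loop, then classify commands.
```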