---
language:
- en
- ru
license: mit
library_name: other
tags:
- spiking-neural-network
- object-detection
- event-based
- yolo
- neuromorphic
pipeline_tag: object-detection
---

# TWL Spike Yolo
**TWL Spike Yolo** is a spiking neural network (SNN) for real-time object detection with event-based vision. The model adapts the YOLOv8 architecture to streams of event data, enabling efficient processing in neuromorphic computing environments.

<p align="center">
<img src="https://huggingface.co/KirillHit/twl_spike_yolo/resolve/main/assets/gen1_example.gif" alt="Demo GIF"/><br>
<em>Demonstration of model performance on the Gen1 dataset</em>
</p>

This approach leverages the low latency and power efficiency of SNNs to detect objects in fast-changing visual scenes. The model also explores multimodal fusion, combining event-based and frame-based inputs to improve detection accuracy under challenging conditions such as motion blur or low light.
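
For context on the input format: an event camera produces an asynchronous stream of `(x, y, timestamp, polarity)` tuples, which is commonly accumulated into dense, time-binned tensors before being passed to a network. The snippet below is only a generic NumPy illustration of that preprocessing step, using the Gen1 sensor resolution (304×240); the bin count and the exact encoding used by TWL Spike Yolo are defined in the GitHub repository and may differ.

```python
import numpy as np

def events_to_tensor(xs, ys, ts, ps, num_bins=5, height=240, width=304):
    """Accumulate an event stream into a (num_bins, 2, H, W) count tensor.

    xs, ys -- integer pixel coordinates of each event (NumPy arrays)
    ts     -- timestamps in any monotonic unit
    ps     -- polarities encoded as 0 (OFF) or 1 (ON)
    The default resolution matches the Gen1 sensor (304x240).
    """
    tensor = np.zeros((num_bins, 2, height, width), dtype=np.float32)
    t0, t1 = ts.min(), ts.max()
    # Assign every event to a temporal bin over [t0, t1]
    bins = np.clip(((ts - t0) / max(t1 - t0, 1e-9) * num_bins).astype(int), 0, num_bins - 1)
    # Count events per (time bin, polarity, row, column) cell
    np.add.at(tensor, (bins, ps, ys, xs), 1.0)
    return tensor

# Example with random events at Gen1 resolution
rng = np.random.default_rng(0)
n = 10_000
x, y = rng.integers(0, 304, n), rng.integers(0, 240, n)
t, p = np.sort(rng.integers(0, 50_000, n)), rng.integers(0, 2, n)
print(events_to_tensor(x, y, t, p).shape)  # (5, 2, 240, 304)
```

Each time bin then acts as one step in the temporal sequence that the spiking layers integrate over; how frame-based inputs are fused with these tensors is described in the repository.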
## Highlights

- **Architecture**: YOLOv8-inspired spiking neural network (a minimal spiking block is sketched after this list).
- **Input**: Event data from neuromorphic (event-based) cameras, optionally combined with standard image frames.
- **Use case**: High-speed, low-latency object detection with improved energy efficiency.
- **Applications**: Robotics, autonomous driving, surveillance, and edge devices using neuromorphic hardware.
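
To make the spiking architecture above concrete, here is a minimal leaky integrate-and-fire (LIF) convolutional block in plain PyTorch that consumes a sequence of event tensors step by step. It is only an inference-style sketch with a hard threshold and no surrogate gradient; the actual neuron model, layers, and YOLOv8-style detection head are defined in the GitHub repository.

```python
import torch
import torch.nn as nn

class LIFConv(nn.Module):
    """Convolution followed by leaky integrate-and-fire neurons.

    The membrane potential decays by `beta` at each time step, accumulates
    the convolved input, and emits a binary spike when it crosses `threshold`.
    """

    def __init__(self, in_ch, out_ch, beta=0.9, threshold=1.0):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.beta = beta
        self.threshold = threshold

    def forward(self, x_seq):
        # x_seq: (T, B, C, H, W) -- a sequence of binned event tensors
        mem = None
        spikes = []
        for x in x_seq:
            cur = self.conv(x)
            mem = cur if mem is None else self.beta * mem + cur
            spk = (mem >= self.threshold).float()  # hard threshold (no surrogate gradient)
            mem = mem - spk * self.threshold       # soft reset after a spike
            spikes.append(spk)
        return torch.stack(spikes)                 # (T, B, out_ch, H, W)

# Example: 5 time bins, batch of 1, 2 polarity channels, Gen1 resolution
out = LIFConv(in_ch=2, out_ch=16)(torch.rand(5, 1, 2, 240, 304))
print(out.shape)  # torch.Size([5, 1, 16, 240, 304])
```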
## Source Code

The implementation, training scripts, and inference tools are available in the GitHub repository:
[https://github.com/KirillHit/twl_spike_yolo](https://github.com/KirillHit/twl_spike_yolo)
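
To fetch the files hosted in this Hugging Face repository (for example, the demo assets) alongside the GitHub code, a plain `huggingface_hub` download is sufficient. The snippet below is a generic sketch and does not assume any particular checkpoint filename or loading API.

```python
from huggingface_hub import snapshot_download

# Download the contents of this model repository into the local HF cache
local_dir = snapshot_download(repo_id="KirillHit/twl_spike_yolo")
print(f"Files downloaded to: {local_dir}")
```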

---