ASL-TFLite-Edge

This repository contains a TensorFlow Lite model trained to recognize American Sign Language (ASL) fingerspelling gestures using hand landmark data. The model is optimized for real-time inference on edge devices.

🧠 Model Details

  • Format: TensorFlow Lite (.tflite)
  • Input: 64x64 RGB image (generated from hand landmarks via MediaPipe)
  • Output: Softmax probabilities over 59 ASL character classes (including a padding token)
  • Frameworks: TensorFlow, MediaPipe
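To sanity-check the input and output shapes listed above, you can load the model with the TensorFlow Lite interpreter and run a dummy forward pass. This is a minimal sketch that assumes you are in the repository root next to asl_model.tflite:

import numpy as np
import tensorflow as tf

# Load the TFLite model shipped in this repository.
interpreter = tf.lite.Interpreter(model_path="asl_model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Expected per the details above: input [1, 64, 64, 3], output [1, 59].
print("Input shape:", input_details[0]["shape"])
print("Output shape:", output_details[0]["shape"])

# Dummy forward pass with random data, just to confirm the model runs.
dummy = np.random.rand(*input_details[0]["shape"]).astype(input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
probs = interpreter.get_tensor(output_details[0]["index"])
print("Argmax class index:", int(np.argmax(probs)))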

πŸ“ Files Included

  • asl_model.tflite – The TFLite model file for ASL recognition
  • inference_args.json – JSON file listing the selected columns read from the .parquet landmark data at inference time
  • tflite_inference.py – Inference script to run predictions from raw .parquet landmark files

πŸš€ How to Run Inference

You can download the TFLite model directly from Hugging Face using the huggingface_hub library, or clone the full repository with Git LFS as shown below.
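A minimal sketch that downloads only the model file with hf_hub_download:

from huggingface_hub import hf_hub_download
import tensorflow as tf

# Download asl_model.tflite from this repository (cached locally by huggingface_hub).
model_path = hf_hub_download(
    repo_id="ColdSlim/ASL-TFLite-Edge",
    filename="asl_model.tflite",
)

# Load it with the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_path=model_path)
interpreter.allocate_tensors()
print("Model loaded from:", model_path)

If you also want the inference script and inference_args.json, clone the full repository instead: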

Clone the repository

git lfs install
git clone https://huggingface.co/ColdSlim/ASL-TFLite-Edge
cd ASL-TFLite-Edge

Requirements

pip install -r requirements.txt

Run the Script

python tflite_inference.py path/to/sample.parquet

Output

Predicted class index: 5

πŸ” You can map this class index back to the character using your char_to_num mapping used during training.

πŸ“Œ Example Workflow

  1. Extract right-hand landmark data with MediaPipe and store it in a .parquet file.

  2. Ensure it contains the same selected_columns as listed in inference_args.json (a quick check is sketched after this list).

  3. Run tflite_inference.py to get the predicted class.
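The landmark-to-image preprocessing itself happens inside tflite_inference.py, but step 2 can be verified up front. A minimal sketch, assuming inference_args.json stores the column names under a "selected_columns" key (open the file to confirm the actual key):

import json
import pandas as pd

# Assumption: inference_args.json keeps the column list under "selected_columns".
with open("inference_args.json") as f:
    selected_columns = json.load(f)["selected_columns"]

df = pd.read_parquet("path/to/sample.parquet")

missing = [c for c in selected_columns if c not in df.columns]
if missing:
    raise ValueError(f"Parquet file is missing {len(missing)} expected columns, e.g. {missing[:5]}")
print(f"All {len(selected_columns)} selected columns are present.")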

🧾 License

This project is licensed under the Apache 2.0 License.

πŸ‘¨β€πŸ’» Author

Developed by Manik Sheokand

For sign language fingerspelling recognition on edge devices using TensorFlow Lite.
