---
license: apache-2.0
tags:
  - flight-planning
  - transformer
  - coordinate-prediction
  - sequence-to-sequence
  - count-classification
---

# Flight Plan Coordinate Prediction Model (Seq2SeqCoordsTransformer)

This encoder-decoder transformer was trained for an AI flight-planning project. It predicts normalized waypoint coordinates directly and predicts the number of waypoints via classification.

## Model Description

The Seq2SeqCoordsTransformer architecture is built on `torch.nn.Transformer`. It predicts normalized latitude/longitude coordinates autoregressively and predicts the waypoint count (0-10) with a classification head on the encoder output.

- Embedding dim: 256
- Attention heads: 8
- Encoder layers: 4
- Decoder layers: 4
- Max waypoints: 10
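
A minimal sketch of what this architecture might look like, assuming only the hyperparameters listed above; the actual class in the repository may differ (input feature size, positional encoding, and padding masks are simplified or omitted here):

```python
import torch
import torch.nn as nn


class Seq2SeqCoordsTransformer(nn.Module):
    """Hypothetical reconstruction from the listed hyperparameters.
    The real implementation may differ; positional encoding is omitted."""

    def __init__(self, d_model=256, nhead=8, num_encoder_layers=4,
                 num_decoder_layers=4, max_waypoints=10, in_features=2):
        super().__init__()
        self.src_proj = nn.Linear(in_features, d_model)  # assumed encoder input size
        self.tgt_proj = nn.Linear(2, d_model)            # embed (lat, lon) pairs
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_encoder_layers,
            num_decoder_layers=num_decoder_layers,
            batch_first=True,
        )
        self.coord_head = nn.Linear(d_model, 2)                  # regress lat/lon
        self.count_head = nn.Linear(d_model, max_waypoints + 1)  # classes 0..10

    def forward(self, src, tgt):
        # Causal mask so each decoder step attends only to earlier waypoints
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))
        memory = self.transformer.encoder(self.src_proj(src))
        decoded = self.transformer.decoder(self.tgt_proj(tgt), memory,
                                           tgt_mask=tgt_mask)
        coords = self.coord_head(decoded)                    # (B, T, 2)
        # Waypoint count is classified from the pooled encoder output
        count_logits = self.count_head(memory.mean(dim=1))   # (B, 11)
        return coords, count_logits
```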

## Intended Use

Research prototype only. Not intended for real-world navigation.

## Limitations

Accuracy depends on the training data and tuning. The maximum number of waypoints is fixed at 10. The model is not certified for operational use. The architecture differs significantly from previous versions in this repository.

## How to Use

Using the model requires loading the custom Seq2SeqCoordsTransformer class and its weights. Generation involves decoding coordinates autoregressively and taking the argmax of the count logits to obtain the predicted waypoint count.

For a full walkthrough, read [this article](https://medium.com/ai-simplified-in-plain-english/building-a-transformer-model-with-seq2seq-architecture-for-flight-planning-0bdd1fecaefe).
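
The decoding loop described above could be sketched as follows; `generate_waypoints` and the `model(src, tgt) -> (coords, count_logits)` interface are assumptions for illustration, not the project's exact API:

```python
import torch


@torch.no_grad()
def generate_waypoints(model, src, max_waypoints=10, start_token=(0.0, 0.0)):
    """Greedy autoregressive decoding (hypothetical interface):
    feed each predicted coordinate back into the decoder, and stop at the
    count given by the argmax of the classification head's logits."""
    model.eval()
    tgt = torch.tensor([[start_token]])          # (1, 1, 2) decoder seed
    _, count_logits = model(src, tgt)
    n = int(count_logits.argmax(dim=-1).item())  # predicted count in 0..10
    for _ in range(min(n, max_waypoints)):
        coords, _ = model(src, tgt)
        next_coord = coords[:, -1:, :]           # last step's prediction
        tgt = torch.cat([tgt, next_coord], dim=1)
    return tgt[:, 1:, :], n                      # drop the seed token
```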

## Training Data

Trained on [frankmorales2020/flight_plan_waypoints](https://huggingface.co/datasets/frankmorales2020/flight_plan_waypoints).

## Contact

[Frank Morales, BEng, MEng, SMIEEE (Boeing ATF)](https://www.linkedin.com/in/frank-morales1964/)