Update README.md
README.md CHANGED
@@ -31,8 +31,6 @@ Trained on a comprehensive dataset comprising 262,000 rows of paired natural lan

## Training procedure

-To detail the training procedure for AI2SQL, especially considering its specialized task of converting natural language questions to SQL queries and its basis on the Falcon-7b-instruct model, the following section can be included in the model card:
-
### Overview
AI2SQL was trained in a multi-stage process, starting with a pre-trained Falcon-7b-instruct model, a large transformer-based language model. This base model was then fine-tuned using a Parameter Efficient Fine-Tuning (PEFT) approach with Low-Rank Adaptation (LoRA) specifically for the task of translating natural language to SQL queries.
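For readers who want to set up a comparable fine-tune, the sketch below shows how a Falcon-7b-instruct base model can be wrapped with a LoRA adapter using the Hugging Face `transformers` and `peft` libraries. It is a minimal illustration, not the AI2SQL training script: the LoRA rank, alpha, dropout, and target modules are assumed values, not the hyperparameters used for this model.

```python
# Minimal PEFT/LoRA setup sketch (illustrative hyperparameters, not AI2SQL's).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base_model_id = "tiiuae/falcon-7b-instruct"

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
model = AutoModelForCausalLM.from_pretrained(base_model_id, trust_remote_code=True)

# Assumed LoRA configuration; Falcon's attention projection is named "query_key_value".
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["query_key_value"],
)

# Wrap the frozen base model with trainable low-rank adapters.
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

From here, the wrapped model can be passed to a standard `transformers` `Trainer` (or similar loop) over the paired natural-language/SQL examples.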