Update README.md
README.md CHANGED
@@ -128,6 +128,27 @@ rmse = mean_squared_error(y_test, preds, squared=False)
print(f"RMSE: {rmse}")
```

---

## Example Notebooks
### 🚀 Interactive Baseline: XGBoost for Resource Estimation

We provide a sample baseline that uses **XGBoost** to demonstrate resource estimation (e.g., predicting `fit_time`) from the dataset's metadata; a minimal end-to-end sketch appears after the list below.

You can interactively explore and run this notebook on Google Colab:

[**Baseline_XGBoost_Resource_Estimation.ipynb**](https://huggingface.co/datasets/ICICLE-AI/ResourceEstimation_HLOGenCNN/blob/main/Baseline_XGBoost_Resource_Estimation.ipynb)

This notebook covers:
- Loading and preprocessing metadata from `dataset-new.csv`
- Training an XGBoost regressor to predict training time
- Evaluating model performance (e.g., RMSE)
- Guidance for extending to advanced models (e.g., incorporating HLO graph features)
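
The following is a minimal, self-contained sketch of that workflow. It assumes `dataset-new.csv` is in the working directory, uses `fit_time` as the regression target, and treats the remaining numeric columns as features; the notebook itself may preprocess the metadata differently.

```python
# Baseline sketch (assumptions: dataset-new.csv is local, the target column is `fit_time`,
# and the other numeric columns are usable as features; adapt names to the actual schema).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
from xgboost import XGBRegressor

# Load the metadata table and keep numeric columns only
df = pd.read_csv("dataset-new.csv")
numeric = df.select_dtypes(include="number").dropna()

# Separate the target (training time) from the features
y = numeric["fit_time"]
X = numeric.drop(columns=["fit_time"])

# Hold out a test split for evaluation
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train a plain XGBoost regressor as the baseline
model = XGBRegressor(n_estimators=300, max_depth=6, learning_rate=0.1, random_state=42)
model.fit(X_train, y_train)

# Evaluate with RMSE on the held-out split
preds = model.predict(X_test)
rmse = mean_squared_error(y_test, preds) ** 0.5
print(f"RMSE: {rmse:.4f}")
```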
> ⚡ **Note:** Make sure to adjust paths if you clone the dataset locally or integrate with the Hugging Face `datasets` API.
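
For example, a hypothetical way to fetch the metadata file programmatically, assuming `dataset-new.csv` is stored at the root of this dataset repository (adjust the filename/path if the layout differs):

```python
# Hypothetical download-and-load example using huggingface_hub;
# the exact location of dataset-new.csv inside the repo is an assumption.
from huggingface_hub import hf_hub_download
import pandas as pd

csv_path = hf_hub_download(
    repo_id="ICICLE-AI/ResourceEstimation_HLOGenCNN",
    filename="dataset-new.csv",  # adjust if the file lives in a subfolder
    repo_type="dataset",
)

df = pd.read_csv(csv_path)
print(df.shape)

# Alternatively, the CSV can be wrapped with the `datasets` library:
# from datasets import load_dataset
# ds = load_dataset("csv", data_files=csv_path)
```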
---

### Loading HLO Graph Features