MuzzammilShah committed on
Commit da77ccb · verified · 1 Parent(s): 525fbd4

Update README.md

Files changed (1):
  1. README.md +31 -71
README.md CHANGED
@@ -1,72 +1,32 @@
- ## SET 1 - MICROGRAD 🔗
-
- [![Documentation](https://img.shields.io/badge/Documentation-Available-blue)](https://muzzammilshah.github.io/Road-to-GPT/Micrograd/)
- ![Number of Commits](https://img.shields.io/github/commit-activity/m/MuzzammilShah/NeuralNetworks-Micrograd-Implementation?label=Commits)
- [![Last Commit](https://img.shields.io/github/last-commit/MuzzammilShah/NeuralNetworks-Micrograd-Implementation.svg?style=flat)](https://github.com/MuzzammilShah/NeuralNetworks-Micrograd-Implementation/commits/main)
- ![Project Status](https://img.shields.io/badge/Status-Done-success)
-
- &nbsp;
-
- ### **Overview**
- This repository contains the implementation of **Backpropagation** using an **AutoGrad Engine**, inspired by the **Micrograd** video by Andrej Karpathy. It explores the foundations of training neural networks and implementing key operations from scratch.
-
- The repository contains:
-
- - **Manual Backpropagation**: Building intuition and understanding of the gradient calculation process.
- - **Interactive Site Version**: A pilot version of an interactive site that visualizes the functionality, currently under development.
-
- ✍🏻 Notes: Follow the notebooks in order for a structured learning path. Each notebook and note corresponds to a particular concept or milestone in the implementation.
-
- &nbsp;
-
- ### **🗂️ Repository Structure**
-
- ```plaintext
- ├── .gitignore
- ├── README.md
- ├── notes/
- │ ├── A-main-video-lecture-notes.md
- │ ├── chatgpt-motivation.md
- │ ├── crux-node-backpropagation.md
- │ ├── expanding-tanh-and-adding-more-operations.md
- │ ├── micrograd-functionality.md
- │ ├── multi-layer-perceptron.md
- │ ├── neurons-explanation.md
- │ ├── pytorch-comparision.md
- │ └── value-object-creation.md
- ├── site/
- │ ├── interactive_site_pilot_v1.2/
- ├── 1-derivative-simple-function.ipynb
- ├── 2-derivative-function-with-multiple-inputs.ipynb
- ├── 3-value-object.ipynb
- ├── 3_1-graph-visualisation.ipynb
- ├── 4_0-manual-backpropagation_simpleExpression.ipynb
- ├── ... (more implementation notebooks, there are a lot lol)
- ```
-
- - **Notes Directory**: Contains Markdown files with notes and explanations for each topic.
- - **Interactive Site Directory**: Contains files for the pilot version of the interactive visualization tool.
- - **Implementation Notebooks**: Step-by-step code for implementing and understanding backpropagation and related concepts.
-
- &nbsp;
-
- ### **📄 Instructions**
-
- 1. Start by reading the notes in the `notes/` directory for a theoretical understanding.
- 2. Proceed with the notebooks in the root directory in order to build up the implementation step by step.
- 3. Explore the `site/` directory for the pilot interactive version of the AutoGrad Engine visualization (Idea concept, not yet implemented)
-
- &nbsp;
-
- ### **⭐ Documentation**
-
- For a better reading experience and detailed notes, visit my **[Road to GPT Documentation Site](https://muzzammilshah.github.io/Road-to-GPT/Micrograd/)**.
-
- > **💡 Pro Tip**: This site provides an interactive and visually rich explanation of the notes and code. It is highly recommended you view this project from there.
-
- &nbsp;
-
- ### **✍🏻 Acknowledgments**
- Notes and implementations inspired by the **Micrograd** video by [Andrej Karpathy](https://karpathy.ai/).
-
  For more of my projects, visit my [Portfolio Site](https://muhammedshah.com).
 
+ ---
+ license: mit
+ datasets: []
+ language:
+ - en
+ model_name: Micrograd AutoGrad Engine
+ library_name: pytorch
+ tags:
+ - micrograd
+ - autograd
+ - backpropagation
+ - neural-networks
+ - andrej-karpathy
+ ---
+
+ # Micrograd AutoGrad Engine: Backpropagation Implementation
+
+ This repository contains the implementation of **Backpropagation** using an **AutoGrad Engine**, inspired by the **Micrograd** video by Andrej Karpathy. It explores the foundations of training neural networks and implementing key operations from scratch.
+
+ ## Overview
+ - **Manual Backpropagation**: Building intuition and understanding of the gradient calculation process.
+ - **Implementation Notebooks**: Step-by-step code for implementing and understanding backpropagation and related concepts.
+
+ ## Documentation
+ For a better reading experience and detailed notes, visit my **[Road to GPT Documentation Site](https://muzzammilshah.github.io/Road-to-GPT/Micrograd/)**.
+
+ > **💡 Pro Tip**: This site provides an interactive and visually rich explanation of the notes and code. It is highly recommended that you view this project there.
+
+ ## Acknowledgments
+ Notes and implementations inspired by the **Micrograd** video by [Andrej Karpathy](https://karpathy.ai/).
+
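Aside (not part of the commit): for readers skimming this diff, the kind of scalar autograd engine the README describes can be sketched in a few dozen lines of Python. This is a hypothetical minimal example in the spirit of Karpathy's micrograd, not the repository's actual code; all names here (`Value`, `_backward`, `_prev`) are illustrative.

```python
import math

class Value:
    """A scalar that records the operations producing it, so gradients
    can later be propagated backward through the computation graph."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # closure that applies the local chain rule
        self._prev = set(_children)     # parent nodes in the graph

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(out)/d(self) = 1
            other.grad += out.grad      # d(out)/d(other) = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t ** 2) * out.grad  # d/dx tanh(x) = 1 - tanh^2(x)
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Example: a single tanh neuron-style expression, y = tanh(a*b + c)
a, b, c = Value(2.0), Value(-3.0), Value(10.0)
y = (a * b + c).tanh()
y.backward()  # fills in a.grad, b.grad, c.grad via the chain rule
```

Note how each operator stores a `_backward` closure instead of computing gradients eagerly; calling `backward()` once on the output then accumulates gradients for every node, which is the same design PyTorch's autograd uses at tensor scale.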