Neural Network Implementation from Scratch

Developed a neural network library from scratch in JavaScript, implementing core machine learning concepts such as feedforward computation and backpropagation in two versions. The first version uses standard matrix operations with manual, layer-by-layer gradient calculations for backpropagation. The second builds an explicit computation graph and uses an automatic differentiation mechanism similar to PyTorch's autograd (inspired by Andrej Karpathy's Micrograd). Both versions optimize the model's parameters with stochastic gradient descent (SGD).
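The Micrograd-style approach described above can be sketched in a few lines: each scalar is wrapped in a node that records its inputs and a local backward rule, and calling `backward()` walks the graph in reverse topological order. This is a minimal illustration, not the library's actual API; the class and method names here are hypothetical.

```javascript
// Minimal Micrograd-style autodiff node (hypothetical names, for illustration).
class Value {
  constructor(data, children = []) {
    this.data = data;        // forward value
    this.grad = 0;           // accumulated gradient d(output)/d(this)
    this._children = children;
    this._backward = () => {}; // local chain-rule step, set by each op
  }
  add(other) {
    const out = new Value(this.data + other.data, [this, other]);
    out._backward = () => {
      this.grad += out.grad;   // d(a+b)/da = 1
      other.grad += out.grad;  // d(a+b)/db = 1
    };
    return out;
  }
  mul(other) {
    const out = new Value(this.data * other.data, [this, other]);
    out._backward = () => {
      this.grad += other.data * out.grad;  // d(a*b)/da = b
      other.grad += this.data * out.grad;  // d(a*b)/db = a
    };
    return out;
  }
  backward() {
    // Topologically sort the graph, then apply chain rule in reverse.
    const topo = [];
    const visited = new Set();
    const build = (v) => {
      if (!visited.has(v)) {
        visited.add(v);
        v._children.forEach(build);
        topo.push(v);
      }
    };
    build(this);
    this.grad = 1;
    topo.reverse().forEach((v) => v._backward());
  }
}

// Example: c = a*b + a, so dc/da = b + 1 = 4 and dc/db = a = 2.
const a = new Value(2);
const b = new Value(3);
const c = a.mul(b).add(a);
c.backward();
console.log(c.data, a.grad, b.grad); // 8 4 2
```

With gradients in hand, one SGD step is simply `p.data -= learningRate * p.grad` for every parameter node, which is what both versions of the library do during training.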

The library supports building customizable neural networks for a wide range of applications, with configurable layer counts, layer sizes, and activation functions. Users can experiment with different architectures, train models on diverse datasets, and inspect the internal workings of a network during training, making the project useful both as a learning tool and for practical machine learning in JavaScript environments.
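To illustrate the configurability described above, here is a sketch of how the matrix-based version's forward pass might look: a network is a list of layers, each a weight matrix, bias vector, and pluggable activation. The function and field names are assumptions for this example, not the library's actual interface.

```javascript
// Hypothetical forward pass for a configurable feedforward network.
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

// layers: [{ weights: number[][], biases: number[], activation: fn }, ...]
function forward(layers, input) {
  return layers.reduce((activations, layer) => {
    // Each output neuron: activation(weights · inputs + bias)
    return layer.weights.map((row, i) => {
      const z = row.reduce((sum, w, j) => sum + w * activations[j], layer.biases[i]);
      return layer.activation(z);
    });
  }, input);
}

// A 2-input, 2-hidden, 1-output network with example weights.
const net = [
  { weights: [[0.5, -0.5], [0.3, 0.8]], biases: [0, 0], activation: sigmoid },
  { weights: [[1.0, -1.0]], biases: [0.1], activation: sigmoid },
];
console.log(forward(net, [1, 0])); // a single sigmoid output in (0, 1)
```

Swapping in a different activation, or adding entries to `net`, changes the architecture without touching the forward-pass code, which is the flexibility the paragraph above refers to.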