In this article, a feed-forward neural network (FFN) is built using only NumPy to identify handwritten digits from the MNIST dataset. Prior proficiency in Python is assumed, and beginners are encouraged to watch 3Blue1Brown's videos on neural networks first.

The walkthrough covers how neurons, weights, and biases work and how to implement them in code step by step. Non-linearity is introduced through activation functions such as ReLU, which allows the network to learn more complex patterns. A loss function such as cross-entropy is used to evaluate the model's predictions and provide feedback for improvement. Backpropagation then computes gradients of the loss with respect to the weights and biases, using derivatives of the softmax and other functions, and updates the parameters accordingly. Training is done in mini-batches for efficiency and to avoid problems such as exploding gradients, and the number of epochs is tested and adjusted to balance accuracy against overfitting.

The article concludes with a working neural network that achieves 92.45% accuracy in identifying handwritten digits. Readers are encouraged to experiment with the parameters, test on different datasets, and explore further deep learning topics for continuous learning.
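
To make the neuron/weight/bias and ReLU ideas concrete, here is a minimal forward-pass sketch in NumPy. It is not the article's exact implementation; the function names, the hidden-layer size of 128, and the random initialization scale are illustrative assumptions.

```python
import numpy as np

def relu(z):
    # ReLU keeps positive values and zeroes out negatives,
    # introducing the non-linearity described above.
    return np.maximum(0.0, z)

def forward(x, W1, b1, W2, b2):
    # Hidden layer: each neuron computes a weighted sum plus a bias,
    # then passes the result through the activation function.
    h = relu(x @ W1 + b1)
    # Output layer: raw scores (logits), one per digit class.
    return h @ W2 + b2

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.01, (784, 128))  # 784 = 28x28 MNIST pixels (assumed layout)
b1 = np.zeros(128)
W2 = rng.normal(0, 0.01, (128, 10))   # 10 digit classes
b2 = np.zeros(10)

x = rng.random((1, 784))  # one placeholder "image" in place of real MNIST data
logits = forward(x, W1, b1, W2, b2)
```

Each row of `logits` holds one unnormalized score per digit; a softmax turns those scores into probabilities.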
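
The loss step can be sketched as follows: softmax converts logits to probabilities, and cross-entropy penalizes the model for assigning low probability to the correct class. This is a hedged sketch rather than the article's code; the `1e-12` epsilon and the example logits are assumptions for illustration.

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max before exponentiating for numerical stability.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    # Mean negative log-probability assigned to the correct class.
    n = probs.shape[0]
    return -np.log(probs[np.arange(n), labels] + 1e-12).mean()

logits = np.array([[2.0, 1.0, 0.1]])
probs = softmax(logits)      # rows sum to 1
labels = np.array([0])
loss = cross_entropy(probs, labels)

# A convenient identity used in backpropagation: the gradient of the
# loss with respect to the logits is (probs - one_hot(labels)) / n.
grad = probs.copy()
grad[np.arange(1), labels] -= 1.0
```

The softmax/cross-entropy pairing is popular precisely because of that simple gradient, which is why the derivative of the softmax shows up in the backpropagation step.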
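
Backpropagation and mini-batch training can be combined into one loop, sketched below for a single hidden layer. This is a minimal illustration under assumed hyperparameters (hidden size, learning rate, epoch count), not the article's implementation, and it omits the epoch tuning and accuracy evaluation the article describes.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train(X, y, hidden=32, lr=0.1, epochs=5, batch_size=16, seed=0):
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], int(y.max()) + 1
    W1 = rng.normal(0, 0.1, (n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, n_out)); b2 = np.zeros(n_out)
    for _ in range(epochs):
        idx = rng.permutation(len(X))  # new batch order each epoch
        for s in range(0, len(X), batch_size):
            b = idx[s:s + batch_size]
            xb, yb = X[b], y[b]
            # Forward pass.
            h = relu(xb @ W1 + b1)
            probs = softmax(h @ W2 + b2)
            # Backward pass: gradient of cross-entropy w.r.t. the logits
            # is (probs - one_hot(labels)), averaged over the batch.
            d_logits = probs
            d_logits[np.arange(len(b)), yb] -= 1.0
            d_logits /= len(b)
            dW2 = h.T @ d_logits; db2 = d_logits.sum(axis=0)
            dh = d_logits @ W2.T
            dh[h <= 0] = 0.0  # ReLU passes gradient only where it was active
            dW1 = xb.T @ dh; db1 = dh.sum(axis=0)
            # Gradient-descent update. Averaging over the batch keeps the
            # step size modest, which helps avoid exploding gradients.
            W1 -= lr * dW1; b1 -= lr * db1
            W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2
```

In practice one would run this on the real MNIST arrays, track loss per epoch, and stop increasing `epochs` once test accuracy plateaus, to avoid overfitting.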