A neural network (NN) is a collection of layers of interconnected neurons, loosely inspired by the human brain. The simplest form is a shallow network with a single hidden layer, while modern deep learning relies on deep neural networks (DNNs) that stack multiple hidden layers.
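As a rough sketch of that structural difference, the only change is how many hidden layers sit between input and output. The layer sizes below are arbitrary illustration values, not a recommendation:

```python
import numpy as np

# Layer sizes are arbitrary illustration values.
shallow = [4, 8, 1]          # input -> one hidden layer -> output
deep    = [4, 16, 16, 8, 1]  # input -> several hidden layers -> output

def init_params(layer_sizes, seed=0):
    """One weight matrix and bias vector per pair of adjacent layers."""
    rng = np.random.default_rng(seed)
    return [(rng.normal(scale=0.1, size=(n_out, n_in)), np.zeros((n_out, 1)))
            for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]

print(len(init_params(shallow)), "weight layers in the shallow net")  # 2
print(len(init_params(deep)), "weight layers in the deep net")        # 4
```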
🔹 Logistic Regression is the building block of binary classification: it feeds a weighted sum of the inputs through a sigmoid to output a probability.
🔹 Perceptrons act as simple linear classifiers, but a single perceptron cannot handle data that is not linearly separable (the classic XOR case).
🔹 Sigmoid, ReLU, and Tanh activation functions add the non-linearity that lets a network learn complex decision boundaries (see the sketch after this list).
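The three activations above, plus a logistic-regression-style prediction, fit in a few lines of NumPy. The weights, bias, and input here are made-up numbers for illustration, not trained values:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into (0, 1); used for probabilities.
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Passes positive values through, zeroes out negatives.
    return np.maximum(0.0, z)

def tanh(z):
    # Squashes values into (-1, 1), centred at zero.
    return np.tanh(z)

# Logistic regression = weighted sum + sigmoid.
w = np.array([0.5, -1.2, 0.3])   # illustration values
b = 0.1
x = np.array([1.0, 0.4, 2.0])

z = np.dot(w, x) + b             # linear part
p = sigmoid(z)                   # probability that the label is 1
print(f"P(y=1 | x) = {p:.3f}")
```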
📌 Forward Propagation – Passes the inputs through each layer in turn to compute the activations and the final prediction.
📌 Backward Propagation – Propagates the prediction error back through the network to compute the gradient of the loss with respect to every weight.
📌 Gradient Descent – Repeatedly nudges each weight against its gradient to reduce the loss; the training-loop sketch below ties all three steps together.
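Here is a minimal sketch of a one-hidden-layer network trained with plain gradient descent on a toy dataset. XOR is just a convenient example, and the hidden size, learning rate, and iteration count are arbitrary choices, not tuned values:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy dataset: XOR, which a single perceptron cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # (4, 2)
y = np.array([[0], [1], [1], [0]], dtype=float)                # (4, 1)

# One hidden layer with 4 units (arbitrary size).
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros((1, 1))

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5  # learning rate (illustrative value)

for step in range(5000):
    # --- Forward propagation: compute predictions layer by layer ---
    z1 = X @ W1 + b1
    a1 = np.tanh(z1)               # hidden activations
    z2 = a1 @ W2 + b2
    a2 = sigmoid(z2)               # predicted probabilities
    loss = np.mean((a2 - y) ** 2)  # MSE keeps the math short here

    # --- Backward propagation: gradients of the loss w.r.t. each weight ---
    d_a2 = 2 * (a2 - y) / len(X)
    d_z2 = d_a2 * a2 * (1 - a2)            # sigmoid derivative
    d_W2 = a1.T @ d_z2
    d_b2 = d_z2.sum(axis=0, keepdims=True)
    d_a1 = d_z2 @ W2.T
    d_z1 = d_a1 * (1 - a1 ** 2)            # tanh derivative
    d_W1 = X.T @ d_z1
    d_b1 = d_z1.sum(axis=0, keepdims=True)

    # --- Gradient descent: step each weight against its gradient ---
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

print("final loss:", round(float(loss), 4))
print("predictions:", a2.round(2).ravel())
```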
🔥 Pro Tip: ReLU (Rectified Linear Unit) is the most widely used activation for hidden layers because it is cheap to compute and does not saturate for positive inputs, which keeps gradients flowing and speeds up training.
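One way to see that difference: ReLU's gradient stays at 1 for any positive input, while sigmoid's gradient shrinks towards zero as inputs grow, which slows learning in deep networks. A quick sketch with arbitrary sample inputs:

```python
import numpy as np

z = np.array([0.5, 2.0, 5.0, 10.0])  # arbitrary pre-activation values

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
sigmoid_grad = sigmoid(z) * (1 - sigmoid(z))   # saturates towards 0 for large z
relu_grad = (z > 0).astype(float)              # constant 1 for positive z

print("sigmoid grad:", sigmoid_grad.round(4))  # e.g. 0.235, 0.105, 0.0066, ...
print("relu grad:   ", relu_grad)              # 1. 1. 1. 1.
```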