This project implements a basic neural network from scratch using NumPy. It includes synthetic data generation, a feedforward neural network trained with backpropagation, and visualization tools. The network supports gradient-descent optimization with ReLU and Softmax activations, along with loss calculation, accuracy evaluation, and dynamic learning-rate decay.
- Synthetic Data Generation: Generates a spiral dataset for multi-class classification.
- Fully Connected Neural Network: Implements dense layers with forward and backward propagation.
- Activation Functions:
- ReLU: Rectified Linear Unit for hidden layers.
- Softmax: Activation for the output layer for multi-class classification.
- Loss Function: Categorical Cross-Entropy Loss with gradient computation.
- Training Features:
- Backpropagation with gradient descent optimization.
- Dynamic learning rate adjustment via exponential decay.
- Accuracy calculation to monitor model performance.
- Visualization: Plotting tools for dataset and training insights.
.
├── docs
├── src
│   ├── layers
│   │   ├── dataset.py         # Generates synthetic spiral dataset
│   │   ├── loss.py            # Loss function implementation (e.g., cross-entropy)
│   │   ├── Network.py         # Core neural network implementation
│   │   ├── plot.py            # Visualization utilities for datasets
│   │   └── softmax.py         # Softmax activation example
│   └── utils
│       ├── network_data.json  # Sample JSON data for network configurations
│       ├── network_data.py    # Handles network configuration data
│       ├── Visualization.py   # Visualization of neural network structure
│       └── __init__.py
├── .gitignore
├── readme.md
└── requirements.txt
Generates a spiral dataset with customizable parameters:
- Number of classes
- Samples per class
- Random noise for realistic data distribution
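A spiral dataset with these parameters can be generated along the following lines; the function name `spiral_data` and the exact noise handling here are illustrative, not necessarily the project's implementation in `dataset.py`:

```python
import numpy as np

def spiral_data(samples, classes, noise=0.2):
    """Generate a 2-D spiral dataset with `samples` points per class."""
    X = np.zeros((samples * classes, 2))
    y = np.zeros(samples * classes, dtype=int)
    for class_number in range(classes):
        ix = range(samples * class_number, samples * (class_number + 1))
        r = np.linspace(0.0, 1.0, samples)  # radius grows outward
        t = np.linspace(class_number * 4, (class_number + 1) * 4, samples)
        t += np.random.randn(samples) * noise  # jitter the angle for realism
        X[ix] = np.c_[r * np.sin(t * 2.5), r * np.cos(t * 2.5)]
        y[ix] = class_number
    return X, y

X, y = spiral_data(samples=100, classes=3)
print(X.shape, y.shape)  # (300, 2) (300,)
```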
Core neural network implementation:
- Layer_Dense: Implements fully connected layers.
- Activation_ReLU: ReLU activation for non-linearity.
- Activation_Softmax: Softmax activation for output classification.
- Loss_CategoricalCrossEntropy: Computes loss and gradients.
- Learning Rate Decay: Adjusts learning rate dynamically using exponential decay.
- Accuracy Calculation: Evaluates model accuracy against true labels.
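A minimal sketch of the dense layer and ReLU classes listed above (the class names mirror the list; the 0.01 weight-initialization scale is an assumption, and the real `Network.py` may differ in detail):

```python
import numpy as np

class Layer_Dense:
    def __init__(self, n_inputs, n_neurons):
        # Small random weights, zero biases (scale 0.01 is an assumption)
        self.weights = 0.01 * np.random.randn(n_inputs, n_neurons)
        self.biases = np.zeros((1, n_neurons))

    def forward(self, inputs):
        self.inputs = inputs
        self.output = inputs @ self.weights + self.biases

    def backward(self, dvalues):
        # Gradients w.r.t. weights, biases, and the layer's inputs
        self.dweights = self.inputs.T @ dvalues
        self.dbiases = dvalues.sum(axis=0, keepdims=True)
        self.dinputs = dvalues @ self.weights.T

class Activation_ReLU:
    def forward(self, inputs):
        self.inputs = inputs
        self.output = np.maximum(0, inputs)

    def backward(self, dvalues):
        # Gradient passes through only where the input was positive
        self.dinputs = dvalues.copy()
        self.dinputs[self.inputs <= 0] = 0
```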
Handles loss functions such as categorical cross-entropy with gradient computation.
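A hedged sketch of categorical cross-entropy with its gradient, assuming integer class labels (the actual interface in `loss.py` may differ):

```python
import numpy as np

class Loss_CategoricalCrossEntropy:
    def forward(self, y_pred, y_true):
        # Clip predicted probabilities to avoid log(0)
        clipped = np.clip(y_pred, 1e-7, 1 - 1e-7)
        # Pick each sample's predicted probability for its true class
        confidences = clipped[range(len(y_pred)), y_true]
        return -np.log(confidences)

    def backward(self, dvalues, y_true):
        samples, labels = dvalues.shape
        one_hot = np.eye(labels)[y_true]
        # d(-log p_true)/dp, averaged over the batch
        self.dinputs = -one_hot / dvalues / samples

loss_fn = Loss_CategoricalCrossEntropy()
probs = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
targets = np.array([0, 1])
print(loss_fn.forward(probs, targets).mean())
```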
Visualizes the generated spiral dataset using Matplotlib.
Demonstrates the behavior of the softmax activation function.
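A numerically stable softmax looks roughly like this (a sketch in the spirit of `softmax.py`, not necessarily its exact code):

```python
import numpy as np

def softmax(logits):
    # Subtract each row's max before exponentiating for numerical stability
    shifted = np.exp(logits - logits.max(axis=1, keepdims=True))
    return shifted / shifted.sum(axis=1, keepdims=True)

probs = softmax(np.array([[1.0, 2.0, 3.0]]))
print(probs.round(3))  # each row sums to 1
```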
Manages configurations and parameters of the neural network stored in network_data.json.
Provides an interactive visualization of the neural network structure using Plotly.
- Clone the repository:
  git clone https://github.com/your-repo/neural-network-from-scratch.git
  cd neural-network-from-scratch
- Install dependencies:
pip install -r requirements.txt
Run the Network.py script to train the model:
python src/layers/Network.py
This performs the following steps:
- Generates a synthetic spiral dataset.
- Initializes the neural network with two dense layers.
- Trains the model for a specified number of epochs.
- Displays the following outputs:
- Softmax probabilities for the first 5 samples.
- Final loss value.
- Model accuracy.
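The training steps above can be sketched end-to-end as follows. The random toy data, layer sizes (2 → 64 → 3), learning rate, and epoch count are illustrative stand-ins, not the project's actual configuration:

```python
import numpy as np

np.random.seed(0)

# Toy stand-in for the spiral dataset: 2-D points, 3 classes
X = np.random.randn(300, 2)
y = np.random.randint(0, 3, 300)

# Two dense layers (sizes are assumptions for this sketch)
W1 = 0.01 * np.random.randn(2, 64); b1 = np.zeros((1, 64))
W2 = 0.01 * np.random.randn(64, 3); b2 = np.zeros((1, 3))
lr = 1.0

for epoch in range(100):
    # Forward pass: dense -> ReLU -> dense -> softmax
    z1 = X @ W1 + b1
    a1 = np.maximum(0, z1)
    z2 = a1 @ W2 + b2
    exp = np.exp(z2 - z2.max(axis=1, keepdims=True))
    probs = exp / exp.sum(axis=1, keepdims=True)

    loss = -np.log(np.clip(probs[range(len(y)), y], 1e-7, 1)).mean()
    accuracy = (probs.argmax(axis=1) == y).mean()

    # Backward pass: combined softmax + cross-entropy gradient
    dz2 = probs.copy()
    dz2[range(len(y)), y] -= 1
    dz2 /= len(y)
    dW2 = a1.T @ dz2; db2 = dz2.sum(axis=0, keepdims=True)
    da1 = dz2 @ W2.T
    da1[z1 <= 0] = 0  # ReLU gradient
    dW1 = X.T @ da1; db1 = da1.sum(axis=0, keepdims=True)

    # Vanilla gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"Final Loss: {loss:.4f}  Accuracy: {accuracy:.1%}")
```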
To visualize the synthetic dataset:
python src/layers/plot.py
To generate an interactive visualization of the neural network architecture:
python src/utils/Visualization.py
This will create an interactive HTML file showing the network's layers, nodes, and connections.
- Implements exponential learning rate decay during training.
- Implements the full forward and backward passes for gradient computation.
- Adds accuracy evaluation by comparing predictions to true labels.
- Introduces Plotly-based visualization to display the network architecture dynamically.
- Incorporates stochastic gradient descent as the optimization technique during training.
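Exponential learning-rate decay can be written in several equivalent forms; one common sketch (the function name and decay factor are illustrative, not taken from the project's code):

```python
def exponential_decay(initial_lr, decay_rate, epoch):
    # The learning rate shrinks by a constant multiplicative factor each epoch
    return initial_lr * decay_rate ** epoch

for epoch in (0, 10, 50):
    print(epoch, exponential_decay(1.0, 0.95, epoch))
```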
Planned enhancements include:
- Advanced optimizers (e.g., Adam, RMSProp)
- Additional activation functions (e.g., Tanh, Sigmoid)
- Model evaluation metrics (e.g., precision, recall, F1-score)
- Saving and loading trained models
- Integration of batch processing
- Comprehensive testing suite
When training the network, sample output includes:
...
Softmax probabilities for first 5 samples:
[[0.05, 0.90, 0.05], [0.10, 0.80, 0.10], ...]
Final Loss: 0.3452
Final Accuracy: 92.0%
- NumPy
- Matplotlib
- Plotly
Install them via:
pip install numpy matplotlib plotly
We welcome contributions to improve this project! To contribute:
- Fork the repository.
- Create a feature branch:
git checkout -b feature/your-feature-name
- Commit your changes and submit a pull request.
- Ensure your code is well-documented and tested.
For more details, refer to CONTRIBUTING.md.
This project is licensed under the MIT License. See the LICENSE file for more details.
Happy Coding! 🚀