A minimalistic neural network implementation in Zig for learning purposes. This project focuses on understanding the fundamentals of neural networks, including hidden layers, activation functions, and the basic training process.
- Matrix operations from scratch
- Common activation functions (Sigmoid, Linear, ReLU, Tanh)
- Advanced activation functions (Swish, GLU, SwiGLU)
- Basic feed-forward neural network architecture
- Support for gated architectures used in modern transformer models
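The advanced activations above compose simple pieces: Swish is `x * sigmoid(x)`, GLU gates one linear projection with the sigmoid of another, and SwiGLU replaces that sigmoid gate with Swish. A minimal standalone sketch of these definitions (illustrative only, not the project's actual API):

```zig
const std = @import("std");

/// Sigmoid: 1 / (1 + e^(-x))
fn sigmoid(x: f64) f64 {
    return 1.0 / (1.0 + @exp(-x));
}

/// Swish (a.k.a. SiLU): x * sigmoid(x)
fn swish(x: f64) f64 {
    return x * sigmoid(x);
}

/// GLU applied to a pair of pre-activations: sigmoid(a) * b
fn glu(a: f64, b: f64) f64 {
    return sigmoid(a) * b;
}

/// SwiGLU: swish(a) * b — the gated form used in many modern transformer FFNs
fn swiglu(a: f64, b: f64) f64 {
    return swish(a) * b;
}

pub fn main() void {
    std.debug.print("sigmoid(0)   = {d:.3}\n", .{sigmoid(0.0)}); // 0.500
    std.debug.print("swish(1)     = {d:.3}\n", .{swish(1.0)});
    std.debug.print("swiglu(1, 2) = {d:.3}\n", .{swiglu(1.0, 2.0)});
}
```

In a real layer, `a` and `b` would be the outputs of two separate linear projections of the same input, applied element-wise.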
See the included examples (e.g. `make example-simple-xor`).
Detailed documentation is available in the docs directory:
- Neural Network Architecture - Design principles and implementation details
- Advanced Activation Functions - Detailed information about activation functions
Make sure you have Zig installed on your system. This project is developed with the latest stable version of Zig.
For convenience, a Makefile is provided with common operations:
```sh
# Build and run everything (build, test, examples)
make

# Run a specific example, for example:
make example-simple-xor

# Build with different optimization modes
make BUILD_MODE=ReleaseFast
make release  # Build with ReleaseSafe mode

# See all available commands
make help
```
The project includes a comprehensive test suite to verify the functionality of all components. The build system is configured to run tests for each module sequentially, making it easy to identify which component has issues.
```sh
# Run all tests
make test

# Run tests for specific components, for example:
zig build test-matrix      # Run matrix operation tests
zig build test-activation  # Run activation function tests
zig build test-layer       # Run neural network layer tests
zig build test-network     # Run full network tests
```
This project serves as a learning exercise for:
- Understanding neural network fundamentals
- Implementing mathematical operations in Zig
- Working with Zig's memory management and error handling
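As a small illustration of the Zig idioms this project exercises (a sketch, not the project's actual API), allocating a zeroed matrix buffer with an explicit allocator and propagated errors might look like:

```zig
const std = @import("std");

/// Allocate and zero a rows*cols matrix stored as a flat slice.
/// Allocation failure (error.OutOfMemory) propagates to the caller via `!`.
fn zeroMatrix(allocator: std.mem.Allocator, rows: usize, cols: usize) ![]f64 {
    const data = try allocator.alloc(f64, rows * cols);
    @memset(data, 0.0);
    return data;
}

pub fn main() !void {
    var gpa = std.heap.GeneralPurposeAllocator(.{}){};
    defer _ = gpa.deinit();
    const allocator = gpa.allocator();

    // `try` surfaces allocation errors; `defer` guarantees cleanup on every path.
    const m = try zeroMatrix(allocator, 2, 3);
    defer allocator.free(m);

    std.debug.print("allocated {d} elements\n", .{m.len});
}
```

Passing the allocator explicitly and pairing every allocation with a `defer`red free is the conventional Zig pattern for code like matrix operations, where ownership must be obvious at the call site.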
MIT License