A lightweight Neural Network framework for creating MLPs inspired by micrograd from @karpathy

# MicroMLP

## Overview

MicroMLP is a tiny, lightweight Python framework for creating and training feed-forward neural networks. MicroMLP supports custom activation and loss functions, and the most common ones are already provided. There is no support for hardware acceleration, but MicroMLP uses NumPy for efficient matrix operations.
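The README does not show MicroMLP's exact interface for custom activations, but a sketch of what one might look like is below, assuming an activation is a NumPy-vectorized callable (a framework typically also needs its derivative for backpropagation). The `leaky_relu` name and signature here are hypothetical, not part of MicroMLP's documented API.

```python
import numpy as np

# Hypothetical custom activation: leaky ReLU, plus the derivative a framework
# would need for backpropagation. Vectorized so it works on whole layers.
def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

def leaky_relu_prime(x, alpha=0.01):
    return np.where(x > 0, 1.0, alpha)

print(leaky_relu(np.array([-2.0, 0.0, 3.0])))
```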

## Example Usage

MicroMLP provides a simple API inspired by the Keras Sequential API found in TensorFlow.

```python
from micromlp import MLP, Layer
from micromlp.activations import relu, softmax

model = MLP([
    Layer(2, 8, relu),
    Layer(8, 8, relu),
    Layer(8, 2, softmax)
])
```
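Conceptually, each `Layer(n_in, n_out, f)` in the stack above computes `f(x @ W + b)`. The plain-NumPy sketch below shows the forward pass of the same 2-8-8-2 shape; the weight initialization is illustrative, not MicroMLP's actual internals.

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
sizes = [(2, 8), (8, 8), (8, 2)]       # same shapes as the MLP above
acts = [relu, relu, softmax]
params = [(rng.normal(size=s), np.zeros(s[1])) for s in sizes]

x = rng.normal(size=(4, 2))            # a batch of 4 two-dimensional inputs
for (W, b), f in zip(params, acts):
    x = f(x @ W + b)                   # each layer: activation(x @ W + b)

print(x.shape)        # (4, 2): one class distribution per input
print(x.sum(axis=1))  # each row sums to 1 because of the softmax output
```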

Training the model is then a single line of code.

```python
from micromlp.losses import cross_entropy

model.train(xs, ys, loss=cross_entropy, epochs=20, batch_size=32, learning_rate=0.01)
```
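A `train()` call like this typically means: for each epoch, shuffle the data, slice it into mini-batches, and take one gradient step per batch scaled by the learning rate. The toy below demonstrates that loop on a linear model with a squared-error loss; MicroMLP's internals may differ.

```python
import numpy as np

rng = np.random.default_rng(42)
xs = rng.normal(size=(256, 1))
ys = 3.0 * xs + 1.0                     # target relationship: y = 3x + 1

w, b = 0.0, 0.0
epochs, batch_size, learning_rate = 20, 32, 0.1
for _ in range(epochs):
    order = rng.permutation(len(xs))    # reshuffle once per epoch
    for start in range(0, len(xs), batch_size):
        idx = order[start:start + batch_size]
        xb, yb = xs[idx], ys[idx]
        err = (w * xb + b) - yb         # d(loss)/d(pred) for 0.5 * MSE
        w -= learning_rate * np.mean(err * xb)
        b -= learning_rate * np.mean(err)

print(w, b)  # should approach 3.0 and 1.0
```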

## License

MIT
