
Releases: gabrielmfern/intricate

Intricate v0.7.0

21 Dec 16:07

This new release took some time, but it also brings a whole load of new features and bug fixes.

What's Changed

  • Implement a way to get the MNIST dataset
  • Implement initializers and make it possible to choose an initializer for each of a Layer's parameters
  • Implement the Adam optimizer, which works wonders compared to the other optimizers already implemented (a rough sketch of the update rule follows this list)
  • Implement a builder pattern for setting training options
  • Fix some problems with the Categorical Cross Entropy loss function and optimize it for softmax
  • Implement a Conv2D layer, which has some limitations and still needs optimization
  • Implement a MNIST example in the examples/ folder of the repo
  • Make a logo for Intricate, which is pretty cool
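
For context on why Adam tends to work so well: it keeps per-parameter running averages of the gradient and of its square, and scales every update by them. The snippet below is only a minimal standalone sketch of that update rule in plain Rust; the struct, field names, and default hyperparameters are assumptions made for the example, not Intricate's actual optimizer API.

```rust
/// Minimal, standalone sketch of the Adam update rule (illustrative only;
/// not Intricate's API). The names and hyperparameter defaults here are
/// assumptions made for this example.
struct AdamSketch {
    learning_rate: f32,
    beta1: f32,
    beta2: f32,
    epsilon: f32,
    m: Vec<f32>, // first-moment (mean) estimates, one per parameter
    v: Vec<f32>, // second-moment (uncentered variance) estimates
    t: i32,      // timestep, used for bias correction
}

impl AdamSketch {
    fn new(parameter_count: usize) -> Self {
        AdamSketch {
            learning_rate: 0.001,
            beta1: 0.9,
            beta2: 0.999,
            epsilon: 1e-8,
            m: vec![0.0; parameter_count],
            v: vec![0.0; parameter_count],
            t: 0,
        }
    }

    /// Applies one Adam step to `params` given their `gradients`.
    fn step(&mut self, params: &mut [f32], gradients: &[f32]) {
        self.t += 1;
        for i in 0..params.len() {
            // Update the biased moment estimates.
            self.m[i] = self.beta1 * self.m[i] + (1.0 - self.beta1) * gradients[i];
            self.v[i] = self.beta2 * self.v[i] + (1.0 - self.beta2) * gradients[i] * gradients[i];
            // Bias-corrected estimates.
            let m_hat = self.m[i] / (1.0 - self.beta1.powi(self.t));
            let v_hat = self.v[i] / (1.0 - self.beta2.powi(self.t));
            // Parameter update scaled by the running statistics.
            params[i] -= self.learning_rate * m_hat / (v_hat.sqrt() + self.epsilon);
        }
    }
}
```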

Full Changelog: v0.6.4...v0.7.0

Intricate v0.6.4

05 Sep 22:01

Just some minor bug fixes.

  • Fix a problem with the epoch progress bar that was not properly printing the estimated time to finish
  • Remove a dbg! I left in the code

Full Changelog: v0.6.3...v0.6.4

Intricate v0.6.3

05 Sep 21:49

This release just removes the way to preprocess the training data that was added in the previous versions, plus a minor bugfix.
The reason for removing it is that it was just plain bad and slow.

Full Changelog: v0.6.2...v0.6.3

Intricate v0.6.2

05 Sep 11:38

Just a minor bug fix inside the Model's fit method.

Full Changelog: v0.6.1...v0.6.2

Intricate v0.6.1

04 Sep 00:05
872c356

What's Changed

  • Make it possible to pass any input and output type into the Model's fit method
  • Add a parsing function that preprocesses the training inputs and outputs into matrices of floats, so that models like bag-of-words can be trained on very large datasets by passing the raw texts into fit and converting them through that function (a rough sketch of the idea follows this list)
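
To make the idea concrete, here is a rough standalone sketch (the function and names are invented for the example, not Intricate's actual signature): the raw samples stay in whatever type they come in, and a caller-supplied closure converts each one into a row of floats.

```rust
/// Illustrative sketch only: a preprocessing step that turns any raw sample
/// type `T` into rows of f32 via a caller-supplied closure. The names here
/// are invented for the example; this is not Intricate's actual API.
fn preprocess<T>(raw: &[T], to_floats: impl Fn(&T) -> Vec<f32>) -> Vec<Vec<f32>> {
    raw.iter().map(|sample| to_floats(sample)).collect()
}

fn main() {
    // Toy "bag of words" over the two-word vocabulary ["hello", "world"]:
    // each text becomes a row of word counts.
    let texts = vec!["hello world".to_string(), "hello hello".to_string()];
    let inputs = preprocess(&texts, |text| {
        vec![
            text.matches("hello").count() as f32,
            text.matches("world").count() as f32,
        ]
    });
    assert_eq!(inputs, vec![vec![1.0, 1.0], vec![2.0, 0.0]]);
}
```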

Full Changelog: v0.6.0...v0.6.1

Intricate v0.6.0

02 Sep 21:15

What's Changed

  • Fix the Categorical Cross Entropy loss function so it computes the loss and its derivatives as they are normally defined
  • Fix the sum buffer operation, which had a problem when setting the local buffer
  • Add the Mean Bias loss function
  • Add the Mean Absolute loss function
  • Improve Intricate's internal code overall so it runs better and sometimes faster
  • Add the Nesterov Accelerated Gradient optimizer (a rough sketch of the momentum and Nesterov update steps follows this list)
  • Add the Momentum-based optimizer
  • Add the Adagrad optimizer
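
To show the difference between the two momentum-style optimizers, here is a minimal standalone sketch of the classic momentum update next to the Nesterov variant; it is an illustration only, with invented names, not Intricate's optimizer code.

```rust
/// Illustrative sketch only (not Intricate's code): classic momentum versus
/// the Nesterov Accelerated Gradient variant for one parameter vector.
/// `grad_at` stands in for "evaluate the gradient at these parameter values".
fn momentum_step(
    params: &mut [f32],
    velocity: &mut [f32],
    grad_at: impl Fn(&[f32]) -> Vec<f32>,
    learning_rate: f32,
    momentum: f32,
) {
    let grad = grad_at(params); // gradient at the current parameters
    for i in 0..params.len() {
        velocity[i] = momentum * velocity[i] - learning_rate * grad[i];
        params[i] += velocity[i];
    }
}

fn nesterov_step(
    params: &mut [f32],
    velocity: &mut [f32],
    grad_at: impl Fn(&[f32]) -> Vec<f32>,
    learning_rate: f32,
    momentum: f32,
) {
    // Nesterov "looks ahead": the gradient is taken at the position the
    // current velocity is already carrying the parameters toward.
    let lookahead: Vec<f32> = params
        .iter()
        .zip(velocity.iter())
        .map(|(p, v)| p + momentum * v)
        .collect();
    let grad = grad_at(&lookahead);
    for i in 0..params.len() {
        velocity[i] = momentum * velocity[i] - learning_rate * grad[i];
        params[i] += velocity[i];
    }
}
```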

Full Changelog: v0.5.0...v0.6.0

Intricate v0.5.0

26 Aug 22:28
5f63da9

What's Changed

  • Add a way to print the accuracy and use it after training, if it was calculated
  • Add a halting condition to the TrainingOptions that lets you stop training before the defined number of epochs is reached once a certain condition is met, such as a minimum loss or a minimum accuracy (a rough sketch of the idea follows this list)
  • Add a verbosity option to print a warning when training stops because of the halting condition
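
As a rough standalone sketch of the idea (with invented names, not Intricate's actual TrainingOptions), a halting condition is just something the training loop checks after every epoch so it can break out early:

```rust
/// Illustrative sketch only; these types and names are invented for the
/// example and are not Intricate's actual API.
#[derive(Clone, Copy)]
enum HaltingCondition {
    MinLossReached(f32),
    MinAccuracyReached(f32),
}

struct TrainingOptionsSketch {
    epochs: usize,
    halting_condition: Option<HaltingCondition>,
    verbose: bool,
}

fn train(options: &TrainingOptionsSketch, mut run_epoch: impl FnMut(usize) -> (f32, f32)) {
    for epoch in 0..options.epochs {
        // `run_epoch` stands in for one full pass over the training data and
        // returns (loss, accuracy) for that epoch.
        let (loss, accuracy) = run_epoch(epoch);

        let should_halt = match options.halting_condition {
            Some(HaltingCondition::MinLossReached(min_loss)) => loss <= min_loss,
            Some(HaltingCondition::MinAccuracyReached(min_acc)) => accuracy >= min_acc,
            None => false,
        };
        if should_halt {
            if options.verbose {
                println!("halting early at epoch {epoch}: condition met (loss {loss}, accuracy {accuracy})");
            }
            break;
        }
    }
}
```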

Full Changelog: v0.4.0...v0.5.0

Intricate v0.4.0

26 Aug 02:38
9926e62

What's Changed

  • Implement optimizers (a rough sketch of what an optimizer boils down to follows this list)
  • Improve the README a lot
  • Implement separate methods for layers to compute things instead of doing everything in the back_propagate method
  • Improve verbosity much more, including an indicatif progress bar
  • Improve Intricate's error handling so it returns errors instead of panicking in most places
  • Improve the way Intricate deals with OpenCL kernels and programs
  • Make it so that all of the required Intricate OpenCL programs are compiled when the setup_opencl method is called
  • Write documentation for Intricate everywhere as well
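
To give an idea of the shape an optimizer takes (a standalone sketch with invented names, not Intricate's actual trait), it boils down to something that turns gradients into parameter updates:

```rust
/// Illustrative sketch only; the trait and struct names are invented for the
/// example and are not Intricate's actual API.
trait OptimizerSketch {
    /// Mutates `params` in place given the gradients computed for them.
    fn update(&mut self, params: &mut [f32], gradients: &[f32]);
}

/// Plain stochastic gradient descent, the simplest possible optimizer.
struct BasicGradientDescent {
    learning_rate: f32,
}

impl OptimizerSketch for BasicGradientDescent {
    fn update(&mut self, params: &mut [f32], gradients: &[f32]) {
        for (param, grad) in params.iter_mut().zip(gradients) {
            *param -= self.learning_rate * grad;
        }
    }
}
```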

Full Changelog: v0.3.2...v0.4.0

Intricate v0.3.2

20 Aug 23:35

What's Changed

This update was mostly internal work: some bugs fixed and some new tests written.

  • Fix a bug with the Categorical Cross Entropy loss function, which wasn't being initialized correctly.
  • Make it so that the programs and kernels are all compiled when OpenCL is set up, to make things much easier to deal with.
  • Write some more docs because deny(missing_docs) was complaining about missing documentation.
  • Fix the deny(missing_docs) in lib.rs that was only denying docs for lib.rs itself instead of the whole project (see the small snippet after this list).
  • Can't remember much more of what I did... lol
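
For reference, the way to enforce documentation for a whole crate is the crate-level inner attribute (note the `!`) at the top of lib.rs, as opposed to the outer `#[...]` form that only applies to the item right below it:

```rust
// At the very top of lib.rs: the inner attribute (note the `!`) applies to
// the entire crate, so every public item without a doc comment becomes a
// compile error.
#![deny(missing_docs)]

//! Crate-level documentation goes here.

/// Every public item now needs a doc comment like this one.
pub fn documented_function() {}
```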

Full Changelog: v0.3.1...v0.3.2

Intricate v0.3.1

16 Aug 22:35

What's Changed

  • Remove the convention of having separate f32/f64 structs and just use f32.
  • Remove all the Rayon computations that used plain Rust operations and start using OpenCL everywhere.
  • Solve several bugs in calculations.
  • Add many more unit tests.
  • Add documentation everywhere, and add a #[deny(missing_docs)] in the lib.rs file to force documentation everywhere.
  • Make all tests pass, both when running on the GPU and on the CPU.
  • Create some utilities that help with writing code in some places of the crate.
  • Create another crate called intricate-macros that, for now, has two macros to help create things like activation layers without duplicating code.
  • Remove the boxed dyn ...'s, so we don't need static references to everything, and use enums instead; the new method on each layer creates an instance of that enum by default, which makes building a list of layers and similar things much simpler (a rough sketch of the idea follows this list).
  • Because the Boxes are gone, Intricate can now save models directly instead of having to save layer by layer, which was really annoying me before.
  • Now I'm having even more fun than before! 😁👍
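
As a standalone sketch of the enum-based design (all names invented for the example, not Intricate's actual types): every layer kind is a variant of one enum, the new constructors return that enum, and dispatch happens through a plain match instead of a Box<dyn Trait> call; and since the enum owns concrete data, a whole model can be saved directly.

```rust
/// Illustrative sketch only; the names here are invented and are not
/// Intricate's actual types.
struct DenseSketch {
    weights: Vec<Vec<f32>>,
}

struct ReLUSketch;

/// One enum holds every layer kind, so a model is just `Vec<LayerSketch>`
/// and the whole thing can be serialized directly (no trait objects).
enum LayerSketch {
    Dense(DenseSketch),
    ReLU(ReLUSketch),
}

impl DenseSketch {
    /// `new` hands back the enum variant, so building a model needs no boxing.
    fn new(inputs: usize, outputs: usize) -> LayerSketch {
        LayerSketch::Dense(DenseSketch {
            weights: vec![vec![0.0; inputs]; outputs],
        })
    }
}

impl ReLUSketch {
    fn new() -> LayerSketch {
        LayerSketch::ReLU(ReLUSketch)
    }
}

impl LayerSketch {
    /// Dispatch through a match instead of a Box<dyn Layer> virtual call.
    fn propagate(&self, inputs: &[f32]) -> Vec<f32> {
        match self {
            LayerSketch::Dense(dense) => dense
                .weights
                .iter()
                .map(|row| row.iter().zip(inputs).map(|(w, x)| w * x).sum())
                .collect(),
            LayerSketch::ReLU(_) => inputs.iter().map(|x| x.max(0.0)).collect(),
        }
    }
}

fn main() {
    let model = vec![DenseSketch::new(2, 3), ReLUSketch::new()];
    let mut activations = vec![1.0, -2.0];
    for layer in &model {
        activations = layer.propagate(&activations);
    }
    println!("{activations:?}");
}
```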

Full Changelog: v0.2.2...v0.3.1