Releases: joaopauloschuler/neural-api
NLP Support, New Layers, Faster Memory Access, New Examples
CAI Neural API v2.0.0 Release Notes
New Source Code Examples
Several new examples to help you get started with various neural network applications:
- Malaria Cell Infection Image Classification
- Colorectal Cancer Image Classification
- Plant Leaf Disease Image Classification for the PlantVillage Dataset
- Sentiment Analysis
- NLP Support: Tokenizer, Samplers, Transformer Decoder
- Pre-trained Models
- GPT-3 Small
New Layers
Enhance your neural networks with the following new layers (a usage sketch follows the list):
- TNNetPadXY
- TNNetCrop
- TNNetMaxPoolWithPosition
- TNNetTransposeXD
- TNNetTransposeYD
- TNNetDotProducts
- TNNetEmbedding
- TNNetAddPositionalEmbedding
- TNNetTokenAndPositionalEmbedding
- TNNetChannelNorm
- TNNetSignedSquareRoot
- TNNetSignedSquareRoot1
- TNNetSignedSquareRootN
- TNNetReLUP
- TNNetPointwiseNorm
- TNNetPointwiseSoftMax
- TNNet.AddSelfAttentionCAI
- TNNet.AddTransformerBlockCAI
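None of the constructor signatures below appear in these release notes, so treat this as a rough sketch of how a few of the new layers might slot into a CAI network; the TNNetPadXY padding arguments and the parameterless constructors are assumptions on my part.

```pascal
program NewLayersSketch;

uses neuralnetwork;

var
  NN: TNNet;
begin
  NN := TNNet.Create();
  NN.AddLayer([
    TNNetInput.Create(32, 32, 3),
    TNNetPadXY.Create(1, 1),          // assumed: horizontal and vertical padding sizes
    TNNetConvolutionReLU.Create({Features=}32, {FeatureSize=}3, {Padding=}0, {Stride=}1),
    TNNetChannelNorm.Create(),        // assumed parameterless, like other normalization layers
    TNNetSignedSquareRoot.Create(),   // assumed parameterless activation
    TNNetMaxPool.Create(2),
    TNNetFullConnectLinear.Create(10),
    TNNetPointwiseSoftMax.Create()    // assumed parameterless
  ]);
  NN.DebugStructure();
  NN.Free;
end.
```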
Other New Features
Enhancements and new functionalities introduced in this release:
- Image Support:
  - Added TIFF image support.
- Classification Enhancements:
  - Added TNeuralFitWithImageBase.ClassifyImageFromFile.
- Data Handling:
  - Added TStringStringList.LoadFromCsv and TStringStringList.SaveToCsv.
  - Added TVolume.OneHotEncoding(aTokens: array of integer), TVolume.OneHotEncoding(aTokens: string), and TVolume.OneHotEncodingReversed(aTokens: string) (a usage sketch follows this list).
- Neural Network Enhancements:
  - Added the TNNetNeuron.Bias property.
- Volume Operations:
- Debugging and Logging:
  - Added OpenCL debug status.
- Training Enhancements:
  - Added the Adam optimizer: TNeuralOptimizerAdam.
  - Added the option to save the neural network when the best loss is found (commit details).
  - Added the MinBackpropagationErrorProportion fitting property.
  - Added TNeuralFitBase.LogEveryBatches to control log frequency.
- Dependencies:
  - Removed the MTPCPU dependency.
- NLP:
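As a hedged illustration of the new one-hot helpers listed under Data Handling, the sketch below encodes a short string with TVolume.OneHotEncoding(aTokens: string), whose signature comes from the notes above; the pre-sizing step and the one-column-per-character layout are assumptions, since the notes do not document the encoding shape.

```pascal
program OneHotSketch;

uses neuralvolume;

var
  Tokens: TNNetVolume;
begin
  Tokens := TNNetVolume.Create();
  // Assumption: one column per character on X, one channel per possible byte value.
  Tokens.ReSize(6, 1, 256);
  Tokens.OneHotEncoding('neural'); // signature taken from the release notes
  WriteLn('Encoded elements: ', Tokens.Size);
  Tokens.Free;
end.
```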
Enhancements and Fixes
- Stability Improvements:
- Fitting (training) is now a lot more stable.
- Memory Optimizations:
- Implemented numerous memory optimizations.
- Enhanced memory-efficient grouped pointwise convolutions.
- Bug Fixes:
- Various bugs have been fixed.
- Documentation:
- Improved documentation and added more comprehensive source code comments.
- Added numerous YouTube videos for learning and implementation guidance.
Thank you for using the Pascal-based Neural API! For any questions or feedback, please visit the GitHub repository.
New examples, hard swish activation function, web server and more source code comments
New examples for beginners:
- One-neuron example that learns 2x - 3y + 4.
- One-neuron example that learns the OR logic operation (see the sketch after this list).
- Example for the XOR logic function.
- Example for a non-linear function.
- Easy Delphi example for a quick start.
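In the spirit of the one-neuron OR example, here is a minimal sketch using the library's basic training loop (Compute, GetOutput, Backpropagate); the 0.1/0.9 truth-table encoding, the epoch count, and the Raw[] element access are assumptions rather than the exact example code.

```pascal
program OrNeuronSketch;

uses neuralnetwork, neuralvolume;

type
  TInputPair = array[0..1] of TNeuralFloat;

const
  // OR truth table; 0.1 stands for false and 0.9 for true (an assumption).
  Inputs: array[0..3] of TInputPair = ((0.1, 0.1), (0.1, 0.9), (0.9, 0.1), (0.9, 0.9));
  Targets: array[0..3] of TNeuralFloat = (0.1, 0.9, 0.9, 0.9);

var
  NN: TNNet;
  vInput, vOutput, vDesired: TNNetVolume;
  Epoch, I: integer;
begin
  NN := TNNet.Create();
  NN.AddLayer(TNNetInput.Create(2));
  NN.AddLayer(TNNetFullConnectLinear.Create(1)); // the single neuron
  vInput   := TNNetVolume.Create(2, 1, 1);
  vOutput  := TNNetVolume.Create(1, 1, 1);
  vDesired := TNNetVolume.Create(1, 1, 1);
  for Epoch := 1 to 3000 do
    for I := 0 to 3 do
    begin
      vInput.Copy(Inputs[I]);
      vDesired.Fill(Targets[I]); // one-element volume, so Fill sets the target
      NN.Compute(vInput);
      NN.GetOutput(vOutput);
      NN.Backpropagate(vDesired); // updates weights toward the desired output
    end;
  // Show the learned responses; Raw[] assumed to be flat element access.
  for I := 0 to 3 do
  begin
    vInput.Copy(Inputs[I]);
    NN.Compute(vInput);
    NN.GetOutput(vOutput);
    WriteLn(Inputs[I][0]:4:1, ' OR ', Inputs[I][1]:4:1, ' -> ', vOutput.Raw[0]:6:3);
  end;
  vDesired.Free; vOutput.Free; vInput.Free; NN.Free;
end.
```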
For advanced users:
- New hard swish activation function.
- New web server example.
This release also includes some bug fixes, better support for grouped convolutions, improved documentation, and more source code comments.
Faster backpropagation!
Updates in release v1.0.7:
- The CIFAR-10 Resized program has been added to the main README file. It resizes CIFAR-10 and CIFAR-100 images to 64x64 and 128x128 pixels.
- #77: Added results for the ResNet example.
- #78: Added TNNetLayer.ForcePositiveWeights.
- #79: Updated hypotenuse example.
- #81: Added FlipY with MultipleSamplesAtValidation.
- #84: Faster backpropagation by not backpropagating small values.
ResNet, More Training, Less Validation and Loads Best Epoch
Variable Sample Sizes on CIFAR10 + Swish6 Activation Function
This release includes:
- You can now define the validation sample size when loading CIFAR-10 and CIFAR-100 (#73). By tweaking this number, you may reach higher test accuracy when training your models.
- There is a new activation function, TNNetSwish6. It is a strong replacement for ReLU6, giving higher accuracy while maintaining numerical stability (see the sketch below).
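Since activation functions in this API are layers, swapping ReLU6 for Swish6 should look roughly like the sketch below; the assumption that TNNetSwish6.Create takes no arguments, like the other activation layers, is mine.

```pascal
program Swish6Sketch;

uses neuralnetwork;

var
  NN: TNNet;
begin
  NN := TNNet.Create();
  NN.AddLayer(TNNetInput.Create(32, 32, 3));
  NN.AddLayer(TNNetConvolutionLinear.Create({Features=}32, {FeatureSize=}3, {Padding=}1, {Stride=}1));
  // Swap-in for TNNetReLU6; assumed parameterless like other activation layers.
  NN.AddLayer(TNNetSwish6.Create());
  NN.DebugStructure();
  NN.Free;
end.
```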
ChannelShiftRate Data Augmentation Property
This release implements new data augmentation fitting properties: TNeuralImageFit.ChannelShiftRate and TNeuralDataLoadingFit.ChannelShiftRate (see the sketch below).
This release also fixes #70 - Using HasImgCrop breaks execution.
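The sketch below is modeled on the library's standard CIFAR-10 fitting example; only the ChannelShiftRate line relates to this release, and 0.1 is an arbitrary illustrative value, not a recommended setting.

```pascal
program ChannelShiftSketch;

uses neuralnetwork, neuralvolume, neuralfit, neuraldatasets;

var
  NN: TNNet;
  NeuralFit: TNeuralImageFit;
  ImgTrainingVolumes, ImgValidationVolumes, ImgTestVolumes: TNNetVolumeList;
begin
  NN := TNNet.Create();
  NN.AddLayer([
    TNNetInput.Create(32, 32, 3),
    TNNetConvolutionReLU.Create({Features=}32, {FeatureSize=}3, {Padding=}1, {Stride=}1),
    TNNetMaxPool.Create(4),
    TNNetFullConnectLinear.Create(10),
    TNNetSoftMax.Create()
  ]);
  CreateCifar10Volumes(ImgTrainingVolumes, ImgValidationVolumes, ImgTestVolumes);
  NeuralFit := TNeuralImageFit.Create;
  // Randomly shifts channel intensities during training (data augmentation).
  NeuralFit.ChannelShiftRate := 0.1; // arbitrary example value
  NeuralFit.Fit(NN, ImgTrainingVolumes, ImgValidationVolumes, ImgTestVolumes,
    {NumClasses=}10, {batchsize=}64, {epochs=}50);
end.
```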
More Layer Types! Better Sampling!
New Layers:
- TNNetChannelMulByLayer.
- TNNetCellMulByLayer.
- TNNetInterleaveChannels.
New Activation Functions:
API Additions:
- TNNetVolumeList.FillTag.
- TFileNameList.
- TNeuralImageLoadingFit (#49).
- TNeuralThreadList.CalculateWorkingRange.
- TNeuralThreadList.GetRandomNumberOnWorkingRange.
- TVolume.GetValueCount.
- TVolume.GetSmallestIdxInRange.
- TNNetVolumeList.AddValue.
- TNNetVolumeList.Divi.
- TNNetNeuronList.GetMaxAbsWeight.
Other Improvements:
- Code optimizations for non-x86 processors.
- Fixes #39 and #41.
Version 1.0!
After a long, long time of coding and testing, it's time to call it version 1.0! YAY!
Really comprehensive testing has been done. The last batch of testing came from Generative Adversarial Network experiments that helped debug a great deal.
Some of the images produced during this testing are shown in the original release post.
These are the main changes since the last release:
- We have a new convolutional layer type: TNNetConvolutionSharedWeights. Instead of having its own neurons and weights, this convolutional layer uses the same neurons and weights as another existing layer. So, if you need two layers with the same neurons, you can add TNNetConvolutionSharedWeights to your network. Why would you need something like this? Perhaps your NN needs to learn the same patterns at different scales (big and small dogs are all dogs, for example). You can follow this example, or see the sketch after this list.
- Added a FitLoading example.
- Documentation has been updated: #30 and #33.
- Fixed bugs #35 and #36.
- Added the CountNeurons method.
- TNNetDeMaxPool optimization.
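Here is a rough sketch of the shared-weights idea; the assumption that TNNetConvolutionSharedWeights.Create takes the source layer as its argument is mine, since the release notes only link to an example.

```pascal
program SharedWeightsSketch;

uses neuralnetwork;

var
  NN: TNNet;
  SharedSource: TNNetLayer;
begin
  NN := TNNet.Create();
  NN.AddLayer(TNNetInput.Create(64, 64, 3));
  // This layer owns the neurons and weights.
  SharedSource := NN.AddLayer(
    TNNetConvolutionReLU.Create({Features=}32, {FeatureSize=}3, {Padding=}1, {Stride=}1));
  NN.AddLayer(TNNetMaxPool.Create(2));
  // Assumed constructor: reuses SharedSource's neurons and weights so the
  // network can look for the same patterns at a smaller scale.
  NN.AddLayer(TNNetConvolutionSharedWeights.Create(SharedSource));
  NN.AddLayer(TNNetFullConnectLinear.Create(10));
  NN.AddLayer(TNNetSoftMax.Create());
  NN.DebugStructure();
  NN.Free;
end.
```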
There are still open bugs. But this is how life is.
I wish everyone happy pascal coding and to live long and prosper!