
The LRP Toolbox for Artificial Neural Networks (1.2.0)

The Layer-wise Relevance Propagation (LRP) algorithm explains a classifier's prediction for a given data point by attributing relevance scores to the components of the input, using the topology of the learned model itself.
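
To make the attribution idea concrete, here is a minimal numpy sketch of the epsilon-stabilized LRP rule for a single dense layer. Names and signatures are illustrative for this sketch, not the Toolbox's API: relevance arriving at the layer's outputs is redistributed to its inputs in proportion to the forward contributions x_i * W_ij.

import numpy as np

def lrp_dense_epsilon(x, W, b, R_out, eps=1e-2):
    # Forward contributions of each input i to each output j.
    z = x[:, None] * W                                      # shape (n_in, n_out)
    denom = z.sum(axis=0) + b                               # pre-activations, shape (n_out,)
    denom = denom + eps * np.where(denom >= 0, 1.0, -1.0)   # stabilizer avoids division by zero
    # Each output's relevance is split among the inputs in proportion to z.
    return z @ (R_out / denom)                              # shape (n_in,)

# Toy usage: relevance at the output is (approximately) conserved at the input.
rng = np.random.default_rng(0)
x = rng.random(3)
W = rng.standard_normal((3, 2))
b = np.zeros(2)
R_out = np.maximum(x @ W + b, 0.0)   # e.g. start from the ReLU activations
R_in = lrp_dense_epsilon(x, W, b, R_out)
print(R_in.sum(), R_out.sum())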

The LRP Toolbox provides simple and accessible stand-alone implementations of LRP for artificial neural networks supporting Matlab and Python. The Toolbox realizes LRP functionality for the Caffe Deep Learning Framework as an extension of the Caffe source code published in 10/2015.

The implementations for Matlab and Python are intended as a sandbox or playground to familiarize the user with the LRP algorithm, and are therefore implemented with readability and transparency in mind. Models and data can be imported and exported using raw text formats, Matlab's .mat files, and the .npy format for Python/numpy.
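
For instance, exporting and re-importing a weight matrix through the .npy route needs nothing beyond numpy itself (the file name and array shape below are made up for this example):

import numpy as np

W = np.random.standard_normal((784, 300))   # some layer's weight matrix
np.save('layer1_weights.npy', W)            # export as raw .npy
W_loaded = np.load('layer1_weights.npy')    # re-import
assert np.array_equal(W, W_loaded)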

See the LRP Toolbox in Action

Both the Python-based MNIST demo and the Caffe-based ImageNet demo can be tried out directly in your browser.

New in version 1.2.0

The standalone implementations for Python and Matlab:

  • Convnets with sum- and max-pooling layers are now supported, including demo code.
  • LRP parameters can now be set for each layer individually.
  • The w² and flat weight decompositions are implemented (sketched below).
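
For reference, a hedged numpy sketch of both rules for a dense layer follows; the function names are illustrative, not the Toolbox's API. The w² rule redistributes relevance in proportion to squared weights, independently of the input, while the flat rule spreads each output's relevance uniformly over its inputs:

import numpy as np

def lrp_ww(W, R_out):
    # w^2 rule: R_i = sum_j (W_ij^2 / sum_k W_kj^2) * R_j
    z = W ** 2
    return z @ (R_out / z.sum(axis=0))

def lrp_flat(W, R_out):
    # Flat rule: every input receives an equal share of each output's relevance.
    n_in = W.shape[0]
    return np.full(n_in, R_out.sum() / n_in)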

Caffe:

  • Minimal output versions implemented.
  • Matthew Zeiler et al.'s Deconvolution, Karen Simonyan et al.'s Sensitivity Maps, and aspects of Grégoire Montavon et al.'s Deep Taylor Decomposition have been implemented, alongside the flat weight decomposition for uniformly projecting relevance scores onto a neuron's receptive field.

Also:

  • Various optimizations, refactoring, bits and pieces here and there.

Obtaining the LRP Toolbox:

You can download the latest full release / current version directly from GitHub. However, if you prefer to download only what is necessary for your project, language, or purpose, make use of the pre-packaged downloads available at heatmapping.org

Installing the Toolbox:

After obtaining the toolbox code and the data and models of your choice, simply move into the corresponding subpackage folder -- matlab, python or caffe-master-lrp -- and execute the installation script (written for Ubuntu 14.04 or newer):

<obtain the toolbox>
cd lrp_toolbox/$yourChoice
bash install.sh

Make sure to at least skim through the installation scripts! For more details and instructions please refer to the manual.
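
After installation, a session with the Python subpackage looks roughly like the following. The module names, file paths, and the lrp() call are modeled on the toolbox's MNIST demo code and should be treated as assumptions; the manual is authoritative:

import model_io
import data_io

nn = model_io.read('../models/MNIST/long-rect.nn')   # a pre-trained model shipped with the toolbox
X = data_io.read('../data/MNIST/test_images.npy')    # test data
x = X[0:1]                                           # pick one sample to explain
ypred = nn.forward(x)                                # forward pass: class scores
R = nn.lrp(ypred)                                    # relevance scores, same shape as x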

The LRP Toolbox Paper

When using (any part of) this toolbox, please cite our paper:

@article{JMLR:v17:15-618,
    author  = {Sebastian Lapuschkin and Alexander Binder and Gr{\'e}goire Montavon and Klaus-Robert M{\"u}ller and Wojciech Samek},
    title   = {The LRP Toolbox for Artificial Neural Networks},
    journal = {Journal of Machine Learning Research},
    year    = {2016},
    volume  = {17},
    number  = {114},
    pages   = {1-5},
    url     = {http://jmlr.org/papers/v17/15-618.html}
}

Misc

For further research and projects involving LRP, visit heatmapping.org
