Build and train Lipschitz-constrained networks: a TensorFlow implementation of k-Lipschitz layers
Controlling the Lipschitz constant of a layer or a whole neural network has many applications ranging from adversarial robustness to Wasserstein distance estimation.
This library provides an efficient implementation of k-Lipschitz layers for keras.
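To make the idea concrete: the Lipschitz constant of a linear layer (with respect to the L2 norm) is the largest singular value of its weight matrix, which can be estimated with power iteration. The following is a plain-numpy sketch of that principle, not deel-lip's own implementation; the function name `spectral_norm` is illustrative.

```python
import numpy as np

def spectral_norm(W, n_iter=1000, seed=0):
    """Estimate the largest singular value of W (the Lipschitz
    constant of x -> W @ x) via power iteration."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(W.shape[1])
    for _ in range(n_iter):
        u = W @ v
        u /= np.linalg.norm(u)
        v = W.T @ u
        v /= np.linalg.norm(v)
    # With ||u|| = ||v|| = 1, u @ W @ v never exceeds the true norm.
    return float(u @ W @ v)

rng = np.random.default_rng(42)
W = rng.standard_normal((64, 32))
sigma = spectral_norm(W)
# Rescaling the weights by 1/sigma makes the layer (approximately) 1-Lipschitz.
W_1lip = W / sigma
```

Lipschitz-constrained libraries such as deel-lip apply this kind of normalization automatically during training, so the constraint holds at every step rather than only after a one-off rescaling.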
You can install deel-lip directly from PyPI:
pip install deel-lip
In order to use deel-lip, you also need a valid tensorflow installation. deel-lip supports tensorflow 2.x versions.
deel-lip provides k-Lipschitz variants of standard keras layers such as Dense, Conv2D and Pooling.
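A useful property of such layers is that composing 1-Lipschitz layers (and 1-Lipschitz activations such as ReLU) yields a network that is itself 1-Lipschitz. Here is a small numpy sketch of that fact, again not using deel-lip itself; the weight shapes and helper names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_1lip(shape):
    # Divide by the largest singular value so the linear map is 1-Lipschitz.
    W = rng.standard_normal(shape)
    return W / np.linalg.svd(W, compute_uv=False)[0]

W1 = make_1lip((16, 8))
W2 = make_1lip((4, 16))

def relu(z):
    return np.maximum(z, 0.0)  # ReLU is itself 1-Lipschitz

def f(x):
    # Composition of 1-Lipschitz maps: also 1-Lipschitz.
    return W2 @ relu(W1 @ x)

x = rng.standard_normal(8)
y = rng.standard_normal(8)
assert np.linalg.norm(f(x) - f(y)) <= np.linalg.norm(x - y)
```

This is why constraining each layer is enough to bound the Lipschitz constant of the whole network, which is the guarantee the library's layers are designed to preserve during training.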
To contribute, you can open an issue, or fork this repository and then submit changes through a pull request. We use black to format the code and follow the PEP 8 convention. To check that your code will pass the lint checks, you can run:
tox -e py36-lint
You need tox in order to run this. You can install it via pip:
pip install tox
More from the DEEL project:
This library has been built to support the work presented in the paper Achieving robustness in classification using optimal transport with hinge regularization, which aims at provable and efficient robustness by design.
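For intuition, the paper's objective combines a Kantorovich-Rubinstein (Wasserstein-1 dual) term with a hinge margin term evaluated on a 1-Lipschitz classifier. The following is a rough numpy sketch of one plausible form of such a loss, with illustrative names and sign conventions; it is not deel-lip's own loss implementation.

```python
import numpy as np

def hkr_loss(y_true, y_pred, alpha=10.0, margin=1.0):
    """Sketch of a hinge-regularized KR objective.

    The KR term estimates the Wasserstein-1 dual gap between the two
    classes under a 1-Lipschitz score function; the hinge term pushes
    predictions beyond a classification margin."""
    y_true = np.asarray(y_true, dtype=float)  # labels in {-1, +1}
    y_pred = np.asarray(y_pred, dtype=float)
    kr = y_pred[y_true > 0].mean() - y_pred[y_true < 0].mean()
    hinge = np.maximum(0.0, margin - y_true * y_pred).mean()
    # Minimizing: shrink the hinge violations, widen the KR gap.
    return alpha * hinge - kr

loss = hkr_loss([1, 1, -1, -1], [2.0, 0.5, -1.5, 0.0])
```

The trade-off parameter (here `alpha`) balances accuracy against the transport term; see the paper for the exact formulation and guarantees.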
This work can be cited as:
@misc{2006.06520,
Author = {Mathieu Serrurier and Franck Mamalet and Alberto González-Sanz and Thibaut Boissin and Jean-Michel Loubes and Eustasio del Barrio},
Title = {Achieving robustness in classification using optimal transport with hinge regularization},
Year = {2020},
Eprint = {arXiv:2006.06520},
}
The package is released under the MIT license.