| Submodule | Maintainers | Contact Info |
|---|---|---|
| gelu | @AakashKumarNain | aakashnain@outlook.com |
| maxout | @fsx950223 | fsx950223@gmail.com |
| normalizations | @smokrow | moritz.kroeger@tu-dortmund.de |
| opticalflow | @fsx950223 | fsx950223@gmail.com |
| poincare | @rahulunair | rahulunair@gmail.com |
| polynomial | @tanzheny | tanzheny@google.com |
| sparsemax | @AndreasMadsen | amwwebdk+github@gmail.com |
| tlu | @AakashKumarNain | aakashnain@outlook.com |
| wrappers | @seanpmorgan | seanmorgan@outlook.com |
| Submodule | Layer | Reference |
|---|---|---|
| gelu | GELU | https://arxiv.org/abs/1606.08415 |
| maxout | Maxout | https://arxiv.org/abs/1302.4389 |
| normalizations | GroupNormalization | https://arxiv.org/abs/1803.08494 |
| normalizations | InstanceNormalization | https://arxiv.org/abs/1607.08022 |
| opticalflow | CorrelationCost | https://arxiv.org/abs/1504.06852 |
| poincare | PoincareNormalize | https://arxiv.org/abs/1705.08039 |
| polynomial | PolynomialCrossing | https://arxiv.org/pdf/1708.05123 |
| sparsemax | Sparsemax | https://arxiv.org/abs/1602.02068 |
| tlu | TLU | https://arxiv.org/abs/1911.09737 |
| wrappers | WeightNormalization | https://arxiv.org/abs/1602.07868 |
In order to conform with the current API standard, all layers must:

- Inherit from either `keras.layers.Layer` or its subclasses.
- Register as a Keras global object so it can be serialized properly: `@tf.keras.utils.register_keras_serializable(package='Addons')`
- Add the addon to the `py_library` in this sub-package's BUILD file.
- Include simple unittests that demonstrate the layer is behaving as expected.
- When applicable, run all unittests with TensorFlow's `@run_in_graph_and_eager_modes` (for test methods) or `run_all_in_graph_and_eager_modes` (for TestCase subclasses) decorator.
- Run `layer_test` on the layer.
- Add a `py_test` to this sub-package's BUILD file.
- Update the table of contents in this sub-package's README.
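The first two requirements above can be sketched as follows. `Scale` is a hypothetical example layer (not part of Addons) that subclasses `tf.keras.layers.Layer`, registers itself for serialization, and implements `get_config` so it round-trips through model saving:

```python
import tensorflow as tf


# Registering under the 'Addons' package lets Keras locate the class
# when deserializing a saved model that uses it.
@tf.keras.utils.register_keras_serializable(package="Addons")
class Scale(tf.keras.layers.Layer):
    """Example layer that multiplies its input by a trainable scalar."""

    def __init__(self, initial_value=1.0, **kwargs):
        super().__init__(**kwargs)
        self.initial_value = initial_value

    def build(self, input_shape):
        # Create the trainable scalar lazily, once the input shape is known.
        self.scale = self.add_weight(
            name="scale",
            shape=(),
            initializer=tf.keras.initializers.Constant(self.initial_value),
            trainable=True,
        )

    def call(self, inputs):
        return inputs * self.scale

    def get_config(self):
        # Expose constructor arguments so the layer can be serialized.
        config = super().get_config()
        config.update({"initial_value": self.initial_value})
        return config
```

The layer can then be exercised directly, e.g. `Scale(initial_value=2.0)(tf.constant([1.0, 2.0]))`, and its unittests would additionally round-trip it through `tf.keras.layers.deserialize(tf.keras.layers.serialize(layer))`.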