Learned 2D Separable Transform

The repository contains the source code for a learned two-dimensional separable transform (LST), which can be considered a new type of computational layer for constructing neural network (NN) architectures for image recognition tasks. LST is based on the idea of sharing the weights of one fully connected (FC) layer to process all rows of an image.

[Figure: LST-2d]
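
As a rough illustration of the weight-sharing idea, the sketch below applies one learned matrix to every row of an image and another to every column. The class name, shapes, and initialization are assumptions made for illustration and do not reflect the repository's exact implementation.

```python
# A minimal sketch of a learned 2D separable transform layer (names and
# initialization are assumptions, not the repository's exact code).
import torch
import torch.nn as nn


class LSTSketch(nn.Module):
    """Shares one weight matrix across all rows and one across all columns."""

    def __init__(self, in_h: int, in_w: int, out_h: int, out_w: int):
        super().__init__()
        # Row transform: the same weights process every row of the image.
        self.row_weight = nn.Parameter(torch.randn(out_w, in_w) * 0.01)
        # Column transform: the same weights process every column.
        self.col_weight = nn.Parameter(torch.randn(out_h, in_h) * 0.01)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, H, W) image batch
        x = torch.matmul(x, self.row_weight.t())   # (batch, H, out_w)
        x = torch.matmul(self.col_weight, x)       # (batch, out_h, out_w)
        return x
```

Because the two matrices are shared across all rows and all columns, such a layer needs only `in_w * out_w + in_h * out_h` parameters, instead of the `(in_h * in_w) * (out_h * out_w)` parameters an FC layer over the flattened image would require.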

Key feature

The use of LST layers in an NN architecture significantly reduces the number of model parameters compared to models that use stacked FC layers. We show that an NN classifier based on a single LST layer followed by an FC layer achieves 98.02% accuracy on the MNIST dataset, while having only 9.5k parameters.
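
As a back-of-the-envelope check of this figure, the snippet below counts parameters for an assumed 28×28 → 28×28 LST layer followed by a 10-way FC classifier; the exact layer sizes used in the paper may differ.

```python
# Rough parameter count for a single-LST + FC classifier on 28x28 MNIST images.
# The layer sizes are assumptions used only to show the order of magnitude.
H = W = 28
num_classes = 10

lst_params = H * H + W * W                      # one column matrix + one row matrix
fc_params = H * W * num_classes + num_classes   # flattened 784 -> 10 classes (+ bias)

print("LST layer:", lst_params)                 # 1568
print("FC layer: ", fc_params)                  # 7850
print("Total:    ", lst_params + fc_params)     # 9418, close to the reported 9.5k

# For comparison, a single 784 -> 784 FC hidden layer alone would need
# 784 * 784 + 784 = 615,440 parameters.
```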

Slides (in Russian)

Source

The src folder contains a PyTorch implementation of the LST layer and three NNs: LST-1, LST-2, and ResLST-3.
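
For orientation only, here is a hedged sketch of how an LST-1-style classifier could be assembled from the LSTSketch layer shown above; the actual class names, file layout, and architectures in src may differ.

```python
import torch
import torch.nn as nn

# Assumes the LSTSketch class from the earlier sketch is in scope.


class LST1Sketch(nn.Module):
    """A single LST layer followed by an FC classifier (LST-1-style)."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.lst = LSTSketch(in_h=28, in_w=28, out_h=28, out_w=28)
        self.act = nn.ReLU()
        self.fc = nn.Linear(28 * 28, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, 28, 28) MNIST images
        x = self.act(self.lst(x.squeeze(1)))    # (batch, 28, 28)
        return self.fc(x.flatten(1))            # (batch, num_classes)


model = LST1Sketch()
print(sum(p.numel() for p in model.parameters()))    # ~9.4k parameters
logits = model(torch.randn(8, 1, 28, 28))            # (8, 10)
```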

Cite

M. Vashkevich and E. Krivalcevich, "Compact and Efficient Neural Networks for Image Recognition Based on Learned 2D Separable Transform," in Proc. 27th International Conference on Digital Signal Processing and its Applications (DSPA), 2025, pp. 1–6.
