Home
Calvin edited this page May 6, 2022
The novelty of the CDST package is two-fold:
- Architectural Hyperparameter (AH) for Neural Network
- Reduced/Aggregated Output Resolution Regression
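The Architectural Hyperparameter idea can be illustrated with a minimal sketch: network structure (depth and hidden width) is treated as a tunable hyperparameter rather than a fixed design choice. The function name below is illustrative only and does not reflect CDST's actual API.

```python
# Sketch: treating network architecture (depth, width) as tunable
# hyperparameters for a fully connected network.
# `build_layer_sizes` is a hypothetical helper, not part of CDST.

def build_layer_sizes(n_input, n_output, depth, width):
    """Return fully connected layer sizes for a network whose depth
    and hidden width are architectural hyperparameters."""
    return [n_input] + [width] * depth + [n_output]

# A hyperparameter sampler could draw (depth, width) pairs and
# construct one candidate architecture per draw.
print(build_layer_sizes(10, 1, depth=3, width=64))  # → [10, 64, 64, 64, 1]
```

Each sampled `(depth, width)` pair then defines one candidate network to train and compare, which is what makes the architecture itself searchable by a tuning library.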
CDST contains a set of pre-designed deep learning algorithms based on PyTorch, including:
- General Scalable Deep Learning Fully Connected Network (DNN)
- Calvin's Scalable Parallel Downsampler (CSPD)
- Ordinal Hyperplane Loss Classifier (OHPL) (Reference publication)
The Architectural Hyperparameter (AH) sampling can be used with all of the above algorithms, while Reduced/Aggregated Output Resolution Regression is designed specifically for CSPD. Demo Jupyter notebooks are provided for applying these pre-designed deep learning algorithms, with a ready-to-use data object and ready-to-use data-partitioning methods for performing K-fold cross-validation over hyperparameter sets. Using these pre-designed objects and methods, hyperparameter tuning can be done with the hyperparameter tuning library Ray Tune and evaluated statistically.
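The K-fold cross-validation over hyperparameter sets described above can be sketched as follows. This is a dependency-free illustration of the workflow, not CDST's actual data object or partitioning API; the function names and the `evaluate` callback are placeholders.

```python
import random

def k_fold_indices(n_samples, k, seed=0):
    """Partition sample indices into k roughly equal folds."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_validate(hyperparam_sets, n_samples, k, evaluate):
    """Score each hyperparameter set as its mean score across k folds.

    `evaluate(hp, train_idx, val_idx)` is a placeholder for training a
    model with hyperparameters `hp` and returning a validation score.
    """
    folds = k_fold_indices(n_samples, k)
    results = {}
    for name, hp in hyperparam_sets.items():
        scores = []
        for i in range(k):
            val_idx = folds[i]
            train_idx = [j for f in folds[:i] + folds[i + 1:] for j in f]
            scores.append(evaluate(hp, train_idx, val_idx))
        results[name] = sum(scores) / k
    return results
```

In practice, the per-fold `evaluate` step is what a tuning library such as Ray Tune would drive, reporting each fold's validation score so the mean scores across hyperparameter sets can be compared statistically.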