I have implemented Logistic Regression for binary classification from scratch, with all three types of gradient descent (batch, stochastic, and mini-batch) and hyper-parameter tuning of lambda (the regularization coefficient) and the learning rate.
===> Binary_Logistic_Regression : the file that contains the Binary Logistic Regressor
The code is quite generic and can easily be used to fit any binary classification dataset.
#NOTE: As the title suggests, this is for binary-class problems only.
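To make the three gradient descent variants concrete, here is a minimal, hedged sketch of a single weight update for regularized logistic regression. This is illustrative only and not the exact code in Binary_Logistic_Regression; the function names (sigmoid, gradient, gradient_descent_step) are my own.

```python
# Illustrative sketch of one weight update for each gradient descent variant.
# Not the actual implementation in Binary_Logistic_Regression.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient(X, y, w, lamda, regularizer='l2'):
    # Gradient of the regularized cross-entropy loss w.r.t. the weights.
    preds = sigmoid(X @ w)
    grad = X.T @ (preds - y) / len(y)
    if regularizer == 'l2':
        grad += lamda * w
    else:  # 'l1'
        grad += lamda * np.sign(w)
    return grad

def gradient_descent_step(X, y, w, lr, lamda, mode='batch',
                          regularizer='l2', batch_size=32):
    if mode == 'batch':                       # full dataset per update
        Xb, yb = X, y
    elif mode == 'stochastic':                # one random sample per update
        i = np.random.randint(len(y))
        Xb, yb = X[i:i + 1], y[i:i + 1]
    else:                                     # 'mini-batch': a random subset per update
        idx = np.random.choice(len(y), size=min(batch_size, len(y)), replace=False)
        Xb, yb = X[idx], y[idx]
    return w - lr * gradient(Xb, yb, w, lamda, regularizer)
```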
#Parameters (a usage sketch follows this list):
- main_gradient_descent : can accept any of the three values('batch', 'stochastic', 'mini-batch')
- regularizer : can accept either of the two values('l1', 'l2')
- hyper_paremeters_assign : can accept either of the two values('auto-assign', 'self-assign'). When 'auto-assign' is chosen, there is no need to provide 'lamda_value' and 'lr_value'; they will be chosen by hyper-parameter tuning. When 'self-assign' is chosen, 'lamda_value' and 'lr_value' must be provided.
- hyper_parameters_tuning_gradient_descent : can accept any of the three values('batch', 'stochastic', 'mini-batch')
- max_iter : can accept any integer value
- early_stopping : can accept either of the two values(True, False)
- lamda_value : can accept any float value. Should only be given when 'self-assign' is chosen for 'hyper_paremeters_assign'
- lr_value : can accept any float value. Should only be given when 'self-assign' is chosen for 'hyper_paremeters_assign'
- monitor : can accept either of the two values('val_error', 'val_accuracy')
- paitiance : can accept any integer value
- error_roundoff : can accept any integer value
- acc_roundoff : can accept any integer value
- acc_change : can accept any numerical value(be it integer or float)
- error_change : can accept any numerical value(be it integer or float)
- verbose : can accept either of the two values(True, False)
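The sketch below shows how the parameters above might be passed to the regressor. The class name and the fit/predict methods are assumptions on my part; only the keyword argument names and accepted values are taken from the list above.

```python
# Hedged usage sketch -- the class name and the fit/predict methods are assumptions;
# the keyword arguments and their accepted values come from the parameter list above.
from Binary_Logistic_Regression import Binary_Logistic_Regression  # assumed import path

model = Binary_Logistic_Regression(
    main_gradient_descent='mini-batch',
    regularizer='l2',
    hyper_paremeters_assign='self-assign',   # so lamda_value and lr_value are required
    hyper_parameters_tuning_gradient_descent='batch',
    max_iter=1000,
    early_stopping=True,
    lamda_value=0.01,
    lr_value=0.1,
    monitor='val_error',
    paitiance=5,
    error_roundoff=4,
    acc_roundoff=4,
    acc_change=1e-4,
    error_change=1e-4,
    verbose=True,
)
# model.fit(X_train, y_train)        # assumed API
# y_pred = model.predict(X_test)     # assumed API
```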
Libraries used inside the Binary Logistic Regression File:
==> Scikit-learn: For train_test_split, confusion_matrix, classification_report and accuracy score
==> NumPy: All calculations are done with NumPy, since its vectorized operations are faster
==> Random: To initialize the weights randomly at the beginning
==> OS: Used by the save-model functionality.
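For context, this hedged sketch shows how those scikit-learn utilities are typically combined around a fitted model. The `model` object and its predict method are assumptions; the data here is a random placeholder, not from the repository.

```python
# Hedged sketch of how the listed scikit-learn utilities are typically used
# around a fitted model ('model' and its predict method are assumptions).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, classification_report, accuracy_score

X = np.random.rand(200, 4)                 # placeholder features
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)  # placeholder binary labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# model.fit(X_train, y_train)              # assumed API of the regressor
# y_pred = model.predict(X_test)           # assumed API of the regressor
y_pred = y_test                            # stand-in so the metrics below run as-is

print(confusion_matrix(y_test, y_pred))
print(classification_report(y_test, y_pred))
print(accuracy_score(y_test, y_pred))
```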