This is a lightweight JavaScript library that implements several common machine learning algorithms:
- Linear Regression
- Logistic Regression
- Support Vector Machine (SVM)
- Decision Tree
- Random Forest
- XGBoost
The library is designed to be modular, easy to use, and extensible. Below is the documentation for installation, usage, and examples.
To use this library, you need to have Node.js and npm installed. Follow these steps:

- Clone the repository or download the source code.
- Navigate to the project directory.
- Install the required dependencies:

  ```bash
  npm install mathjs
  ```

- Import the library into your project:

  ```js
  const { Model } = require('argonz-ml');
  ```
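As an optional sanity check, you can construct a model right after importing. The snippet below is a minimal sketch that relies only on the `Model` constructor and the `train` method documented later in this README; the file name and the check itself are purely illustrative.

```js
// sanity-check.js (hypothetical file name): verify the library loads and a model can be built.
const { Model } = require('argonz-ml');

// Construct a linear regression model with its documented default option.
const model = new Model('linear_regression', { lambda: 0.01 });

// The unified interface documented below exposes train/predict/evaluate.
console.log('train is a function:', typeof model.train === 'function');
```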
The library provides a unified `Model` class to instantiate and use any of the supported algorithms. Below are the details for each algorithm.
- Description: A linear regression model with Ridge regularization.
- Options:
  - `lambda` (default: `0.01`): Regularization parameter.

```js
const model = new Model('linear_regression', { lambda: 0.01 });
```
- Description: A logistic regression model for binary classification.
- Options:
  - `learningRate` (default: `0.01`): Learning rate for gradient descent.
  - `iterations` (default: `1000`): Number of training iterations.

```js
const model = new Model('logistic_regression', { learningRate: 0.01, iterations: 1000 });
```
- Description: A Support Vector Machine (SVM) for binary classification.
- Options:
  - `learningRate` (default: `0.01`): Learning rate for gradient descent.
  - `lambda` (default: `0.1`): Regularization parameter.
  - `iterations` (default: `1000`): Number of training iterations.

```js
const model = new Model('svm', { learningRate: 0.01, lambda: 0.1, iterations: 1000 });
```
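For a complete run with the SVM model, here is a minimal sketch. It assumes the unified `train`/`predict`/`evaluate` interface described later in this document; the dataset and the `{0, 1}` label encoding are made up for illustration and mirror the other binary classification examples below.

```js
const { Model } = require('argonz-ml');

// Illustrative binary classification data: two features per sample, labels in {0, 1}.
const X = [
  [1, 1],
  [2, 1],
  [8, 9],
  [9, 8]
];
const y = [0, 0, 1, 1];

// Instantiate the SVM with the documented default hyperparameters.
const model = new Model('svm', { learningRate: 0.01, lambda: 0.1, iterations: 1000 });
model.train(X, y);

// Predict on the training data and compare against the true labels.
const predictions = model.predict(X);
console.log("Predictions:", predictions);
console.log("Evaluation:", model.evaluate(y, predictions));
```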
- Description: A decision tree for classification or regression.
- Options:
  - `maxDepth` (default: `5`): Maximum depth of the tree.

```js
const model = new Model('decision_tree', { maxDepth: 5 });
```
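Likewise, a minimal end-to-end decision tree sketch, again assuming the unified `train`/`predict`/`evaluate` interface and using made-up classification data.

```js
const { Model } = require('argonz-ml');

// Illustrative classification data.
const X = [
  [1, 2],
  [2, 3],
  [8, 7],
  [9, 9]
];
const y = [0, 0, 1, 1];

// A shallow tree is enough for a toy dataset like this.
const model = new Model('decision_tree', { maxDepth: 3 });
model.train(X, y);

const predictions = model.predict(X);
console.log("Predictions:", predictions);
console.log("Evaluation:", model.evaluate(y, predictions));
```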
- Description: A random forest ensemble of decision trees.
- Options:
  - `nTrees` (default: `10`): Number of trees in the forest.
  - `maxDepth` (default: `5`): Maximum depth of each tree.
  - `featureSubsetSize` (default: `sqrt(nFeatures)`): Number of features to consider for each split.

```js
const model = new Model('random_forest', { nTrees: 10, maxDepth: 5 });
```
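If you want to control feature subsampling explicitly, the `featureSubsetSize` option documented above can presumably be passed alongside the other options; the one-liner below is a sketch and has not been verified against the implementation.

```js
// Consider 2 candidate features per split instead of the default sqrt(nFeatures).
const model = new Model('random_forest', { nTrees: 20, maxDepth: 4, featureSubsetSize: 2 });
```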
- Description: An implementation of the XGBoost algorithm for regression.
- Options:
  - `nTrees` (default: `10`): Number of trees.
  - `learningRate` (default: `0.1`): Learning rate.
  - `maxDepth` (default: `3`): Maximum depth of each tree.
  - `lambda` (default: `1.0`): Regularization parameter.

```js
const model = new Model('xgboost', { nTrees: 10, learningRate: 0.1, maxDepth: 3, lambda: 1.0 });
```
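Here is a minimal end-to-end sketch for the XGBoost model, assuming the unified `train`/`predict`/`evaluate` interface described in the next section; the regression data (targets roughly linear in the features) is made up for illustration.

```js
const { Model } = require('argonz-ml');

// Illustrative regression data: targets follow y = x1 + x2.
const X = [
  [1, 2],
  [2, 3],
  [3, 4],
  [4, 5]
];
const y = [3, 5, 7, 9];

// Instantiate the XGBoost model with the documented default hyperparameters.
const model = new Model('xgboost', { nTrees: 10, learningRate: 0.1, maxDepth: 3, lambda: 1.0 });
model.train(X, y);

const predictions = model.predict(X);
console.log("Predictions:", predictions);
console.log("Evaluation:", model.evaluate(y, predictions));
```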
The `Model` class provides a unified interface for training, predicting, and evaluating models.

- `train(X, y)`: Trains the model on the input data `X` and target labels `y`.
- `predict(X)`: Predicts the target values for the input data `X`.
- `evaluate(yTrue, yPred)`: Evaluates the model's performance by comparing true labels `yTrue` with predicted labels `yPred`.
Example: training and evaluating a linear regression model.

```js
const { Model } = require('argonz-ml');

// Example data
const X = [
  [1, 2, 3],
  [4, 5, 6],
  [7, 8, 9],
  [10, 11, 12]
];
const y = [10, 20, 30, 40];

// Create and train the model
const model = new Model('linear_regression', { lambda: 0.01 });
model.train(X, y);

// Make predictions
const predictions = model.predict(X);

// Evaluate the model
const evaluation = model.evaluate(y, predictions);

console.log("Predictions:", predictions);
console.log("Evaluation:", evaluation);
```
Example: training and evaluating a logistic regression classifier.

```js
const { Model } = require('argonz-ml');

// Example data
const X = [
  [1, 2, 3],
  [4, 5, 6],
  [7, 8, 9],
  [10, 11, 12]
];
const y = [0, 1, 0, 1]; // Binary classification

// Create and train the model
const model = new Model('logistic_regression', { learningRate: 0.01, iterations: 1000 });
model.train(X, y);

// Make predictions
const predictions = model.predict(X);

// Evaluate the model
const evaluation = model.evaluate(y, predictions);

console.log("Predictions:", predictions);
console.log("Evaluation:", evaluation);
```
Example: training and evaluating a random forest classifier.

```js
const { Model } = require('argonz-ml');

// Example data
const X = [
  [1, 2, 3],
  [4, 5, 6],
  [7, 8, 9],
  [10, 11, 12]
];
const y = [0, 1, 0, 1]; // Binary classification

// Create and train the model
const model = new Model('random_forest', { nTrees: 10, maxDepth: 5 });
model.train(X, y);

// Make predictions
const predictions = model.predict(X);

// Evaluate the model
const evaluation = model.evaluate(y, predictions);

console.log("Predictions:", predictions);
console.log("Evaluation:", evaluation);
```
- Save the code in a file, e.g., `example.js`.
- Run the file using Node.js:

  ```bash
  node example.js
  ```

- You should see the predictions and evaluation metrics printed in the console.
This project is licensed under the MIT License. See the LICENSE file for details.
Contributions are welcome! Please open an issue or submit a pull request for any improvements or bug fixes.
For questions or feedback, please contact Argonz Company.
Enjoy using the Machine Learning Library! 🚀