WWW'24: Mirror Gradient (MG) makes multimodal recommendation models approach flat local minima more easily than normally trained models.
Optimisation and algorithm project. I) L1, L2, and L2^2 regularisers in optimal trajectory synthesis; II) Logistic data classification; III) Gradient methods.
Gradient method, Newton method, Euler and Heun method, discrete Fourier transform and Monte Carlo simulation in C++
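As a minimal sketch of the Newton iteration mentioned above (the repository itself is in C++; this illustrative Python version and its example root are not taken from any of the listed repos):

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method for root finding: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:      # stop once the residual is small enough
            break
        x -= fx / df(x)
    return x

# Example: the positive root of x^2 - 2, i.e. sqrt(2)
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```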
This repository contains Python code for various optimization and modeling tasks, including inheritance partitioning, surface and contour plotting, warehouse location optimization, root-finding with the Bisection and Golden Section methods, gradient descent with different step-size strategies, inertial gradient methods, and image inpainting.
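Gradient descent with different step-size strategies can be sketched as follows (a hypothetical minimal version on a simple quadratic objective, not the repository's actual code; the function names are my own):

```python
import numpy as np

def gradient_descent(grad, x0, step, iters=100):
    """Generic gradient descent; `step(x, g)` chooses the step size each iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        x = x - step(x, g) * g
    return x

# Strategy 1: fixed step size
fixed_step = lambda x, g: 0.1

# Strategy 2: backtracking (Armijo) line search
def backtracking(f, beta=0.5, c=1e-4):
    def step(x, g):
        t = 1.0
        # Shrink t until the Armijo sufficient-decrease condition holds
        while f(x - t * g) > f(x) - c * t * (g @ g):
            t *= beta
        return t
    return step

f = lambda x: float((x ** 2).sum())   # test objective: f(x) = ||x||^2
grad_f = lambda x: 2 * x

x_fixed = gradient_descent(grad_f, [3.0, -2.0], fixed_step)
x_armijo = gradient_descent(grad_f, [3.0, -2.0], backtracking(f))
```

On this quadratic, the Armijo search accepts t = 0.5 on the first iteration and jumps straight to the minimizer, while the fixed step contracts geometrically.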
Implementation of unconstrained and constrained convex optimization algorithms in Python, focusing on solving data science problems such as semi-supervised learning and Support Vector Machines.
This repository contains lecture notes for the Optimization Theory course. The course covers fundamental concepts of optimization, classification of optimization problems, graphical methods, classical optimization techniques, constrained and unconstrained optimization, linear and nonlinear programming, and gradient methods.
AutoSGM
An easy implementation of the Stochastic / Batch gradient descent and comparison with the standard Gradient Descent Method
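A comparison of stochastic and batch gradient descent like the one described can be sketched on a synthetic least-squares problem (the data, function names, and step sizes below are illustrative assumptions, not the repository's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 3
A = rng.normal(size=(n, d))            # synthetic design matrix
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true                         # consistent system, so both methods can recover x_true

def batch_gd(A, b, lr=0.5, epochs=200):
    """Full-batch gradient descent on the objective (1/2n) * ||Ax - b||^2."""
    x = np.zeros(A.shape[1])
    for _ in range(epochs):
        x -= lr * A.T @ (A @ x - b) / len(b)
    return x

def sgd(A, b, lr=0.05, epochs=200, seed=1):
    """Stochastic gradient descent: one randomly ordered sample per update."""
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(b)):
            x -= lr * (A[i] @ x - b[i]) * A[i]
    return x

x_batch = batch_gd(A, b)
x_sgd = sgd(A, b)
```

Batch GD uses the exact gradient each step; SGD trades gradient accuracy for many cheap updates per epoch, which is the usual axis of comparison.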
Unofficial implementation of the Accelerated ExtraGradient method from the 2022 paper "Optimal Gradient Sliding and its Application to Distributed Optimization Under Similarity" by Kovalev D. et al.