gd-takes-the-shortest-path-exp-implementation

PyTorch implementation of the experiments described in the paper "Overparameterized Nonlinear Learning: Gradient Descent Takes the Shortest Path?" by Samet Oymak and Mahdi Soltanolkotabi

https://github.com/umityigitbsrn/gd-takes-the-shortest-path-exp-implementation

Science Score: 44.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (3.4%) to scientific vocabulary

Repository

PyTorch implementation of the experiments described in the paper "Overparameterized Nonlinear Learning: Gradient Descent Takes the Shortest Path?" by Samet Oymak and Mahdi Soltanolkotabi

Basic Info
  • Host: GitHub
  • Owner: umityigitbsrn
  • License: MIT
  • Language: Jupyter Notebook
  • Default Branch: main
  • Size: 1.18 MB
Statistics
  • Stars: 0
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Created almost 3 years ago · Last pushed almost 3 years ago
Metadata Files
Readme License Citation

README.md

Overparameterized Nonlinear Learning: Gradient Descent Takes the Shortest Path?

PyTorch implementation of the experiments described in the paper "Overparameterized Nonlinear Learning: Gradient Descent Takes the Shortest Path?" by Samet Oymak and Mahdi Soltanolkotabi

MNIST Experiment

The implementation of this experiment is in the 'mnist_experiment' folder.

The experiment setup is taken from pages 12-13 of the paper (under 6.2 MNIST Experiment).

To save the resulting JSON files, the 'misfitdistance500json' and 'misfitdistance5000json' folders need to be created first.
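Rather than creating these folders by hand, a notebook cell could create them on the fly before dumping results. A minimal illustrative helper (hypothetical, not part of the repository; only the folder names come from the README) might look like:

```python
import json
import os

def save_results(results, folder, name="run_0.json"):
    """Write an experiment's results dict as JSON, creating the
    output folder first if it does not exist yet.

    Illustrative helper only; the repository expects the folders
    to be created manually instead.
    """
    os.makedirs(folder, exist_ok=True)   # replaces the manual mkdir step
    path = os.path.join(folder, name)
    with open(path, "w") as f:
        json.dump(results, f)
    return path

# Folder names from the README; the results dicts are placeholders.
save_results({"misfit": [2.0, 1.1, 0.4]}, "misfitdistance500json")
save_results({"misfit": [1.8, 0.9, 0.2]}, "misfitdistance5000json")
```

Using `os.makedirs(..., exist_ok=True)` makes the notebook idempotent: rerunning it neither fails on an existing folder nor requires the manual setup step.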

Low-rank Regression

The implementation of this experiment is in the 'lowrankregression' folder.

The gradient of the low-rank regression loss function is given in the paper on page 32 (in the Appendix); the implementation is based on that notation.

The experiment setup is taken from page 13 of the paper (under 6.2 Low-rank Regression).

To save the resulting JSON files, the 'misfitdistancejson' folder needs to be created first.
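The repository implements this loop in PyTorch following the paper's notation. As a rough illustration only, here is a self-contained NumPy sketch of gradient descent on a generic factored low-rank regression objective, loss(U, V) = 0.5 * ||X U V^T - Y||_F^2; the dimensions, initialization scale, step size, and iteration count are all made up for the sketch and are not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative only): samples, input dim, output dim, rank.
n, d, k, r = 50, 10, 8, 2

X = rng.standard_normal((n, d))
# Planted rank-r ground truth Theta* = U* V*^T, so Y = X Theta*.
U_star = rng.standard_normal((d, r))
V_star = rng.standard_normal((k, r))
Y = X @ U_star @ V_star.T

# Small random initialization of the factors, then plain gradient
# descent on loss(U, V) = 0.5 * ||X U V^T - Y||_F^2.
U = 0.1 * rng.standard_normal((d, r))
V = 0.1 * rng.standard_normal((k, r))
lr = 2e-4

misfit_init = np.linalg.norm(X @ U @ V.T - Y)
for _ in range(2000):
    R = X @ U @ V.T - Y   # residual
    # dL/dU = X^T R V and dL/dV = R^T X U, updated simultaneously.
    U, V = U - lr * (X.T @ R @ V), V - lr * (R.T @ X @ U)
misfit_final = np.linalg.norm(X @ U @ V.T - Y)
```

The tracked quantity, the misfit ||X U V^T - Y||_F, is presumably what the 'misfitdistancejson' output folder refers to, alongside the distance of the iterates from initialization that the paper studies.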

Owner

  • Name: Ümit Yiğit Başaran
  • Login: umityigitbsrn
  • Kind: user
  • Location: Riverside, CA

Citation (CITATION.cff)

preferred-citation:
  type: article
  authors:
  - family-names: "Oymak"
    given-names: "Samet"
  - family-names: "Soltanolkotabi"
    given-names: "Mahdi"
  journal: "PMLR"
  month: 5
  title: "Overparameterized Nonlinear Learning: Gradient Descent Takes the Shortest Path?"
  year: 2019

GitHub Events

Total
  • Public event: 1
Last Year
  • Public event: 1