King Abdullah University of Science and Technology
Scientific Computing and Machine Learning (SCML)


Unveiling Insights from "Gradient Descent Converges Linearly for Logistic Regression on Separable Data"

Bang An, Ph.D. Student, Applied Mathematics and Computational Sciences
Jan 17, 10:00 - 11:00

B1 L0 R0118

Tags: gradient methods, Regression Models

Abstract

In this presentation, I will share the paper "Gradient Descent Converges Linearly for Logistic Regression on Separable Data", which is closely related to my ongoing research. I will explore its relevance to my current research topic and discuss how it may inspire our future work.

Abstract of the paper: We show that running gradient descent with a variable learning rate guarantees loss f(x) \leq 1.1 f(x^*) + \epsilon for the logistic regression objective, where the error \epsilon decays exponentially with the number of iterations and polynomially with the magnitude of the entries of an …
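To make the setting concrete, here is a minimal sketch of gradient descent on the logistic regression objective over linearly separable toy data. Everything in it is illustrative: the data, the fixed step size, and the iteration count are assumptions, and a constant learning rate is a simplification of the variable learning rate analyzed in the paper. On separable data the loss has no finite minimizer, so the iterates grow while the loss keeps shrinking toward zero.

```python
import numpy as np

# Toy linearly separable data (assumed for illustration): two Gaussian
# clusters in 2-D with labels in {-1, +1}.
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=[+2.0, +2.0], scale=0.5, size=(20, 2))
X_neg = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(20, 2))
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(20), -np.ones(20)])

def logistic_loss(w):
    # f(w) = mean_i log(1 + exp(-y_i <w, x_i>)), computed stably.
    margins = y * (X @ w)
    return np.mean(np.logaddexp(0.0, -margins))

def gradient(w):
    # d/dm log(1 + e^{-m}) = -1 / (1 + e^{m})
    margins = y * (X @ w)
    coeff = -y / (1.0 + np.exp(margins))
    return (coeff[:, None] * X).mean(axis=0)

w = np.zeros(2)
eta = 0.1  # fixed step size; the paper instead uses a variable learning rate
losses = [logistic_loss(w)]
for _ in range(200):
    w -= eta * gradient(w)
    losses.append(logistic_loss(w))

print(losses[0], losses[-1])  # the loss decreases steadily on separable data
```

Since the logistic loss here is smooth with a modest smoothness constant, the fixed step above is small enough that each iteration decreases the loss; the paper's contribution is showing that a suitable variable learning rate achieves the much stronger linearly decaying error bound quoted in the abstract.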



© 2025 King Abdullah University of Science and Technology. All rights reserved.